DNS Security Is A Necessary Key To Privacy



Yesterday, I wrote about how Mozilla is updating Firefox to improve DNS security. My conclusion was that Mozilla may have designed a system with a single point of failure. In fairness, that assessment was far too simplistic. Today, I want to expand on my thoughts.

The Firefox implementation rests on two questionable assumptions. It assumes that most people are using Firefox. And it assumes that all Firefox users can choose an appropriate resolver/referrer to meet their specific needs. I would argue that the first assumption is patently wrong while the second is altogether naive. As noted yesterday, I would have much preferred that Mozilla be more transparent about their proposed changes. Also, rather than assuming that the code is open and thus reviewed, Mozilla could have asked for more extensive input. [Note: I suspect that Mozilla was transparent with a somewhat limited community. I just wish that their design had been explicitly shared.]

The real problem that I have with their approach is that it is a Firefox-only solution. No, I don’t expect Mozilla to solve this problem for their competitors. But most organizations need a broader solution that will meet everyone’s needs. Enter dnsmasq. In my case, I have implemented DNS over HTTPS (DoH) as my mechanism to improve DNS security. 
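To make DoH concrete: instead of sending a plain-text UDP query to port 53, the client wraps the DNS question inside an ordinary HTTPS request. The Python sketch below performs a lookup against Cloudflare’s public JSON endpoint; it illustrates the transaction, not how any particular resolver is implemented internally.

```python
import json
import urllib.request

# Ask Cloudflare's DNS-over-HTTPS endpoint to resolve a name. The whole
# exchange is an ordinary HTTPS request, so on-path eavesdroppers see
# only TLS traffic -- not which names we are resolving.
def doh_lookup(name: str, record_type: str = "A") -> list:
    url = f"https://cloudflare-dns.com/dns-query?name={name}&type={record_type}"
    req = urllib.request.Request(url, headers={"Accept": "application/dns-json"})
    with urllib.request.urlopen(req, timeout=5) as resp:
        reply = json.load(resp)
    # Each record in "Answer" carries its resolved value in "data".
    return [rec["data"] for rec in reply.get("Answer", [])]

print(doh_lookup("example.com"))
```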

I am quite fortunate: I am using a DNS server that was just updated. The Pi-hole team has just released Pi-hole 4.0. And a DNS service provider (Cloudflare) has released AMD64 and ARM builds of its DNS-over-HTTPS proxy (cloudflared). [Note: Their solution is in binary form. So I will add my voice to the chorus asking for the software to be published under a free/open license. Until that happens, I’m testing the system to see how it performs.]

So what am I seeing thus far?

After switching to cloudflared on Pi-hole 4.0, I ran some benchmarks. And as expected, there was quite a bit more activity on my DNS server. But the overall DNS response time (i.e., the server at 10.42.222.22) was quite acceptable. I want to get about a week’s worth of data. But at this moment, I am very hopeful that the software will maintain acceptable performance levels.
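If you want to run a similar spot check, a few lines of Python will do. This sketch times lookups through whatever resolver the host is configured to use (the Pi-hole, in my case). The hostnames and repetition count are arbitrary choices, and answers served from cache will flatter the numbers, so treat the results as a rough floor rather than a real benchmark.

```python
import socket
import statistics
import time

# Time repeated lookups through the system's configured resolver.
# Note: cached answers return much faster than cold lookups, so run
# this against a mix of names and interpret the mean accordingly.
def benchmark(names, rounds=10):
    samples_ms = []
    for _ in range(rounds):
        for name in names:
            start = time.perf_counter()
            socket.getaddrinfo(name, 80)
            samples_ms.append((time.perf_counter() - start) * 1000.0)
    return statistics.mean(samples_ms), max(samples_ms)

mean_ms, worst_ms = benchmark(["example.com", "wikipedia.org", "debian.org"])
print(f"mean: {mean_ms:.1f} ms, worst: {worst_ms:.1f} ms")
```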

So what should you do? If you’re bold, then use a test system and try it out for yourself. Or keep close tabs on folks who are testing this technology. At the current time, I am thrilled to close down yet another vector of surveillance.
 

TRR = Totally Risky Referrer

TRR Is Anything But Trusted

When Homer Simpson says “D’oh!”, you know that something stupid has just happened. Unfortunately, I believe that the same is true of the upcoming Firefox feature built upon DNS over HTTPS (i.e., DoH). The Firefox developers noted a real problem: DNS queries aren’t secure. This has been axiomatic for years. That’s why DNS developers created DNSSEC. But DNSSEC is taking forever to roll out. Consequently, the Firefox developers baked the Trusted Recursive Resolver (TRR) into Firefox 61. [Note: TRR has been available since Firefox 60. But TRR will move from an experiment to a reality as Firefox 61 rolls out across the Internet.]

Background

One of the key design points of TRR is the encapsulation of DNS data in a secure transport mechanism. Theoretically, this will limit man-in-the-middle attacks that could compromise your browsing history (or redirect your browser altogether). Of course, theory is not always reality. Yes, SSL/TLS is more secure than plain text. But because it is so widely used, it is burdened by the need to retain backward compatibility. Nevertheless, it is a real improvement. And security-conscious consumers can implement TRR even if their local DNS provider doesn’t currently offer DNSSEC.

Risk

So why is TRR so risky? That’s simple: Mozilla is implementing TRR with a single recommended resolver: Cloudflare. I don’t think that anyone has an axe to grind with Cloudflare. From all that I have read, Cloudflare has never exploited customer data for its own benefit. The same cannot be said for Google, or OpenDNS, or a lot of other DNS providers. Of course, Cloudflare is a relative newcomer. So their track record is limited. But the real issue is that Mozilla has designed a system with a single point of failure – and a single choke point for logging and control.

Mitigation

Fortunately, Mozilla has made the TRR mode and the TRR URI configurable. Unfortunately, these settings are currently managed only through the about:config interface. That’s fine for a technician. But it is a dreadful method for end users. I am hopeful that Mozilla will provide a better interface for users. And I certainly hope that the feature is implemented on an “opt-in” basis. If it isn’t, then folks who run their own DNS (e.g., every Pi-hole user) or folks who specify a public provider other than Cloudflare (e.g., Google, OpenDNS, DNS.Watch, etc.) will be forced to “touch” every workstation.
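For technicians who want to make that change today, the two preferences involved are network.trr.mode and network.trr.uri. A minimal user.js sketch follows; the resolver URL is a placeholder, so substitute any DoH endpoint that you actually trust:

```
// user.js -- illustrative values only
// network.trr.mode: 0 = off (default), 2 = try TRR first and fall back
// to native DNS, 3 = TRR only, 5 = explicitly disabled
user_pref("network.trr.mode", 2);
user_pref("network.trr.uri", "https://doh.example.net/dns-query");
```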

Bottom Line:

Is Mozilla acting badly? Probably not. After all, they are trying to close a huge DNS hole that infrastructure providers have yet to close (i.e., through DNSSEC). Nonetheless, their approach is ham-handed. Mozilla needs to be transparent about the whys and the whens – and they need to trust their users to “do the right thing.”
 

The Hack-proof Conceit

John McAfee Invites Hack Attack

John McAfee and Bitfi offered a bounty to anyone who could hack a Bitfi wallet. After a very short time, John and Bitfi raised the bounty to $250,000. Two days ago, a hacker claimed that bounty. Bitfi (and John) are saying that this was not a valid hack of their wallet. So there is tremendous disagreement about whether John (and Bitfi) will pay the bounty.

I do think that the hack was successful. But whether I believe that the hack occurred is irrelevant. What I do believe is that no system is impenetrable – or “hack-proof”. Over the past few decades, I have seen every “secure” system successfully attacked (and usually overwhelmed) by a determined adversary. These successes come in many forms. For some systems, hackers have leveraged a software vulnerability. For other systems, attackers have leveraged a vulnerable person. If you don’t believe this, then look no further than the DNC in the 2016 election cycle.

I would say that anyone who boasts of their impenetrability is merely inviting an attack. This axiom should remind us of a few important things.

1. Don’t boast! Pride is a deadly sin.

2. If you can be inconspicuous, then strive to become (and remain) inconspicuous. If you are not a target of a determined person or group, then don’t offer to become a target. For companies like Bitfi, the organization should not make outlandish claims. For you, I recommend that you not boast (on social media) about the things that you own. And don’t tell people when you are leaving your house for a splendid vacation. And for John McAfee, I say that he has exceeded his “best used by” date. Therefore, we need to dismiss him.

3. If you are part of a large group of targets, then be better (and more secure) than the other members of the group. For example, if you have online accounts, then use strong passwords. If you use strong passwords, then use two-factor authentication. If you use two-factor authentication, start using a virtual private network that will obscure your identity.

4. Remember that if you are a discrete target, then a determined hacker will probably defeat you – unless you are an equally skilled hacker. Therefore, make sure that you have a plan for the time when you are hacked. This includes backups. But it also includes a press statement about what you are doing (and will do) to minimize risk to your customers. After all, they are trusting you to protect them.
 

Consolidating Micro Data Centers

Cloud-based Microservices

“Cloud computing is an information technology (IT) paradigm that enables ubiquitous access to shared pools of configurable system resources and higher-level services that can be rapidly provisioned with minimal management effort, often over the Internet.”

Using this definition, the key elements of cloud computing are as follows:

  • Network access
  • Shared groups of systems and services
  • Rapid (and dynamic) provisioning
  • Minimal management

Nothing in this definition speaks to the size of the “data center” which houses these systems and services. Most of us probably think of Amazon, or Google, or Microsoft when we think of cloud services. But it need not be a multi-million dollar investment for it to be a part of cloud computing.

Data Center Consolidation

This past weekend, we closed one of our data centers. Specifically, we shut down the facility in Waldo, Missouri. This “data center” was a collection of systems and services. It hosted the web site, the file servers, and one of our DNS servers. But these weren’t housed in a vast data center. The services were located in a room within a residential property. For the past four months, we ran this site remotely. And this past weekend, we consolidated all the Waldo services at our Elgin facility.

Like most moves, there was a plan. And the plan was fluid enough to deal with the challenges that arose. And as happens with most consolidations, some spare gear became available. We reclaimed the DNS server (a Raspberry Pi). And we re-purposed the premises router as a test platform at our Elgin site.

Since this site was both business and residential, we had to re-architect the storage infrastructure to accommodate multiple (and dissimilar) use cases. We also moved key data from local storage on the servers to the consolidated storage farm. 

Once the room was cleared out, we returned the property to the landlord.

Service Consolidation

As noted, we consolidated all of the file servers into a single storage farm. But we did need to migrate some of the data from the servers and onto the new storage. Once we migrated the data, we consolidated the streaming servers. The overall experience for our streaming customers will become much simpler.

Hardware Re-use

With one of our routers freed up, we were able to put a test bed together. That test bed will run DD-WRT software. The process of converting the Netgear hardware to DD-WRT was quite tedious. It took four (4) different attempts to reset the old hardware before we could load the new software. This wasn’t anticipated. And it took us beyond the planned change window. Fortunately, we kept our customers informed and we were able to amend customer expectations.

Once deployed, the new network build will provide VPN services to all clients. At the same time, we will be turning up DNSSEC across the company. Finally, we will be enabling network-wide QOS and multi-casting. In short, the spare gear has given us the chance to improve our network and our ability to deliver new services.

The Rest of the Story

All of this sounds like a well-oiled plan. And it did go without any real incidents. But the scale of the effort was much smaller than you might expect. The site in Waldo was a room in a rental. The “servers” were a desktop, a couple of laptops, a NAS box, a cable modem, a Netgear R8000 X6 router, a Raspberry Pi, and a variety of streaming devices (a TV, a few Chromecast devices, and the mobile phones of the users, i.e., members of my family).

So why would I represent this as a “data center” move? That is easy: when you move connected devices across a network (or across the country), you still have to plan for the move. More importantly, cloud services (either at the edge or within the confines of a traditional data center) must be managed as if the customer depends upon the services. And to be fair, sometimes our families are even more stringent about loss-of-service issues than are our customers.
 
 
 

Two-Factor Authentication (2FA) Goes Mainstream

Google Enters 2FA Token Market

2FA for the masses…

Two-factor authentication (a.k.a., 2FA) has been around for decades. I first used early versions of this technology back in the mid-nineties, when I authenticated to secure servers using an RSA token. Since then, I’ve used numerous 2FA tools across many work assignments.

Over the years, the token has changed

My first token was a card. It had a small LCD screen that displayed an access code. After a fixed time had elapsed, a new code was generated. From this beginning, I migrated to a standalone (i.e., disconnected) token.  Since the nineties, I’ve had dozens of RSA tokens. But when I was with the Department of Defense, I used a Common Access Card (or CAC) to log into most systems. And in the past few years, I’ve used mobile phone apps that would display time-based access codes.
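Those rotating codes are less mysterious than they look. Most of them implement TOTP (RFC 6238): an HMAC of the current 30-second interval, keyed with a secret shared between you and the service. Here is a minimal Python sketch; the base32 secret is a made-up example, not a real credential.

```python
import base64
import hashlib
import hmac
import struct
import time

# Minimal TOTP (RFC 6238): HMAC-SHA1 over the current 30-second counter,
# dynamically truncated to a short numeric code. The secret is a shared
# base32 string -- the one below is made up for illustration.
def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = struct.pack(">Q", int(time.time()) // period)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # same secret + same clock => same code
```

Because both sides hold the same secret and the same clock, the server can compute the same six digits and simply compare – the secret itself never crosses the wire.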

A few years ago, I decided that I wanted to enable multi-factor authentication on every public service that would support it. And I wanted to make sure that I used a token that I could carry with me. I chose the Yubikey token. I can use that token by inserting it into a USB port. I can also use near-field communication (NFC) to tap the token on my phone. Once authenticated on the device, I get the traditional rotating code that I can use on almost any service.

Google finally gets into the 2FA token market

Google has supported multi-factor authentication for a number of years. But until today, Google never produced a token. Their new product – branded the Titan key – will provide 2FA for cloud services. And let’s be clear about this: the Titan key is nothing new. However, it is coming from Google. And Google WILL support this device. More importantly, other service providers will support this device. Most importantly, since it is coming from Google, consumers will purchase the product in dizzying numbers.

Bottom line:

Google just put their enormous stamp on 2FA for consumers. If you’re not yet using two-factor authentication (either at home or at work), then Google has now put you on notice.

Security Is Top Concern For Sm(all) Businesses

Security a top concern for SMB leaders

Do you own a small business? Are you concerned about security? Do you care about your customers’ privacy?

If you say “yes” to these questions, then you are in good company. In a recent study of small and medium-sized businesses (conducted by Kaseya and summarized at BetaNews), business owners stated that security was a key business concern. Fifty-four (54) percent believed that security was the most important issue that they faced.

Eighty-six (86) percent of these same survey respondents experienced network availability issues. And forty-five (45) percent of them experienced network service interruptions that lasted longer than five (5) minutes. Yet despite these risks, over seventy (70) percent of small businesses have chosen to use “external” services. Apparently, businesses – regardless of size – are accepting these risks. Indeed, many executives think that these risks are the very ‘table stakes’ that they must pay to stay “connected” to their customers.

What services do other companies purchase?

According to Mike Puglia, “Microsoft Office 365 leads the way as the most deployed solution (72 percent) followed by Dropbox (29 percent) and Salesforce and Google Suite both coming in with 17 percent.” Undoubtedly, every organization (with the possible exception of the telcos and the government) gets its connectivity from a service provider. And almost every company gets key services (like domain naming, site hosting, and email connections) from an external provider. Consequently, very few businesses can compete unless they are connected to the Internet. And very few businesses can make these connections on their own. Bottom line: you must be connected to compete. And whenever you connect your resources to those of others, you are accepting the risk of exploitation.

What should you do to compete – and survive?

If you are a small or medium-sized business, then you need to use the same services that the big “enterprise” corporations use. But you can’t afford to maintain an entire department of IT professionals. So you need “corporate-sized” professional services at an affordable price. If you want scalable solutions that are affordably priced, then you should contact Lobo Strategies. We can help you walk this tightrope.

5G: Qualcomm Takes One Step Closer

5G Antenna Modules
5G wireless is one step closer to reality. AT&T and Verizon have made huge investments in millimeter-wave (mmWave) radio spectrum (e.g., 28GHz and 39GHz). Sprint and T-Mobile have placed their bets on existing spectrum below 6GHz (i.e., “sub-6”) radio spectrum. A huge step towards the 5G aspirations of these two camps was made yesterday when Qualcomm announced its mmWave and sub-6 antenna modules.

Now that these modules are formally available, handset designers and producers will accelerate their movement towards these pivotal 5G technologies.   Don’t expect everything to shift to 5G by the end of this year. But you should expect that handsets being developed for launch next year will begin to feature 5G capabilities.

In the meantime, expect to see niche offerings. Based upon published plans by the carriers/operators, the first products featuring 5G will probably take the form of “pucks” that will provide a “semi-fixed” wireless connection. What does “semi-fixed” mean? Simple. You won’t carry these things in your pocket; they won’t be a mobile phone. You will set them up so other devices can connect to them. I expect to see these fixed wireless solutions to start to show up in households or in the briefcases of road warriors.

Will these “hotspot” use cases dominate the residential and/or mobile office space? They will not dominate any market in 2018. I do expect the early adopter crowd to jump on board in the first half of 2019. But I don’t expect widespread adoption (beyond 10%) until 2020. Nevertheless, these untethered backhaul connections will be a substantive challenge to the cable operators. Specifically, the cord-cutters (who want wireless connectivity without content bundling) will jump on these connectivity devices – assuming that the operators price them appropriately.

In the final analysis, yesterday’s announcement by Qualcomm outlines the future. And the future will be 5G. But one question remains: which team will win?

Lorin’s Prediction: The millimeter wave crowd will win in the fixed wireless challenge to cable companies. And since the mmWave build-out is starting in dense urban settings, I think that AT&T and Verizon will clean up in urban centers. For the rest of the markets, the sub-6 enthusiasts will garner more market share – until the 5G tower build-out is complete. And the sub-6 crowd will register early wins in the mobile wireless use cases.  But the real fun will begin when mmWave and sub-6 antenna modules are in every device – whether mobile or fixed. Then we will see who secures the future markets.

Password Re-use: Physician, Heal Thyself!

Password Re-use Is Abuse

A survey of professionals at the Infosecurity Conference 2018 in London has revealed that 45% of their attendees are guilty of password re-use across multiple accounts. And depending upon the source that you cite, up to 73% of consumers are guilty of the sin of password re-use. If you’re part of these groups, then you need to move out of that neighborhood. And you need to do so as quickly as possible. But how do you do that?

There are really only two methods: memorize unique passwords for each account, or store unique passwords for each account in a secure place. I have over one hundred and fifty accounts. So memorizing complex random passwords for that many accounts is impractical. And writing them down in an unsecured file or on a piece of paper is truly unacceptable. Does anyone remember the scene in “WarGames” when Matthew Broderick’s character opens the office administrator’s drawer and finds the password list?

So I am part of the 8% that use a password manager to create and store complex passwords for every account. As of this moment, I don’t remember any of my passwords – except the password to my password safe. Every password I use is unique. And my password manager encrypts every entry to ensure its security. If you are looking at password managers, then the two best tools (both of which I’ve used) are LastPass and 1Password. I prefer LastPass because it has tools to help create new passwords on (or before) the date when each account expires. And there is a testing tool that helps you to ensure that you don’t accidentally re-use a password.
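If you’re curious how little code it takes to generate strong, unique passwords, here is a sketch using Python’s secrets module. The length and alphabet are arbitrary illustrative choices – and a real password manager adds the crucial part, encrypted storage.

```python
import secrets
import string

# Generate a unique, random password per account. The length and the
# character set here are illustrative; adjust them to each site's rules.
def new_password(length: int = 20) -> str:
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

for account in ("bank", "email", "forum"):  # hypothetical account labels
    print(account, new_password())
```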

Whatever you do, it’s time to get on with the business of properly managing passwords. It is the best “first step” that you can take to secure your identity.
 

Gibbs’ “Home Automation” Boat

Pardon the coy title. But I love “NCIS”. And some days, I feel a lot like Gibbs: I have a pet project in the basement. And whenever I get bored or frustrated, I work on my project. But as of now, I think that my home automation project is finally seaworthy.

My little Pi is a beast. I am running a long list of software on this device. I’m running Home Assistant with the following add-ons: DuckDNS, Let’s Encrypt, Mosquitto, Node-RED, Samba, and SSH. With this combination, I can monitor assets within the home. I can determine whether my wife and I are in the home or outside of the home. I can build automated tasks based upon data collected within the home. I can manage the assets and the compute infrastructure in the home. And I can secure it against exploitation by ‘bad actors’.
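Mosquitto is the glue in that list: sensors publish their readings to the broker, and Home Assistant subscribes to the topics it cares about. Here is a minimal publisher sketch using the paho-mqtt library; the broker address, credentials, and topic layout are placeholders for illustration.

```python
import json

import paho.mqtt.publish as publish  # pip install paho-mqtt

# Publish one sensor reading to the local Mosquitto broker. The broker
# address, credentials, and topic below are placeholders; match them to
# your own Home Assistant MQTT configuration.
publish.single(
    topic="home/livingroom/temperature",
    payload=json.dumps({"temperature_c": 21.5}),
    qos=1,
    retain=True,  # retained so subscribers see the value immediately
    hostname="192.168.1.10",
    port=1883,
    auth={"username": "homeassistant", "password": "not-a-real-password"},
)
```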

And now, I’ve finally gotten around to configuring the data collection and graphing infrastructure. The package of tools that I am using for this includes Grafana and InfluxDB. After installing the components, I set about the work of configuring the software.

InfluxDB is a time-series data repository. It is designed much like a NoSQL tool; data is written in series but isn’t updated after it is originally written. Later, the data is read serially and used in graphing and/or statistical studies. Fortunately, I was able to configure InfluxDB with very little incident. I think that years of econometric studies made this part relatively simple to implement.
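The write path really is that simple. Points are appended over HTTP in InfluxDB’s line protocol – a measurement, optional tags, fields, and a timestamp. A sketch against a local 1.x instance; the database and measurement names are made up for illustration.

```python
import time
import urllib.request

# Append one point to a local InfluxDB 1.x instance using the line
# protocol: measurement,tag=value field=value timestamp. The database
# ("home_assistant") and measurement names are illustrative.
point = "temperature,room=living value=21.5 {:d}".format(time.time_ns())
req = urllib.request.Request(
    "http://127.0.0.1:8086/write?db=home_assistant&precision=ns",
    data=point.encode("utf-8"),
    method="POST",
)
with urllib.request.urlopen(req, timeout=5) as resp:
    print(resp.status)  # InfluxDB answers 204 No Content on success
```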

Once done with the database, I turned my attention to Grafana itself. And it was very difficult to grok this tool. First, I ran into quite a bit of difficulty installing needed plugins. After poring over the logs (and consulting written guides on the Internet), I found that the “behind the curtain” instance running in a container was having difficulty downloading the ‘plugin’ components on the fly.

While scratching my head, I saw a quick popup about my dynamic address being updated. That’s when the light came on. For whatever reason, I had been having trouble with a sporadic inability to log into my system. The symptom was that the login would just wait, and wait, and wait. I finally remembered that some applications really dislike running inside a VPN tunnel. And worse still, I wondered whether the IP address recorded in DNS was stale, given my ever-changing dynamic address.

So I disconnected from my VPN. That’s when things just started to work. It was quite odd, though. I could finally add the plugins. But I had changed the network on my Windows system – and not the network on the Pi. So something must be flowing through the browser itself. I’ll have to dig into that. But the problem had been solved.

I also found that I needed to update the DNS on my little server. Simply put, I had been using the Pi-hole (an ad-blocking DNS server) to fulfill the DNS requests for the Home Assistant Pi. I suspected that certain key DNS requests were returning null results. Therefore, I needed to clean up the DNS config on the Home Assistant Pi.
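A quick way to test that suspicion is to resolve the names in question through the Pi-hole and look for its block response (0.0.0.0, or an empty answer, depending on the configured blocking mode). A Python sketch – the second hostname is a placeholder:

```python
import socket

# Resolve names through the configured resolver (the Pi-hole) and flag
# the ones that come back empty or as 0.0.0.0 -- Pi-hole's null block
# response. The second hostname below is a placeholder.
for name in ("grafana.com", "blocked-domain.example.net"):
    try:
        addrs = {info[4][0] for info in socket.getaddrinfo(name, 443)}
    except socket.gaierror:
        addrs = set()
    if not addrs or addrs == {"0.0.0.0"}:
        print(f"{name}: blocked or unresolvable")
    else:
        print(f"{name}: {', '.join(sorted(addrs))}")
```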

Once both of these tasks (i.e., the VPN and the Pi-hole DNS) were resolved, the plugins started to install. So my Grafana installation could proceed.

And then I hit the learning curve of Grafana itself.

Grafana is a very cool tool. But its user interface is not very intuitive. It took a few hours to figure out just how to add variables and select the right graphing interval before real data started to emerge. But once I learned these little tricks, the graphs became easy. Last night, I began the quest to graph all of the data that I could graph. This compulsion is a learned experience; I spent many years being driven by capacity and performance data. And I wasn’t harnessing the data that is coming from my home sensors. So I am now inspired to build all sorts of data models and graphs. [Note: I really love it when my past experiences can inform my current and future activities.]

All in all, I’ve spent a few hundred hours over a few months on this home automation project. And I have learned so much about home automation, container technologies, and web security at the edge of the network. Now I’m left with one nagging thought – and an irresistible question: How can anyone expect the average homeowner to know these things? Moreover, how can we expect consumers to care enough to learn them? Most people want the “iPhone experience” where they can spend a lot and have someone else do the integration for them.

So which are you? Are you a maker/builder/integrator? Or are you a buyer?

Trusting Technology While Distrusting Humans

Below is a comment that I posted to an article by Mr. Kai Stinchcombe. Kai’s article is a well-written critique of the universal applicability of blockchain. His article is well worth the investment of time it will take to read it completely.

———-

Dear Mr. Stinchcombe,

Thank you for your cogent assessment of the applicability of blockchain technology. Since its introduction, too many people have looked at blockchain in the same way that some physicists looked at cold fusion. To some, cold fusion was a masterpiece of ingenuity. To others, it was pure poppycock.

But why was cold fusion so attractive?

People want cheap energy. And people don’t want to be victims of broad geopolitical struggles. They saw the promise of technology demonstrated in the fifties, sixties, and seventies in the form of a nuclear arms race. But efforts at harnessing the power of fusion for peaceful purposes were fraught with the failure of unfulfilled promises. So people wanted “cold” fusion to work because no one had solved the engineering challenges associated with traditional fusion approaches. In short, people wanted limitless power on their kitchen countertop.

The same thing is true about blockchain. People want independence. And they want to get the government and corporations out of their living rooms. More specifically, they want the “bad actors” out of their back pockets and purses. People despise banks and they despise politicians. So they desperately want a solution that dis-empowers those whom they distrust. And they are willing to trust technology instead.

But the fact that you do not trust the bankers, the corporations, or the hacker elite does not mean that you should blindly trust in the promise of an “indifferent” technology.

First, technology is not amoral. We view science as something immutable and ‘scientific’. But it is never that simple. Technology is built by people. And it is influenced by the experiences of those people. To see a vivid demonstration of this, we need only consider the WOPR (in “WarGames”), which was influenced by the pacific nature of its creator. Or we could consider Ultron (in the Marvel Cinematic Universe), who was every bit as much of a control freak as his creator.

Second, technology must be built (and maintained) by people who have money. So every technology is controlled by the masters who built it and the masters who operate it. If this were not the case, then why are intellectual property rights being enshrined in perpetuity? And we need look no further than Apple to see an example. Steve Wozniak was the builder with the idealistic vision. And Steve Jobs was the operator with the drive to control the creation and to destroy its competitors.

So if the real problem is that people don’t like others to steer their lives, then that is the problem that they should address. For some, that means going off the grid. For them, it is easier to hide from the wickedness of the world rather than live within it. For others, it means becoming one of the people who design the collars, leashes, and chains that will keep us in check. These are the inventors who wish to think abstractly and not consider the consequences of the systems that they will build. And for still others, it means that they must wield the whips, the chains, and the leashes that are restraining the people. For this final group, they seek to take advantage of the systems of control. None of these groups is inherently more right than the others.

But what does this have to do with your thesis?

Basically, I think that solutions must address the real-world problems that prompted their necessity. In the case of blockchain, it was created because people distrust bankers and governments. And as Dan Jeffries notes, the blockchain is headed for even more control by the banks and by the government. So its original reason for existence is propelling it towards the very hands that its creators had sought to disempower.

Mr. Stinchcombe, your premise is right. Blockchain is not a panacea. It won’t solve our problems with the monetary system or with accounting practices. Indeed, it may even exacerbate them. Neither will blockchain be the means of creating an independent and trustworthy voting system.

Let’s solve the problems that are before us. If our problem is a lack of trust, then let’s not assume that a piece of technology is the ‘trustworthy’ alternative. Instead, let’s work on the problems of the heart that are creating such fundamental distrust in our foundational institutions.

For example, if you don’t trust politicians, don’t blame the voting machines or the Russians. Instead, let’s leverage competition as the means of investing in alternatives. If you hate both Democrats and Republicans and you consider all of them to be tools of “the system”, then work to create a new and competitive party. Become the kind of leader that you want to follow. Write about your thoughts. Share your thoughts. And work with those who share your thoughts. Or you could become part of the flawed system itself. You can try to alter it from within. But one thing is clear: there are risks with such an approach. In particular, you could be lured into complacency and acquiesce to the very manipulation that you are seeking to change.

But whether you build a competitive system or you seek to transform the current system from within, one thing is clear: technology will never fix problems of the human heart. Our solution is not as easy as building a robot to face the challenges which we sought to avoid. Let’s do the heavy lifting of trusting others — even if they betray our trust.