Household Certificates: The New Economic Reality

Household Certificates Everywhere
Certificate Market Share

How many of you remember your first economics class? For most, it was a macroeconomics survey course that met a behavioral and social sciences requirement. But whether you took an econ class, became an econ major, or are simply a participating member of the economy, you have likely heard about the “law” of supply and demand. [Actually, there is no “law”, per se. But there are real outcomes that are necessary results of the actions that we take.] In a market where resources are limited, increased demand for a good (or service) will almost always result in increased prices. At the same time, an increased supply of that good (or service) will drive the price lower. And when that price declines, the demand for that good (or service) will probably increase. Simple, right? The same thing is true for the car market, the computer market, and the market for household certificates (i.e., secure services in the home).

The Security Market

Most people have not yet implemented household certificates (or other security mechanisms) because the “cost” was way too high. Historically, the exorbitant cost for a good home security system meant that only those with disposable income could afford these devices (and services). Some people bypassed the initial outlay by building it into the price of a new home. That way, the costs could be distributed over fifteen (or thirty) years. But either way, the number of willing customers remained small.

The same reality is true for digital security and household certificates. You might have heard about two-factor authentication. But you may not have the skills – or the money – to implement a digitally secure household. So you have left those kinds of security steps for others to implement. Basically, you want digital security, but you can’t afford to install or support it.

Household Certificates: Mandatory…and Cheap

The times are changing. Whenever a new technology is introduced, early adopters pay excessive amounts of money to have a tool that is cool. If this weren’t the case, then how could anyone justify a $1,200 iPhone? Yes, the iPhone is cool. But you can get something similar for $800-$900. And if you bypass just a couple of features, you can get a good phone for between $300 and $500. [This is exactly what Dell did when it disrupted the desktop computer market that was previously owned by Apple and IBM.]

In security circles, the cost of security certificates (and the learning curve associated with their use) has meant that corporations would be the only users of this kind of technology. But just as the iPhone spurred cheaper competitors, the Internet security industry is also beginning to get its price disruption. You no longer have to go to the “big players” to install household hubs. You can build them yourself. And you don’t have to get certificates from the same places as the big corporations: you can get workable certificates for free from Let’s Encrypt.

You may be asking yourself why you would need security certificates. And if you don’t have any services running at home, then you may not need certificates. But if you have a Plex server, or if you use home automation, or if you have mainstream home security tools (from folks like SimpliSafe, or August, or Blink, or Netgear), then you really do need household certificates.

Why are household certificates important? Because when you connect to services at home, you will want to make sure that it is your home services that are responding to you. Without certificates, there is a real risk that someone will step in between you and your household services. Hackers do this so they can impersonate your servers – and collect valuable data directly from you. [In security parlance, this is called a man-in-the-middle attack.] By having household certificates, your servers can present secure ‘credentials’ proving that they are who they report themselves to be.
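In practice, this check is exactly what every TLS client performs during its handshake. Here is a minimal sketch using Python’s standard library; the hostname you pass in would be whatever name your household certificate actually covers:

```python
import socket
import ssl

def connect_verified(host: str, port: int = 443) -> str:
    """Open a TLS connection that refuses to proceed unless the server
    presents a certificate that chains to a trusted CA *and* matches
    the hostname we asked for."""
    ctx = ssl.create_default_context()   # loads the system's trusted CA bundle
    ctx.check_hostname = True            # these are the defaults, spelled out:
    ctx.verify_mode = ssl.CERT_REQUIRED  # both must hold to block an impostor
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()         # e.g. 'TLSv1.3' once verification succeeds
```

A man-in-the-middle who cannot produce a certificate for the requested hostname, signed by a trusted CA, causes the handshake to raise `ssl.SSLCertVerificationError` instead of quietly impersonating your server.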

Secure Authentication

Similarly, you may want to ensure that anyone trying to log into your household must present a trusted token to access the treasures inside your house. [Think of this as the digital equivalent of a front door key.]  This can be done with strong passwords. But it can also be done with digital certificates. And almost every implementation of two-factor authentication uses encryption (and certificates) to validate a user’s identity. Without certificates, the only thing that lies between your treasures and digital assailants is your password.  [Let’s hope that your password is both strong and totally unique.]
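Under the hood, most authenticator apps and hardware tokens implement the TOTP algorithm (RFC 6238): a shared secret plus the current time yields a short-lived code. A minimal sketch in Python’s standard library:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password: HMAC the time-step counter
    with the shared secret, then dynamically truncate to a short code."""
    counter = (int(time.time()) if for_time is None else for_time) // step
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)
```

With the RFC 6238 test secret (the ASCII digits 1 through 0 repeated twice) at time 59, this produces the published eight-digit test vector 94287082.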

And with Google’s recent announcement that they will be producing security tokens (i.e., the Google Titan key), the authentication market is finally being commoditized. Prices will no longer be set by only one or two vendors (like RSA or Yubico). And I am sure that other vendors will take advantage of the reduced costs that will be a necessary result of increased key production (needed to meet the Google demand).

Let’s Encrypt: Supply-side Answers

According to Wikipedia, “The Let’s Encrypt project was started in 2012 by two Mozilla employees, Josh Aas and Eric Rescorla, together with Peter Eckersley at the Electronic Frontier Foundation and J. Alex Halderman at the University of Michigan.” The first public product launch was on April 12, 2016. At the time of launch, Let’s Encrypt entered a market that was dominated by Symantec, GoDaddy, and Comodo.

The Let’s Encrypt price point is simple: zero-cost certificates. The catch is that these certificates are only good for three months. But with a little scripting (and a few tools from the EFF), the certificate refresh process is almost effortless. And Let’s Encrypt is being built into most household management systems. So with no certificate costs and decreasing skill requirements, household certificates are becoming impossible to ignore.
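Automating that refresh mostly amounts to checking how close a certificate is to expiry and invoking your ACME client (e.g., the EFF’s certbot) when it is inside the renewal window. A hedged sketch of the decision step; the date format is the one Python’s ssl module reports for a peer certificate:

```python
from datetime import datetime, timedelta

# Let's Encrypt certificates last ~90 days; renewing with 30 days
# remaining is the commonly recommended window.
RENEW_WINDOW = timedelta(days=30)

def needs_renewal(not_after: str, now: datetime) -> bool:
    """Decide whether a certificate is close enough to expiry to renew.
    `not_after` uses the notation ssl.getpeercert() returns,
    e.g. 'Sep 18 12:00:00 2018 GMT'."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return expires - now < RENEW_WINDOW
```

In a cron job, a hit from this check would trigger your ACME client; `certbot renew` performs essentially the same expiry test internally before doing any work.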

Bottom Line

If you have a little technical know-how, then now is the time to start using Let’s Encrypt on your household servers. And if you aren’t technically savvy, then expect the hardware and software providers to start bundling this security technology into their products. For them, the cost is limited. And adding real security features can only improve customer satisfaction – if it is completely frictionless.

Alexa Dominance: Who Can Compete?

Alexa Dominance
Amazon Echo devices now have a foothold in most American homes.

Voice control is the ‘holy grail’ of UI interaction. You need only look at old movies and television to see that voice is indeed king. [For example, the Robinson family used voice commands to control their robot. And Heywood Floyd used voice as his means of teaching and communicating with HAL.] Today, there are many voice assistants available on the market. These include: Amazon Alexa, Apple Siri, Google Assistant (aka Google Home), Microsoft Cortana, Nuance Nina, Samsung Bixby, and even the Voxagent Silvia.  But the real leaders are only now starting to emerge from this crowded market. And as of this moment, Alexa dominance in third-party voice integration is apparent.

Apple Creates The Market

Apple was the first out of the gate with the Apple Siri assistant. Siri first arrived on the iPhone and later on the iPad. And since its introduction, it has become available across the entire Apple i-cosystem. If you are an Apple enthusiast, Siri is on your wrist (with the watch). Siri is on your computer. And Siri is on your HomePod speaker. It is even on your earbuds. And in the past six months, we have finally started to see some third-party integration with Siri.

Amazon Seizes The Market

Amazon used an entirely different approach to entrench its voice assistant. Rather than launch the service across all Amazon-branded products, Amazon chose to first launch a voice assistant inside a speaker. This was a clever strategy. With a fairly small investment, you could have an assistant in the room with you. Wherever you spent time, your assistant would probably be close enough for routine interactions.

This strategy did not rely upon your phone always being in your pocket.  Unlike Apple, the table stakes for getting a voice assistant were relatively trivial. And more importantly, your investment was not limited to one and only one ecosystem.  When the Echo Dot was released at a trivial price point (including heavy discounts), Alexa started showing up everywhere. 

From the very outset, an Amazon voice assistant investment required funds for a simple speaker (and not an expensive smartphone). You could put the speaker in a room with a Samsung TV. Or you could set it in your kitchen. So as you listened to music (while cooking), you could add items to your next shopping list. And you could set the timers for all of your cooking. In short, you had a hands-free method of augmenting routine tasks. In fact, it was this integration with normal household chores, coupled with the lower entry price, that helped to spur consumer purchases of the Amazon Echo (and Echo Dot).

A second key feature of Amazon’s success was its open architecture. Alexa dominance was amplified as additional hardware vendors adopted the Alexa ecosystem. And the young Internet-of-Things (IoT) marketplace adopted Alexa as its first integration platform. Yes, many companies also provided Siri and Google Assistant integration. But Alexa was their first ‘target’ platform.

The reason for Alexa integration was (and is) simple: most vendors sell their products through Amazon. So vendors gained synergies with their main supplier. Unlike the Apple model, you didn’t have to go to a brick and mortar store (whether it be the Apple Store, the carriers’ stores, or even BestBuy/Target/Walmart).  Nor did a vendor need to use another company’s supply chain. Instead, they could bundle the whole experience through an established sales/supply channel.

Google Arrives Late To The Party

While Apple and Amazon sparred with one another, Google jumped into the market. They doubled down on ‘openness’ and interoperability. And at this moment, the general consensus is that the Google offering is the most open. But to date, they have not gained traction because their entry price was much higher than Amazon’s. We find this tremendously interesting, because Google got the low-price part down when they offered a $20-$30 video streamer.

But with the broader household assistant, Google focused first upon the phone (choosing to fight with Apple) rather than a hands-free device that everyone could use throughout the house. And rather than follow the pricing model that they adopted with the Chromecast, Google chose to offer a more capable (and more expensive) speaker product. So while they used one part of the Amazon formula (i.e., interoperability), they avoided the price-sensitive part of the formula.

Furthermore, Google could not offer synergies with the supply chain. Consequently, Google still remains a third-place contender. For them to leap back into a more prominent position, they will either have to beat ‘all-comers’ on price or they will have to offer something really innovative that the other vendors haven’t yet delivered.

Alexa Dominance

Amazon dominance in third-party voice integration is apparent. Not only can you use Alexa on your Amazon ‘speakers’, you can use it on third-party speakers (like Sonos). You can launch actions on your phone and on your computer. And these days, you can use it with your thermostat, your light bulbs, your power sockets, your garage door, your blinds, and even your oven. In my case, I just finished integrating Alexa with Hue lights and with an ecobee thermostat.

Bottom Line

Market dominance is very fleeting. I remember when IBM was the dominant technology provider. After IBM, Microsoft dominated the computer market. At that time, companies like IBM, HP, and Sun dominated the server market. And dominance in the software market is just as fleeting. Without continually focusing on new and emerging trends, leadership can devolve back into a competitive melee, followed by the obsolescence of the leader. Indeed, this has been the rule as dominant players have struggled to maintain existing revenue streams while trying to remain innovative.

Apple is approaching the same point of transition. Their dominance of the phone market is slowly coming to an end. Unless they can pivot to something truly innovative, they may suffer the same fate as IBM, Sun, HP, Dell, Microsoft, and a host of others.

Google may be facing the same fate – though this is far less certain. Since Google’s main source of revenue is ‘search-related’ advertising, they may see some sniping around the edges (e.g., Bing, DuckDuckGo, etc). But there is no serious challenge to their core business – at this time.

And Amazon is in a similar position: their core revenue is the supply chain ‘tax’ that they impose upon retail sales. So they may not see the same impact on their voice-related offerings. But they dare not rest upon their laurels. In candor, the Amazon position is far more appealing than the Google position. The Amazon model relies upon other companies building products that Amazon can sell. So interoperability will always be a part of any product that Amazon brands – including voice assistants. 

Only time will sort out the winners and losers. And I daresay that there is room enough for multiple ‘winners’ in this space. But for me, I am now making all of my personal and business investments based upon the continued dominance of Alexa.

Home Automation “Quest for Fire”

Home Automation

This weekend, we took another step in our home automation quest. We have used smart switches (for lamps), smart thermostats, smart music, smart cars, and even smart timers. But until Saturday, we did not have any smart lights, per se. On Saturday, we bought some Philips Hue lights (and the associated hub). That means that we now have Ethernet (i.e., wired) devices, Wi-Fi devices, and now Zigbee devices.

Is this a big deal? The answer to that is somewhat nuanced. We’ve had smart home puzzle pieces for a while. And we almost bought a Z-Wave infrastructure to put smart switches in place. But the age of our house makes this impractical. [We don’t have neutral wires on any switches in the house. And the price to refurbish these switches would be prohibitive.]  So our home automation quest stalled. But on Saturday, I could take it no more. When we went out on errands, we stopped and picked up five (5) Hue lights.

Just Add Lights

The installation and setup were simple. It took almost no time to get everything installed and paired. And within a little more than an hour, we had functioning lights in the second floor hallway and in our master bedroom. Over the next year, we can start to populate the various ceiling fans in the house. I figure that we can do this whenever we need to replace the incandescent bulbs that are currently installed. Given our current pace of replacement, I’m figuring that it will take a year or so to retrofit the house.
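Once paired, the Hue bridge exposes a small REST API on the local network. Here is a minimal sketch of the light-state call using Python’s standard library; the bridge address (192.168.1.2), the app key, and the light number below are placeholders for the values issued during pairing:

```python
import json
import urllib.request

def hue_state_request(bridge_ip, app_key, light_id, on=True, brightness=254):
    """Build the PUT request that the Hue bridge's local REST API expects
    for a light-state change (brightness runs from 1 to 254)."""
    url = f"http://{bridge_ip}/api/{app_key}/lights/{light_id}/state"
    body = json.dumps({"on": on, "bri": brightness}).encode()
    return urllib.request.Request(url, data=body, method="PUT")

def set_light(bridge_ip, app_key, light_id, **state):
    """Send the request; the bridge answers with a JSON list of
    success/error entries."""
    req = hue_state_request(bridge_ip, app_key, light_id, **state)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

A call like `set_light("192.168.1.2", "hypothetical-app-key", 1, on=True, brightness=128)` would turn light 1 on at half brightness – assuming those placeholder credentials were real.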

After getting everything installed, I started to make an inventory of our various smart home investments. As of today, we have the following pieces:

Current “On-Premises” Infrastructure

Today, we have so many physical (and logical) pieces in our home automation puzzle:

  • Network: Cisco network switch, Cisco VPN appliance, Netgear router, NordVPN proxy, Raspberry Pi ad blocking, Raspberry Pi DNS
  • Print: Hewlett-Packard printer
  • Entertainment: Plex media server (on PC desktop), Roku media player, Samsung TV, Silicon Dust HDHomeRun player
  • Storage: Synology storage, WD MyCloud storage
  • IoT: Amazon Echo Dot speakers, Huawei sensor/camera (on surplus phone), Kia Soul, Personal location / presence (on personal phones), Philips Hue lights, Raspberry Pi home automation appliance, TP-Link Kasa switches, WeightGURUS scale

Current “Off-Premises” Services

While we have lots of smart pieces in the house, we also have more than a few external cloud services providers. In most of these cases, these services allow us to extend “access” beyond the confines of our network. Our current list of services includes:

  • Lobostrategies Business: Bluehost, GoDaddy
  • Olsen Personal: Amazon Alexa, Dropbox, Google Drive, Google GMail, Home Assistant cloud, IFTTT cloud, Plex cloud, Pushbullet cloud, TP-Link Kasa cloud, WD MyCloud

So after adding yet another home automation “category” to the premises, we learned an important lesson: external access requires a measure of trust – and diligence. If you aren’t willing to secure your devices, then you must accept the consequences of an electronic intrusion.

Application Security: Yet Another Acronym as a Service (YAAaaS)

Over the past dozen or so years, we have seen the emergence of Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). In fact, there are dozens of “as a Service” acronyms. All of these terms have sprung from the service-oriented architecture (SOA) movement of the nineties. These days, I think of the ‘aaS’ acronyms as ‘table stakes’ in the competitive world of IT. You can think of them as ‘value containers’ where data and process are combined into a ‘service’ that you can purchase in the marketplace. Today, almost anything can be purchased “as a service” – including application security.

The Push Against Commoditization

I sometimes think of IT as a cathedral where the priests are consulted, birds are sacrificed, censers are set on fire, and tribute is paid to the acolytes and the priests. [Note: The notion of IT priests is not new. Eric Raymond wrote about it in “The Cathedral and the Bazaar” (a.k.a., CatB).] For those that are part of the ecclesiastical hierarchy (i.e., the tech elites), the priesthood is quite profitable. And for them, there is little incentive to commoditize the process of IT.

In the nineties, the process of IT delivery required 8A consultants – and legions of IT staffers. The end result of this kind of expensive IT has been commodity IT. Indeed, the entire migration towards outsourcing was a response (by the business) to inflexible and expensive IT. Because of this, IT has been locked in a struggle against the inevitable. As more individuals have gotten into the IT business, prices have dropped – sometimes calamitously. Consequently, IT has kept the wheel spinning by creating newer and better “architectures” that can (ostensibly) propel IT technology and services ever forward.

The Inexorable Victory of Commoditization

We are now starting to see the ‘aaS’ movement move toward higher-order functions. In the past, IT commoditized the widgets (like systems, storage, and networks). Recently, IT has transformed its own business through service management, streamlined development, and continuous process improvement. Now, businesses (and IT) are commoditizing more complex things – like applications. This includes communications (email and collaboration), sales force (e.g., SAP), procurement (e.g., SAP, Oracle, etc), operations management, service management (i.e., service desks), and even strategic planning (through data mining, business intelligence, and “Big Data” initiatives).

And today, even services such as data security, identity management, and privacy are being transformed on the altar of commoditization. In the enterprise space, you can buy appliances for DNS, for identity management, for proxy services, for firewalls, and for content management (like ad blocking and virus/malware detection). You can even go into a Best Buy and purchase the Disney Circle to ensure that your kids are safe at home.

Security and Application Security

The infrastructure components of enterprise security have been commoditized for almost two decades. And if you knew where to look, you might have found personal items (like Yubikeys) as a means of performing two-factor authentication. But now, Google is going to sell security tokens. [Note: This is just like their entry into the video streaming market with the Chromecast.] This marks the point where identity management is becoming a commodity.

At the same time, security services themselves are being commoditized. In particular, you can now deploy security systems in your house without needing any security certification (e.g., Security+, CISSP, etc). You can buy cameras, motion detectors, door/window sensors, and alarm systems either with or without contracts. The new guys on the block (e.g., SimpliSafe) and the “big boys” (like Comcast) are all getting into the business of monitoring your household – and ensuring your security.

As for me, I’ve been plugging all sorts of new application-layer security features into my infrastructure.  I added DNS security to my infrastructure through using a third-party service (i.e., Cloudflare). I implemented identity management capabilities on my site. I’ve tested and deployed two-factor authentication. And I’ve added CAPTCHA capabilities for logins, comments, and contact requests. For lack of a better term, I’m calling all of this Application Security as a Service (i.e., ASaaS).

Bottom Line

I’m not doing anything new. Indeed, these kinds of things have been part of enterprise IT for years. But as a business owner/operator, I can now just plug these things into an owned (or leased) infrastructure. I don’t need a horde of minions to build all of this. Instead, I can build security into my business by simply plugging the right application security elements into my site.

Obviously this is not yet idiot-proof. There is still a place for “integrators” who can stitch everything together. But with every passing day, I feel even more like my wife – who is a quilter. Architects design and use ‘patterns’ in order to construct the final product. The supply chain team buys commodity components (like the batting and the backing). Developers then cut out the pieces that make up the quilt top. Integrators stitch these pieces together – along with the commodity components – and take the assembled layers to someone who can machine “quilt” everything into a whole. In the end, the “quilt” (i.e., the finished product) can be completed at a tremendously reduced price.

Ain’t commoditization grand?!

Security Theater at Black Hat 2018

Wireless Security Theater

Security is a serious business. And revealing unknown flaws can make or break people – and companies. This is especially true in the healthcare industry. As more health issues are being solved through the use of  implantable technologies, security issues will become even more important. But when do “announcements” of implant vulnerabilities go from reasonable disclosure to security theater?

When my wife sent me a link to a CNBC article entitled “Security researchers say they can hack Medtronic pacemakers”, I took notice. As posted previously, I have been a cyborg since July 2002. And in 2010, I received a replacement implant. At the time, I wondered whether (or if) these devices might be hacked. After all, these devices could be programmed over-the-air (OTA). Fortunately, their wireless range was (and still is) extremely limited. Indeed, it is fair to say that these devices have only “near-field communications” capability. So unless someone could get close to a patient, the possibility of a wireless attack is quite limited.

But as technology has advanced, so too have the threats of exploitation. Given recent technology advances, there was a fair chance that my device could be hacked in the same way that NFC chips in a mobile phone can be hacked. In fact, when I cross-referenced the CNBC article with other articles, I saw a picture of the very same programmer that my cardiologist uses for me. It was the same picture (from Medtronic) that I had posted on my personal blog over eight years ago. So as I opened the link from my wife, my heart was probably beating just a little more quickly. But I was relieved to see that CNBC was guilty of succumbing to the security theater that is Black Hat Vegas.

In this case, the Black Hat demonstrators had hacked a “programmer” (i.e., a really fancy laptop that loads firmware to the implantable device). The demonstrators rightfully noted that if a ‘bad actor’ wanted to injure a specific person, they could hack the “programmer” that is in the doctor’s office or at the hospital. And when the electro-physiology tech (EPT) did a “device check”, the implanted device (and the patient) could be harmed.

This is not a new risk. The programmer (i.e., laptop) could have been hacked from the very start. After all, the programmer is just a laptop with medical programs running on it. All in all, it is nothing fancy.

The real risk is that more and more device-assisted health treatments will emerge. And along with their benefits, these devices will come with some risks. That is true for all new technologies – whether medical or not. There is a risk of bad design, or software bugs, or poor installation, or inattention to periodic updates. And there is a risk that this technology might be exploited. Of course, the fact that a pacemaker might be subject to failure during an EMP does not mean that the device should never be used.

It’s just a risk.

Fortunately, this is no different than the countless number of risks that we take every day. We trust car designers, driving instructors, other drivers, and even the weather forecasters whenever we drive our cars. And the fact that our cars are run by computers – and can therefore be hacked – doesn’t keep us from driving.

Let’s leave the security theater in Vegas. And let’s leave the paranoia to professionals – like Alex Jones.


Browser Security Bypasses Abound

Browser Security At Risk

Browser Security Threats Discovered

According to researchers at the Catholic University of Leuven (KU Leuven) in Belgium, every modern browser is susceptible to at least one method of bypassing browser security and user privacy. In an article on the subject, Catalin Cimpanu (of BleepingComputer) reported that new (and as yet unexploited) means of bypassing cookie controls are apparently possible. The KU Leuven researchers reported their findings to the browser developers and posted their results at wholeftopenthecookiejar.eu.

Don’t expect all browser vendors to solve all browser security issues immediately. Indeed, expect many people to howl about how these vulnerabilities were reported. But regardless of the manner in which the news was delivered, every customer must take it upon themselves to implement multiple layers of protection. A comprehensive approach should (at a minimum) include:

  1. A safe browser,
  2. Safe add-ons (or extensions) that include cookie and browser element management (e.g., uBlock Origin, NoScript, and uMatrix),
  3. A means of reducing (and possibly eliminating) Javascript, and
  4. Effective blocking of “well-known” malware domains.
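The last layer, blocking well-known malware domains, comes down to a simple matching rule that Pi-hole-style DNS blockers apply: a hostname is blocked if it, or any parent domain of it, appears on the blocklist. A minimal sketch (the blocklist entries are invented placeholders):

```python
BLOCKLIST = {"tracker.example", "malware.example"}  # hypothetical entries

def is_blocked(hostname: str) -> bool:
    """True if the hostname, or any parent domain of it, is on the
    blocklist, so 'ads.tracker.example' is caught by 'tracker.example'."""
    labels = hostname.lower().rstrip(".").split(".")
    # Check every suffix of the name: full host, then each parent domain.
    return any(".".join(labels[i:]) in BLOCKLIST for i in range(len(labels)))
```

A resolver (or proxy) that consults this check before answering can simply refuse, or return an unroutable address, for anything that matches.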

Bottom Line

Shrek was right. Ogres are like onions – and so is security. Effective security must include multiple layers. Be an ogre; use layers of security.

Browser Security: Who Do You Trust?

Browser Security Defended by Mozilla

So you think that you are safe. After all, you use large, complex, and unique passwords everywhere. You employ a strong password safe/vault to make sure that your passwords are “strong” – and that they are safe. At the same time, you rely upon multi-factor authentication to prove that you are who you say that you are. Similarly, you use a virtual private network (VPN) whenever you connect to an unknown network. Finally, you are confident in your browser security since you use the “safest” browser on the market.

Background

Historically, geeks and security wonks have preferred Mozilla Firefox. That’s not just because it is open source. After all, Google Chrome is open source too. It’s because Mozilla has a well-deserved reputation for building a browser that is divorced from an advertising-based revenue stream. Basically, Mozilla is not trying to monetize the browser. Unlike Chrome (Google) and Edge (Microsoft), Firefox doesn’t have an advertising network that must be “preferred” in the browser. Nor does Firefox need to support ‘big players’ because they are part of a business arrangement. Consequently, Firefox has earned its reputation for protecting your privacy.

But as Robert “Bobby” Hood has noted, the browser that you choose may not make much difference in your browser security posture. He wrote more bluntly; he said, “[Browser difference] …doesn’t matter as much as you may think… Is it important which browser we use? Sure, but with a caveat. Our behavior is far more important than nitpicking security features and vulnerabilities.” He is right. There are far more effective means of improving security and ensuring privacy. And the most important things are your personal practices. Bobby said it best: “Would you park your Maserati in a bad part of town and say, ‘It’s okay. The doors are locked!’ No. Because door locks and alarm systems don’t matter if you do dumb things with your car.”

What Have You Done For Me Lately?

It is always good to see one of the browser creators take positive steps to improve the security of their product. On August 16th, Catalin Cimpanu highlighted the recent (and extraordinary) steps taken by Mozilla. In his article on BleepingComputer (entitled “Mozilla Removes 23 Firefox Add-Ons That Snooped on Users”), he detailed the work of Mozilla’s addons.mozilla.org (AMO) team. In particular, they researched hundreds of add-ons and determined that twenty-three (23) of them needed to be eliminated from AMO. Mozilla removed the following browser plugins from AMO [Note: the list includes, but isn’t limited to, these]:

  • Web Security
  • Browser Security
  • Browser Privacy
  • Browser Safety
  • YouTube Download & Adblocker Smarttube
  • Popup-Blocker
  • Facebook Bookmark Manager
  • Facebook Video Downloader
  • YouTube MP3 Converter & Download
  • Simply Search
  • Smarttube – Extreme
  • Self Destroying Cookies
  • Popup Blocker Pro
  • YouTube – Adblock
  • Auto Destroy Cookies
  • Amazon Quick Search
  • YouTube Adblocker
  • Video Downloader
  • Google NoTrack
  • Quick AMZ

Mozilla also took the extraordinary step of ‘disabling’ these add-ons for users who had already installed them. While I might quibble with such an ‘authoritarian’ practice, I totally understand why Mozilla took all of these actions. Indeed, you could argue that these steps are no different than the steps that Apple has taken to secure its App Store.

Bottom Line

In the final analysis, browser security is determined by the operation of the entire ecosystem. And since very few of us put a sniffer on the network whenever we install a plugin, we are forced to “trust” that these add-ons perform as documented. So if your overall browser security is based upon trust, then who do you trust to keep your systems secure? Will you trust companies that have a keen interest in securing ‘good’ data from you and your systems? Or will you trust someone who has no such vested interests?

DNS Security: The Final Chapter, For Now

DNS Security Challenges

As a man of faith, I am often confronted with one sorry truth: my desires often exceed my patience. So it was with my extended DNS security project. I have written three out of four articles about DNS security. But I have taken a detour from my original plan.

The first article that I wrote outlined the merits of using the Trusted Recursive Resolver that showed up in Firefox 61. I concluded that the merits of encrypting DNS payloads were obvious and the investment was a necessary one – if you want to ensure privacy. The second article outlined the merits (and methods) of using DNS-over-HTTPS (DoH) to secure references to external referrers/resolvers. In the third article, I outlined how I altered my DNS/DHCP infrastructure to exploit dnsmasq.
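As a concrete illustration of DoH, Cloudflare’s resolver accepts DNS questions as ordinary HTTPS requests and can answer in JSON, so the lookup rides inside an encrypted channel. A minimal sketch using Python’s standard library (the hostname being resolved and the answer shown are just examples):

```python
import json
import urllib.parse
import urllib.request

DOH_ENDPOINT = "https://cloudflare-dns.com/dns-query"

def doh_request(name, rtype="A"):
    """Build a DNS-over-HTTPS query against Cloudflare's JSON API:
    an ordinary HTTPS GET with an 'application/dns-json' Accept header."""
    query = urllib.parse.urlencode({"name": name, "type": rtype})
    return urllib.request.Request(f"{DOH_ENDPOINT}?{query}",
                                  headers={"Accept": "application/dns-json"})

def parse_answers(payload):
    """Pull the record data out of a JSON DoH response body."""
    return [answer["data"] for answer in json.loads(payload).get("Answer", [])]
```

Passing the built request to `urllib.request.urlopen` performs the lookup; an on-path observer sees only TLS traffic to the resolver, not the names you are asking about.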

That left me with the final installment. The original plan was to outline how I had implemented Unbound as the final means of meeting all of my DNS security requirements. Basically, I had to explain why I would want something more than a simple DNS forwarder. That is a fair question. But to answer it, I need to provide a little background.

DNS Background

The basic DNS infrastructure is a hierarchical data structure that is traversed from top to bottom (or right to left when reading a domain name). When a client wants to know the IP address of a particular device, the root of the hierarchy is consulted first to find the servers for the top-level domain (TLD). So if you are looking for www.lobostrategies.com, you must first ask the root which servers are authoritative for ‘.com’. The authoritative server for the ‘.com’ zone contains a reference to the authoritative DNS server for lobostrategies.com (i.e., GoDaddy).

The next step is to search the authoritative domain server for the address of the specific server. In my case, GoDaddy would be queried to determine the address for www.lobostrategies.com. GoDaddy would either answer the question or send it to a DNS server supporting the next lower level of the domain hierarchy. Since there are no subdomains (for lobostrategies.com), GoDaddy returns the IP address.
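To make the traversal concrete, here is a minimal sketch (in Python) of a resolver walking a mock hierarchy from the root down. The zone data and the 203.0.113.10 address are invented for illustration; a real resolver issues a network query at each step.

```python
# Illustrative walk of the DNS hierarchy. All zone data below is mocked up
# for demonstration; 203.0.113.10 is a documentation-only address.
MOCK_ZONES = {
    ".": {"com.": "ns.com-registry.example"},                  # root refers us to .com
    "com.": {"lobostrategies.com.": "ns.godaddy.example"},     # .com refers us to the registrar
    "lobostrategies.com.": {"www.lobostrategies.com.": "203.0.113.10"},  # authoritative answer
}

def resolve(name):
    """Walk from the root down to the authoritative zone, one label at a time."""
    labels = name.rstrip(".").split(".")
    zone = "."
    answer = None
    for i in range(len(labels) - 1, -1, -1):
        child = ".".join(labels[i:]) + "."
        answer = MOCK_ZONES.get(zone, {}).get(child)  # ask the current zone about the next label
        zone = child
    return answer

print(resolve("www.lobostrategies.com"))  # walks root -> com -> lobostrategies.com
```

Each iteration mimics one referral: the root hands back the ‘.com’ servers, ‘.com’ hands back GoDaddy, and GoDaddy answers.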

The ISP Advantage

The process of searching from the root to the proper branch (that answers the query) is called recursive searching. And it is the heart of how DNS works. But this burden is not carried by every user. Can you imagine if every user queried the top-level domain servers? It would be an incredible volume of queries. Instead, the results of most queries are stored (i.e., cached) at lower levels of the tree. For example, companies like your cable ISP (or Google, or Cloudflare, or OpenDNS) will be your ‘proxy’ for all requests between you and the host name that you want to resolve into an IP address.

Your ISP has the results of almost every top-level domain query already stored in its cache. So your answer is delivered with at least one fewer step than if you had asked the question yourself. And since most public DNS resolvers have massive result sets already cached, you would never have to go to GoDaddy to get the IP address for my website. Rather than issuing a query to the root and a query to GoDaddy, your ISP can just provide the address directly to you – cutting your name-search activity in half. This is why most users consult a DNS service that does the searching for them.
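The mechanics of that caching are simple: every answer carries a time-to-live (TTL), and the resolver serves it from memory until the TTL runs out. A minimal sketch (the name and address are examples only):

```python
# Minimal sketch of a caching resolver's core: answers are stored with a TTL
# and served from memory until they expire.
import time

class DnsCache:
    def __init__(self):
        self._store = {}  # name -> (address, absolute expiry time)

    def put(self, name, address, ttl):
        self._store[name] = (address, time.monotonic() + ttl)

    def get(self, name):
        entry = self._store.get(name)
        if entry is None:
            return None            # cache miss: a real resolver would now recurse
        address, expires = entry
        if time.monotonic() >= expires:
            del self._store[name]  # expired: must be re-fetched from upstream
            return None
        return address

cache = DnsCache()
cache.put("www.lobostrategies.com", "203.0.113.10", ttl=3600)
print(cache.get("www.lobostrategies.com"))
```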

Hidden Costs

But think about what it would cost you to do the searching yourself. The first time you query a domain, it takes time (and cache memory). But after those one-time ‘charges’ are paid, isn’t running my own recursive DNS a low-cost investment? Yes it is, and no it isn’t. If you run your own recursive DNS server, you have to pay the entry costs (in hardware and in slower initial resolve times). But the real cost of DNS is the hidden cost of privacy.

If you ‘trust’ someone else to do this for you, then they have access to all of your DNS queries. They know who you are and what you are trying to see. They won’t know what you saw in your query to a specific site. But they will know that you wanted to know where a particular site could be found.

Bottom line: to use a third-party DNS resolver, you are necessarily letting that service provider know about your probable website destinations.

By running your own recursive DNS, you only let the domain owner for a site know that you are looking for their address. Google would only get query data when you intended to connect to Google services. So Google would see only a subset of your DNS queries – thereby improving your DNS privacy.

On the flip side, you really do want a service that will encrypt the DNS payload. Recursive DNS tools (like the Unbound tool in DD-WRT and Pi-hole) do not yet support robust encryption for their recursive queries. Indeed, at the time of this writing, only two major DNS providers support DoH (i.e., Google and Cloudflare). By choosing to run a recursive DNS that you manage yourself, you limit your ability to mask DNS requests as they move across the Internet. In practice, this means a higher risk of being exploited by a man-in-the-middle (MITM) DNS attack – including things like DNS spoofing.

The Choice

So I was faced with a simple choice: 1) I could implement a solution with encryption to a trusted recursive DNS provider, or 2) I could pay the upfront price of running my own recursive DNS. When I started to write this series of articles, I was feeling very distrustful of all DNS providers. So I was leaning towards running my own recursive DNS and limiting the search data that my selected provider could exploit. But the more I thought about it, the more I questioned that decision. Yes, I don’t trust companies to place me above their bottom line. And I don’t want the ‘gubmint’ to have a choke point that it could exploit. After all, didn’t the 2016 presidential campaign demonstrate that both parties want to weaponize information technology?

But the fear of all companies and all politicians is a paranoid conceit. And I don’t want to be the paranoid old man who is always watching over his shoulder. More importantly, the real challenge / threat is the proven risk that script-kiddies, hackers, and criminals might target my data while it is in transit. So as I compared a paranoid fear versus a real fear, I started moving towards desiring encrypted DNS queries more than limiting third-party knowledge of my queries.

The Choice Deferred

Just as I was about to implement changes based upon this reassessment, I inadvertently shot myself in the foot. I was listening to a podcast about information security (i.e., Security Now by Steve Gibson) and heard about a resurgence of router-based password exploits. I had long ago switched to a password manager, so I wanted more entropy in my randomized password. I checked online to see if there were any password restrictions for DD-WRT. I found nothing. So I figured that if the software didn’t like a password, it would stop me before implementing the change.

I plunged ahead and created a 64-character randomized password. The software took the change – even validating the password against a second entry. But when I went to log back in to the router, my shiny new password was rejected.
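For context, here is a rough sketch of what that 64-character password was buying me in entropy terms. It assumes the password manager draws from the 94 printable ASCII symbols – an assumption on my part, but a common default:

```python
# Sketch: generate a random password of a given length and estimate its entropy.
# 64 characters drawn from ~94 printable symbols is roughly 64 * log2(94) ~= 419 bits.
import math
import secrets
import string

PRINTABLE = string.ascii_letters + string.digits + string.punctuation  # 94 symbols

def make_password(length, alphabet=PRINTABLE):
    # secrets (not random) is the right module for security-sensitive choices
    return "".join(secrets.choice(alphabet) for _ in range(length))

def entropy_bits(length, alphabet_size):
    return length * math.log2(alphabet_size)

pw = make_password(64)
print(len(pw), round(entropy_bits(64, len(PRINTABLE))))
```

Even the twenty-two character fallback discussed below clears 128 bits, which is far more than any realistic attack on a router login requires.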

Wash, Rinse, Repeat

I was getting frustrated. I looked to see if there was any way to revert to an older password. But there was no such capability. The only way to log back into my router was to factory-reset the device – which I did. It took a very long time to recover (~2.5 hours). So after a few hours, I was back to where I started.

Before I tried again, I backed up the non-volatile memory (nvram). Then I decided to try a shorter password. After failing with the 64-character password, I tried a 32-character password. Unfortunately, it resulted in an inaccessible router. So I restored my router and then I was back to the starting line, again.

After researching the issue for forty-five minutes, I found someone who had run into the same problem. They had solved it by using a twenty-two (22) character password. So I earnestly tried an eighteen (18) character password. I was hopeful; my hopes were crushed. But I was getting good at doing the factory reset. So after three attempts and almost five (5) hours of effort, I went back to the old password format that I had used before. Lo and behold, it worked.

Overall DNS Security Improvements

After spending the bulk of an evening on this effort, I was glad to be back on track. But I had a fresh appreciation for doing as little as possible to my router and my DNS infrastructure. I already had a working DNS that used DoH to communicate with Cloudflare. And I had done enough research to be less skeptical of the Cloudflare DNS (when compared to ISP DNS and Google DNS).

I now have a DNS service separated from my router. And the DNS and DHCP systems are running on a unified system – thus making reverse-lookups and local queries far more effective. Finally, I have a DNS request facility that should be more secure against man-in-the-middle attacks. So without much more fanfare, I will call this DNS security battle a victory – for now. And I will keep my eyes wide open – both against corporate/government exploitation and against self-inflicted computing wounds!

Router Security: Another One Bites The Dust

Poor Router Security Assists Cryptojacking
Hackers are often successful because their victims are not very vigilant.

One thing that hackers have learned is that most people don’t update the software on their devices. This includes users failing to implement fixes that improve router security and close router vulnerabilities. Last week, we learned of yet another example where hackers exploited a ‘solved’ vulnerability to inject malware onto systems. In this case, bad actors were using MikroTik routers as a means of spreading the Coinhive malware.

But as is so often the case these days, the malware did not just exploit inadequate router security practices. It used the compromised routers to rewrite web pages in order to propagate the coin-mining software to unsuspecting sites/users. As dreadful as this sounds, MikroTik had a patch for this after their last serious exploit. It’s too bad that the patch was never pushed to their customers’ devices. If this had been done automatically (or if the customers had done it for themselves), then there would never have been the most recent Coinhive exploit.

Similarly, if the users had ensured secure connections to all web sites (by using https), then there is a good chance that the compromised sites (and connections to other distrusted sites) might have been noticed. In addition, if users had blocked active scripting from within their browsers, then Coinhive would never have gained a foothold.

The solutions to these problems are relatively simple.

Take These Steps:

  1. If your router supports automatic updating, then activate that feature. Don’t wait to turn automatic updating on. Do it now!
  2. Always use https when accessing any web site. This would have hindered the propagation of the infection. The Electronic Frontier Foundation (EFF) has a good tool to ensure this called HTTPS Everywhere.
  3. Disable scripting for all sites EXCEPT those that you trust. You can do this by using tools like NoScript or uMatrix. 
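Step 2 amounts to never letting a plain-http URL through. A naive sketch of the scheme upgrade that HTTPS Everywhere performs – the real extension uses curated per-site rulesets, while this version just swaps the scheme:

```python
# Conceptual illustration of what HTTPS Everywhere does: upgrade plain-http
# links to https before the request is made. The real extension is smarter,
# using per-site rulesets; this naive sketch rewrites every http URL.
from urllib.parse import urlsplit, urlunsplit

def upgrade_to_https(url):
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")  # leave https/ftp/etc. alone
    return urlunsplit(parts)

print(upgrade_to_https("http://www.lobostrategies.com/blog"))
```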

Finally, we are now aware that routers are a common vector for hackers to exploit. That’s because everyone has a router and very few routers have automatic updating capabilities. Knowing that very few people take the time to update their own routers, router vendors should enable automatic updates by default – unless deactivated by the user. By the way, this is what Microsoft finally did to address some dramatic security weaknesses in the Windows operating system.

Don’t rest upon your past efforts to protect your assets. And whatever you do, don’t be the slowest gazelle. Update your infrastructure.

DNS Security At The Edge

DNS Security From The Edge

In my third installment of this series about DNS security, I am focusing on how we can audit all DNS requests made in our network. In the second installment, I focused on secure communications between DNS servers. And I highlighted the value of DNSSEC and the potential of DNS-over-HTTPS (DoH). I outlined some of the changes that we’ve made – including the deployment of DNSMasq on our router and on our Pi-hole. Today, I will focus upon ensuring that end-to-end auditing is possible – if desired.

As noted previously, we upgraded our router (i.e., a Netgear R8000 X6) from stock firmware to DD-WRT. We did this for numerous reasons. Chief among them was the ability to set up an OpenVPN tunnel from our network out to our VPN provider’s endpoints. But a close second was a desire to run DNSMasq on the router. Since we haven’t chosen to move DHCP functions to the Pi, we wanted a DHCP service with better capabilities. More importantly, we wanted to set DHCP option 6 so that we could control which name servers our clients would use. In our case, we knew that all clients would use our Pi-hole as their name server.

After we realized that DNS options on the admin panel didn’t apply to DHCP clients, we figured out how to set name servers for all of our DHCP clients. Once done, all clients began direct interaction with the Pi-hole. [They had previously used the router as a DNS proxy to our Pi-hole system.]
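For reference, the change boils down to a single dnsmasq directive in the ‘Additional DNSMasq Options’ box on DD-WRT. The address below is an example only – substitute your own Pi-hole’s address:

```text
# Hand out the Pi-hole (192.168.1.2 here -- an example address) as the DNS
# server via DHCP option 6, instead of the router's own address.
dhcp-option=6,192.168.1.2
```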

But as is often the case, there were unforeseen consequences. Specifically, reverse lookups (for static addresses) failed. This meant that DNS security would suffer because we couldn’t correlate elements across the entire request chain. We could have moved dhcpd to the Pi-hole. But we wanted to have a DNS fall-back – just in case the Pi-hole failed. So we changed our process for assigning static addresses. Now we add them to the router as well as to the /etc/hosts file on the Pi-hole. Once implemented, we had clear visibility between request origination and request fulfillment. [Note: We will probably turn this off as it defeats the very anonymity that we are trying to establish. But that is for another day.]
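The /etc/hosts side of that process is just a table of static assignments. The names and addresses below are invented for illustration – dnsmasq reads this file, so both forward and reverse lookups for these hosts resolve locally:

```text
# /etc/hosts on the Pi-hole -- names and addresses are examples only.
192.168.1.1    router.lan
192.168.1.2    pihole.lan
192.168.1.50   nas.lan
```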

So what’s left?  In the first two articles, we set up DNS authentication wherever possible. We also established DNS payload encryption – using DNS-Over-HTTPS. Now we have a means of authenticating DNS server communications. And we have an encrypted payload. But there is one last need: we have to limit the ‘data at rest’ stored by each DNS resolver.
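As a sketch of what that DoH encryption actually wraps (per RFC 8484): the client builds an ordinary DNS wire-format query, base64url-encodes it without padding, and sends it as the ‘dns’ parameter of an HTTPS GET. The snippet below only constructs the URL – nothing goes over the network:

```python
# Build the URL a DoH client would GET, per RFC 8484. The payload is an
# ordinary DNS query packet, base64url-encoded with padding stripped.
import base64
import struct

def build_query(name, qtype=1):  # qtype 1 = A record
    # Header: ID 0 (RFC 8484 recommends it for cache-friendliness),
    # flags 0x0100 (recursion desired), one question, no other records.
    header = struct.pack("!HHHHHH", 0, 0x0100, 1, 0, 0, 0)
    # QNAME: each label prefixed by its length, terminated by a zero byte.
    qname = b"".join(struct.pack("!B", len(p)) + p.encode() for p in name.split(".")) + b"\x00"
    return header + qname + struct.pack("!HH", qtype, 1)  # QTYPE, QCLASS=IN

def doh_url(name, endpoint="https://cloudflare-dns.com/dns-query"):
    dns = base64.urlsafe_b64encode(build_query(name)).rstrip(b"=").decode()
    return f"{endpoint}?dns={dns}"

print(doh_url("www.lobostrategies.com"))
```

The whole packet then rides inside TLS, so an on-path observer sees only an HTTPS connection to the resolver.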

Consider this: If we validate our connection to Google, and we encrypt the traffic between us, then Google can still look at all of the DNS queries that it receives from us. That is fine if you trust your principal DNS resolver. But what if you don’t? A more secure process would be to ask for name resolution directly, rather than through a trusted (or un-trusted) intermediary. In my next article, I will discuss how we implemented a recursive DNS resolver alongside our Pi-hole. Specifically, I’ll talk about using Unbound to limit the amount of ‘data at rest’ that we leave with any single DNS service.
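As a preview of where that is headed, a local recursive resolver running beside a Pi-hole is typically configured along these lines. This is a sketch, not my final configuration – the 127.0.0.1:5335 listener is a convention from common Pi-hole guides, not a requirement:

```text
# Minimal unbound.conf sketch for a recursive resolver beside Pi-hole.
# Pi-hole would forward its upstream queries to 127.0.0.1#5335.
server:
    interface: 127.0.0.1
    port: 5335
    do-ip4: yes
    do-udp: yes
    do-tcp: yes
    harden-glue: yes               # reject out-of-zone glue records
    harden-dnssec-stripped: yes    # require DNSSEC data where it is expected
    prefetch: yes                  # refresh popular entries before they expire
```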