A few days ago Miki sent me an article about Tanium giving prospective customers a look inside a client hospital’s live network, without the hospital’s permission and without fully protecting its identity.
I wrote her back today as follows.
I had not seen this on my own, but I have been reading about the company for a few days now.
Coming from the medtech industry, and security specifically, I will say this.
The fact that the CEO and his company used live hospital data without consent will be a death blow to them.
Hospitals take this very seriously because they are the ones who are held responsible by the Office for Civil Rights under Health and Human Services.
The hospital will be shown to have a vulnerability and will be forced to pay fines, lose out on government funds and potentially face sanctions.
As a result the rest of the healthcare industry will treat Tanium like a pariah because they will not want to face repercussions.
Regardless of the industry, it’s shocking to see how folks think it’s OK to manipulate or abuse customer relationships for their own profit; it always ends badly.
Sadly, I think they will find a way to smooth it over. Google, Facebook, etc., sell customer data all the time. It’s how so many make their money, and no one seems to care.
I know HIPAA is supposed to prevent this stuff, but I’m sure companies are getting around that, too; they just haven’t been caught yet.
That’s the key, not being caught.
Every company that is caught, or even just challenged, cries that it takes its customers’ privacy seriously, or that that’s not what its culture stands for, etc.
But only when they are caught.
I sincerely hope you are correct and that Tanium takes a major blow and, more importantly, that the CEO is forced out, but I’m not holding my breath. I guess I’ve finally gotten pretty cynical about this stuff.
So now I’m trying to decide if Miki’s cynicism is warranted or if I’m right and the publicized results of Tanium’s actions will have the effect they should.
I’ll keep you informed as there are more developments.
Nintendo’s new Switch console — think Zelda — is making news, but its unique security effort should be in the limelight, too.
Unlike Tide, whose detergent pods kids famously kept eating, Nintendo realized the console’s tiny, SD-sized game cartridges would be irresistible to kids — so its designers came up with the perfect solution.
They didn’t wait for a curious kid (and the resulting lawsuit) to choke, or even die, from swallowing one before addressing the risk.
They thought it through and spent the needed time and money to ensure that kids wouldn’t eat the cartridges in the first place.
And they succeeded.
The cartridges are coated with something that makes them taste terrible.
Terrible as in spitting them out.
“To avoid the possibility of accidental ingestion, keep the game card away from young children,” a Nintendo spokesperson told Kotaku. “A bittering agent (denatonium benzoate) has also been applied to the game card.” (The agent is non-toxic.)
Adults, too. Hilariously, it was an adult game reviewer who decided to lick the cartridge.
I put that Switch cart in my mouth and I’m not sure what those things are made of but I can still taste it. Do not try this at home.
In addition to storing the customer databases in a publicly accessible location, Spiral Toys also used an Amazon-hosted service with no authorization required to store the recordings, customer profile pictures, children’s names, and their relationships to parents, relatives, and friends.
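Spiral Toys’ misconfiguration is worth dwelling on, because probing for it takes only a few lines of code. Here is a minimal sketch, assuming an S3-style virtual-hosted bucket URL; the function names are my own and the setup is illustrative, not Spiral Toys’ actual configuration:

```python
import urllib.error
import urllib.request

def listing_url(bucket: str) -> str:
    """Build the anonymous object-listing URL for an S3-style bucket."""
    return f"https://{bucket}.s3.amazonaws.com/?list-type=2"

def is_publicly_listable(bucket: str) -> bool:
    """True if the bucket answers an unauthenticated listing request.

    A 200 response means anyone on the Internet can enumerate the
    bucket's contents; a 403 means authorization is actually enforced.
    """
    try:
        with urllib.request.urlopen(listing_url(bucket), timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False
```

The point isn’t the specific API but how low the bar is: no credentials, no exploit, just a GET request to a guessable URL.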
I had an interesting conversation today with a Director of IT Security from a large healthcare provider in Delaware who is a customer of mine.
The conversation was mostly to do with what his daily responsibilities were, how he balanced competing priorities and to gain a better understanding of his particular challenges.
I went into this meeting with the sole desire to better understand him as a person and to see how I could be of more value to him as a customer.
I did not expect to come away from the conversation with real-world examples of how culture within an organization can change over time, but I have found that when you keep your ears open, it is surprising what people will say.
Some of you may have experience with healthcare providers, either as a patient or perhaps in a business relationship. I am sure that one thing we can all agree upon is that as a rule they can be slow to adopt, adapt and mature.
This may be hardwired into the DNA of the organization. I know that when I break my leg, a doctor will put a cast on it, because that has been proven to work through millions of previous cases.
That consistency is the desired outcome, versus a doctor who decides to try a different remedy for every broken leg.
As I was speaking with my customer he said one thing that struck me. He said, “slow is smooth, smooth is fast.”
He was saying this in reference to his desire to shape the culture to be more security-conscious. However, he understood that if radical changes were made overnight, he would lose the support of the organization. Instead he was implementing incremental changes over time to effect lasting change.
Isn’t this the desired outcome?
As I think through this, there are times when radical change is needed, but typically it’s at the personal level that it is achieved.
Obvious examples being taking up exercise, limiting the amount of alcohol or taking up a new routine.
Try and push that on your friends or family overnight and good luck!
It takes time and buy-in from the group to effect lasting change.
That leaves us with a question that I do not yet have the answer to.
Last summer, Bill Marczak stumbled across a program that could spy on your iPhone’s contact list and messages—and even record your calls. Illuminating shadowy firms that sell spyware to corrupt governments across the globe, Marczak’s story reveals the new arena of cyber-warfare.
Marczak’s stumble revealed three zero-day exploits. (“Zero day” refers to the amount of time, i.e., none, that a target has to fix a previously unknown kind of hack before damage can be done.)
It’s called a jailbreak, and the ability to do it remotely is every hacker’s dream.
… the ability to hack remotely into the digital brains of the world’s most popular hardware—the desktops, laptops, tablets, and especially the mobile phones made by Apple. And not just break into Apple devices but actually take control of them. It was a hacker’s dream: the ability to monitor a user’s communications in real time and also to turn on his microphone and record his conversations.
In a superhuman effort, Apple patched all three exploits in just 10 days.
It’s an uplifting story, but the fact is Apple and other computer-makers are fighting a losing battle. As long as there are hackers, they will continue to find ways to hack any device that interfaces with them. These dangers were highlighted this fall when a New England company found itself the target of a mass denial-of-service attack from millions of non-computer “zombie devices” connected to the Internet—most notably baby monitors.
“What these cyber-arms dealers have done is democratize digital surveillance,” says the A.C.L.U.’s Chris Soghoian. “The surveillance tools once only used by big governments are now available to anyone with a couple hundred grand to spend.” In fact, they may be coming to your iPhone sometime soon.
A Friday series exploring startups and the people who make them go. Read all If the Shoe Fits posts here.
Startups love to rail against regulations, claiming they stifle innovation.
Uber and Airbnb are two of the most aggressive in fighting them, not to mention the loudest.
What do you think?
Do you believe that eliminating/diluting regulations would provide the necessary boost to bring innovations to fruition?
Uber and Airbnb brazenly ignored regulations and, when that didn’t work, took their fight to the court of public opinion, lobbied for legal change and sued.
Would eliminating regulations have made Theranos’ blood tests work and produced a better outcome for its customers?
Autonomous and semi-autonomous cars are another battlefield.
And for all its high-profile supporters, millions of people around the globe are concerned with safety — with good reason.
Obviously, regulations aren’t all bad, especially when the cost of ignoring or eliminating them could be measured in lives lost.
Regulations are something that startup CEOs need to deal with and most do.
Most, but not George Hotz.
When he received a letter from the National Highway Traffic Safety Administration, he found a third option: turn tail and run.
Comma.AI, a startup run by famous hacker George Hotz, has shut down its project dedicated to building a Tesla-like semi-autonomous driving system after a warning from the federal government. (…) The cancellation was prompted by a letter Comma.AI received from the [NHTSA], which asked the startup to provide information to ensure the product’s safety or face civil penalties of up to $21,000 a day.
Considering the product was a $1,000 DIY semi-autonomous kit, the market would likely have been huge.
It seems reasonable to me to ask for proof it was safe, just as Theranos was asked for proof.
However, unlike Theranos’ CEO, Hotz didn’t dance, blow smoke or wave mirrors — he turned tail and claimed a pivot.
Would much rather spend my life building amazing tech than dealing with regulators and lawyers. It isn’t worth it. -GH 2/3
It’s amazing to me, but looking back at more than a decade of writing I find posts that still impress, with information that is as useful now as when it was written.
Golden Oldies is a collection of what I consider some of the best posts during that time.
I wrote this Halloween post exactly 10 years ago, and the costume is even scarier today. The character described has added to its bag of tricks: hospitals, connected cars, IoT devices and ransomware, to name just a few.
Happy Halloween! In case you’ve got party plans and want to be a really scary character sans blood and guts, here’s the costume.
The costume is almost anything handy, but ratty jeans, well-worn black t-shirt, preferably with an anti-social message, worn sneakers, scruffy hair, and red-rimmed eyes is the norm; or you can go all the way over to pure designer if that’s your thing. The only necessary accessory is a laptop (or facsimile if you think you might party hard enough to lose it). That’s it, the generic (feel free to customize it) costume of one of the scariest folks cruising along today.
I’ve been writing (ranting?) about the security dangers of IoT and the connected world in general.
Security seems to be an afterthought, mostly addressed after a public debacle, as Chrysler showed when the Jeep was hacked.
GM took nearly five years to fully protect its vehicles from the hacking technique, which the researchers privately disclosed to the auto giant and to the National Highway Traffic Safety Administration in the spring of 2010.
“With several months of in-depth research on Tesla Cars, we have discovered multiple security vulnerabilities and successfully implemented remote control on Tesla Model S in both Parking and Driving Mode.”
They hacked the firmware and could activate the brakes, unlock the doors and fold the side-view mirrors.
Tesla is the darling of the Silicon Valley tech set and Elon Musk is one of the Valley gods, but it still got hacked. And the excuse of being new to connected tech just doesn’t fly.
And if connected car security is full of holes, imagine the hacking opportunities with self-driving cars.
The possibilities are endless. I can easily see hackers, or bored kids, taking over a couple of cars to play chicken on the freeway at rush hour.
Nice girls don’t say, ‘I told you so’, but I’m not nice, so — I told you so.
“You can’t just extrapolate Google cars driving ~1.5 million miles under specific conditions (weather, topology, construction, traffic, accidents around it, etc.) to usurping the ~3 trillion miles/year under all conditions in the US. 1.09 fatalities per 100 million miles is the current non-self-driving numbers.
2014 had ~30k fatal crashes out of the 3 trillion miles traveled. We have to understand not how those crashes happened, but what makes the vast majority of them not happen. Luck is not a contributor, expertise is. Understanding human expertise is the key, not human frailty.”
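The quoted figures hold together arithmetically; here’s a quick back-of-the-envelope check of my own, using only the numbers in the quote:

```python
# Sanity-check the quoted US driving statistics.
rate = 1.09            # fatalities per 100 million miles (quoted)
annual_miles = 3e12    # ~3 trillion miles driven per year (quoted)
google_miles = 1.5e6   # ~1.5 million Google test miles (quoted)

# Implied fatalities per year at the quoted rate:
implied_fatalities = rate * (annual_miles / 1e8)   # ~32,700, near the ~30k figure

# Google's test mileage as a share of one year's national driving:
share = google_miles / annual_miles                # 0.00005% of annual miles
```

That last number is the heart of the quote’s argument: 1.5 million test miles is a vanishingly small slice of the exposure being extrapolated from.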
Tech claims that security isn’t that big a problem and certainly not one that requires statutory approaches or regulation.
Two years ago Eddie Schwartz, vice president of global security solutions for Verizon’s enterprise subsidiary, said that self-driving cars will prove an irresistible target for hackers if they ever hit the roads.
Change “if” to “when.” Of course they’re irresistible; hacking and controlling a real car on a real road, with the potential of doing real damage, would be catnip to a large number of naïve kids (to prove they can), not to mention angry adults (getting even) and terrorists (creating chaos).
The cars aren’t yet able to handle bad weather, including standing water, drizzling rain, sudden downpours and snow, let alone police instructions (…) “I am decidedly less optimistic about what I perceive to be a rush to field systems that are absolutely not ready for widespread deployment, and certainly not ready for humans to be completely taken out of the driver’s seat.”