Better Credit Card Security

While talking with a friend who is enduring some unpleasantness, the conversation turned to the hazards of using credit cards to buy things, like food for example. That got me thinking: how would I design a really strong way to prevent data breaches?

Encrypt everything!

Well, perhaps not that, but hash everything. Here’s what I talked myself into. Of course none of this is realistic, because nobody is going to effect a planet-wide shift in payment processing based on what this yokel has to say, but still, here goes.

The issuing bank sets up the credit account. There are four key fields important to the classic transaction: name, number, expiration date, and CVV2. I think one could also establish a time-based one-time-password (TOTP) secret, operating the way Google Authenticator does. You’d need a secret that the bank generates and shares between its systems and the physical card, and the card would need a smart chip so it could forward the current TOTP code to the credit terminal at the point of sale.

The bank sets up a TOTP secret, named JQP Credit Card (or the account number, or whatever), and the secret is: 6B57078FB88A4DD73E447D2647DCEC7D04C3D887951BA6A2D8DBA294E0B60579. The card derives a six-digit code from this secret and forwards it to the credit card terminal. Right now the code is 726995, but in thirty seconds it’ll be something else. Since the credit card terminal and the bank share synced time via time.nist.gov, there is no real risk of a mismatch between the two.
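
As a sketch of what that calculation looks like, here is a minimal RFC 6238 TOTP in Python, treating the hex string above as raw key bytes (real deployments usually hand out base32 secrets, so consider this illustrative):

    import hashlib
    import hmac
    import struct
    import time

    # The example secret from above, interpreted as raw key bytes.
    SECRET = bytes.fromhex(
        "6B57078FB88A4DD73E447D2647DCEC7D04C3D887951BA6A2D8DBA294E0B60579"
    )

    def totp(secret: bytes, timestamp: float, step: int = 30, digits: int = 6) -> str:
        """RFC 6238: HMAC-SHA1 over the 30-second counter, dynamically truncated."""
        counter = struct.pack(">Q", int(timestamp // step))
        digest = hmac.new(secret, counter, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    print(totp(SECRET, time.time()))  # six digits now, six different digits in 30 seconds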

The customer goes to the credit card terminal and swipes, a value is entered, and a timestamp is recorded; all of this is already part of a credit transaction. The terminal reads the name, expiration, and CVV2 from the magnetic stripe, the smart chip forwards the TOTP code, and the terminal assembles it all into an EDI transaction:

JOHN/Q/PUBLIC#1111222233334444#1015#170#726995 and applies SHA-256 to it to create:

621d3dd5a66277a7ab3737f306728e3c4bc5f3cd20c8730c37cc61c6575de0ba
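
A minimal sketch of that hashing step in Python (the field values are the ones from this post, and the digest above is illustrative):

    import hashlib

    # name#number#expiration#CVV2#TOTP, exactly as the terminal assembled it
    edi = "JOHN/Q/PUBLIC#1111222233334444#1015#170#726995"
    print(hashlib.sha256(edi.encode()).hexdigest())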

This is stored in a database and then forwarded to the bank with the amount and the timestamp, so it’ll look like this:

987654321#621d3dd5a66277a7ab3737f306728e3c4bc5f3cd20c8730c37cc61c6575de0ba#15.09#1426615839

So the bank will be presented with a customer ID, a SHA-256 hash, the total dollar amount, and the epoch time: the number of seconds since 00:00:00 UTC, January 1, 1970. Any Unix-ish shell hands you this for free; on Linux, date +%s prints the current epoch time, and the BSD/macOS form date -j -f "%a %b %d %T %Z %Y" "<date string>" "+%s" converts an existing human-readable timestamp.

The bank would then have everything it needs: the secret key, which combined with the epoch time from the transaction yields the TOTP calculation and the answer 726995, plus the card details looked up by customer ID, the SHA-256 hash, and the amount. It could then calculate the hash on its own:

621d3dd5a66277a7ab3737f306728e3c4bc5f3cd20c8730c37cc61c6575de0ba

And authorize the transaction.
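
Here is a sketch of that bank-side check, with the caveat that CARDS is a hypothetical stand-in for the bank’s card registry and totp() is the function from the earlier sketch:

    import hashlib
    import hmac

    # Hypothetical card registry keyed by customer ID; really the bank's core system.
    CARDS = {
        "987654321": {
            "name": "JOHN/Q/PUBLIC",
            "number": "1111222233334444",
            "expiry": "1015",
            "cvv2": "170",
            "secret": bytes.fromhex(
                "6B57078FB88A4DD73E447D2647DCEC7D04C3D887951BA6A2D8DBA294E0B60579"
            ),
        }
    }

    def authorize(customer_id: str, presented_hash: str, epoch: int) -> bool:
        """Recompute the terminal's hash from the bank's own records and compare."""
        card = CARDS[customer_id]
        code = totp(card["secret"], epoch)  # totp() from the earlier sketch
        edi = f"{card['name']}#{card['number']}#{card['expiry']}#{card['cvv2']}#{code}"
        expected = hashlib.sha256(edi.encode()).hexdigest()
        return hmac.compare_digest(expected, presented_hash)  # mismatch: kill the card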

Even if the card details were stolen by someone copying the numbers off the card, they wouldn’t get the TOTP secret, and the code it produces changes every 30 seconds. If someone tried to run this transaction and guessed at the TOTP code, they’d generate this:

987654321#a1b714fba988632200c78a5b9021bca5b48f149b036aa901c03173f0f2de5399#15.09#1426615839

and the bank would instantly detect the incorrect SHA hash, cancel the card, and ship a new one.

This is rather involved, but the practical upshot is this: if a vendor kept these transactions in a database and someone stole that database for their own nefarious needs, the TOTP and SHA-256 would make the data worthless, because the TOTP has no predictable pattern if you don’t know the secret, and SHA-256 is very sensitive to even the smallest change in its input. This would free vendors, banks, and customers from the risk of PII leakage and identity theft.

I’ve also thought that this would be a great way to secure SSNs for use with the government. They know your SSN and you know your SSN, so when communicating over a possibly compromised channel you can authenticate not with your SSN, but with the hash of your SSN.

John Q. Public, 123-45-6789 -> 01a54629efb952287e554eb23ef69c52097a75aecc0e3a93ca0855ab6d7a31a0
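
The same one-way transformation, sketched in Python with the example value from above:

    import hashlib

    ssn = "123-45-6789"  # the example value from the post, not a real SSN
    print(hashlib.sha256(ssn.encode()).hexdigest())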

iOS 7 and iMessage

After I upgraded to iOS 7 on my iPhone 5 I ran into a really annoying problem. Whenever I sent iMessage messages to friends and family, the message would look like it was sending, then the progress bar along the top would stop about 1/4 inch from the end and just stay there for hours, never sending the message. I tried the Kung-Fu Grip to only partial avail. The solution is to reset the network settings on the iPhone:

Tap Settings
Tap General
Tap Reset
Tap Reset Network Settings

Once the phone resets, you re-enter your Wifi settings and turn on all the cellular bits, like voice and data roaming (at least for me), and after that everything works as it should.

If this helps you, please let me know. 🙂

Private Dancer

Several days ago, while I was pondering an issue we’ve had at work, an epiphany struck me. The problem we ran into is that our local network is a box of question marks. We don’t really know how it’s assembled or what the rules are for using it; we just plug cables into wall jacks, and if things work, they work. Until they don’t.

Enter NetInstall and NetRestore. These are the two imaging technologies for the Macintosh, and I’ve assigned my coworker to explore them and develop images; frankly, he self-started it and I encouraged his exploration. Both operations use a lot of bandwidth on the network, and we eventually ran into a lot of problems. Not only did the machine we were working on take forever, but the process bogged down the server and caused huge headaches for everyone. We came to the conclusion that our local network just isn’t designed to carry any payload of appreciable size. That’s not really a complaint, more of a characterization. It’s kind of fragile and wimpy.

So, was there a way we could still use Ethernet technology without having to depend on our “provided” fragile and weak network? I sat in my chair pondering it all, knocking some options out instantly because of the machines we have. We can’t really depend on IP-over-FireWire, since we have plain-jane MacBooks in the mix; they don’t have FireWire ports, just Ethernet ones. Then, as I looked across the way at all the server technology in the rack, it struck me: each machine, including the lowly Drobo, had two Ethernet ports. Huh. Two. Only one was being used to connect each machine to the network, so each one had a secondary port available.

I rooted around in my junk bin and found an old unused Netgear Ethernet switch, a five-port model, no fuss, no muss. I grabbed a gaggle of short Ethernet cables and started hooking all my servers to this little spare switch. Everything worked out magnificently well. On each server I configured the second port for 192.168.0.* and assigned a manual IP address. Then I found an unused Apple AirPort Express, plugged it in, set it for bridge mode, and now I can extend this private network over Wifi using 802.11n, which is nice and fast. Just like that, cake and eating it too!

What’s great about this setup is that my coworker and I can move large batches of data between these machines without having to worry about clogging up the network for all the other users who are trying to use these servers for their real work. Their files are small and their use sporadic; our use is large and nearly (sometimes) constant. The cost is just a few more blinking lights in the rack and a little more spaghetti wiring hither and yon, but I don’t care; it works, and it was free with the parts I already had on hand. The only part of this that upsets me is that I didn’t think to do it sooner. I suppose I should take some solace that it’s better late than never. Having this private access to all the systems makes both of our lives much better. We don’t have to complain to central networking anymore because we’ve abandoned their fragile, wimpy thing for a far better in-house solution, and because it’s unroutable, we didn’t break one single rule. Mind you, we don’t know what the rules are, but still. 🙂

It’s a good Friday.

FitBit

My FitBit failed yesterday. The poor thing no longer wants to work; when I plugged it into the charging base it just flickered, with only half the display actually showing. It was just another nail in the coffin that was yesterday, and I was only half-expecting FitBit to honor their warranty because I doubted I still had my receipt.

As it turns out, I had started rolling up my receipts and storing them in a little collection box in my kitchen. I opened the drawer last night and rooted around for it, not expecting to find it. Not only did I find it, but I also discovered, to my delight, that when I purchased the FitBit I also purchased the “Performance Guarantee” from Best Buy for $15, which covers the device for two years and expires on 8/28/2014. So not only do I not have to muck about with the warranty procedure and wait for shipping and processing, I can get a new unit as soon as I can get myself down to Best Buy.

Now I know what I’m doing for lunch. 🙂

Help Yourself

I have to admit to really enjoying the web service IFTTT. The name stands for If This Then That. It allows you to create recipes from a menu of popular services with public APIs and move data back and forth according to no one’s design but your own, with IFTTT’s help, of course.

A great practical example is Twitter. On Twitter there is an account, MichiganDOT, that is the public mouthpiece for Michigan’s Department of Transportation, the folks responsible for the roads and rails and such. This resource is valuable for many reasons, not the least of which is that MichiganDOT tweets about road hazards and the crashes or construction that would otherwise hamper movement within the Mitten. On its own, Twitter is something you have to grope for: you’ve got to start an app and page around to find what you are after, and it’s all very manual, and annoying. I hate annoying. So how can you beat MichiganDOT, for example, into a service that sends you alerts? IFTTT.

The recipe in IFTTT that makes this work is clever, if you know the way around the back end of Twitter. Several months ago Twitter closed their API to IFTTT, making it difficult to create new IFTTT recipes that use Twitter data to do automatic things. But Twitter left a back door open: every Twitter account has an undocumented RSS feed associated with it, and all you need to know is the trick to get at it. IFTTT can consume RSS data, Twitter produces RSS data, so it’s kismet. The URL you start with is this:

http://api.twitter.com/1/statuses/user_timeline.rss?screen_name=michigandot

This plugs into IFTTT’s Feed trigger; you then connect that to IFTTT’s SMS action, set it to your mobile phone number, and the recipe is done! Just like that. Really easy and straightforward, and now the moment anyone who staffs the MichiganDOT twitter account posts anything, the RSS feed lights up, IFTTT notices, copies it over to an SMS message, and ships it out to my phone.
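
For the curious, here is roughly what IFTTT does behind the curtain, sketched in Python with the feedparser library; it assumes the RSS endpoint above still resolves, and prints where IFTTT would send the SMS:

    import time

    import feedparser  # pip install feedparser

    # The undocumented per-account RSS feed quoted above.
    FEED = "http://api.twitter.com/1/statuses/user_timeline.rss?screen_name=michigandot"

    seen = set()
    while True:
        for entry in feedparser.parse(FEED).entries:
            if entry.link not in seen:
                seen.add(entry.link)
                print("ALERT:", entry.title)  # IFTTT ships this out as an SMS
        time.sleep(900)  # poll every 15 minutes, roughly IFTTT's cadence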

With the undocumented API backdoor from Twitter, MichiganDOT, and IFTTT, I am able to recast the MichiganDOT twitter account as a “Michigan Road Conditions Alert Line”, and I don’t need to sign up for anything, ask anyone for anything, or cajole some developer into building something for me. In many ways it is a clever way to have my cake and eat it too. I don’t have to schlep around in Twitter missing things; I get alerts, bam, as they happen.

The nice thing about IFTTT is that this is just the tip of the iceberg. You can send any channel’s data anywhere you want: Twitter to Evernote, Twitter to Pocket, Facebook to Evernote, Facebook to Pocket… there are about 20 channels you can fiddle around with, and you can shop around for other people’s recipes, adopt them, and make them work for you. If you don’t have IFTTT, you are missing out on a huge amount of DIY convenience. The best part is, nobody is the wiser. MichiganDOT has no notion, Twitter doesn’t care, so why not use what’s out there and make it work for you?

Google Drive Failure

Google Drive is a failure.

Google Drive was released yesterday, and I clicked the button on the website letting Google know I was interested in their product. I received an email late last night informing me that my Google Drive was ready. This morning, on a lark really, I went to the Google Drive website and clicked on the download link for the sync application to add to my work iMac. I downloaded the DMG file without a problem and opened it up. I copied the Google Drive app to my Applications folder, as you are supposed to on a Macintosh, and then I sat back and marveled at it. Google Drive, finally.

I’ve been a loyal Dropbox customer for years, and back in January I sprang for the $100-a-year expansion of my Dropbox up to 50GB. Everything I use connects to my Dropbox via the Dropbox API, and just for the record, I am totally in love with Dropbox. There is no reason for me to leave them as a customer. But even if you are loyal, it doesn’t mean you can’t explore. I have a professional account with Box.com through my work, arranged after drop.io was consumed by the wraiths at Facebook. I also have a personal Box.net account with 50GB, but I don’t use it because Box only allows sync on paid accounts, so it’s not worth my while. Google Drive was along these same lines, just another option to look into.

So I started Google Drive on my iMac and I was asked to authenticate, something I expected. Then nothing. I started the app again and nothing. I opened up the Console app and here is what I found:

4/25/12 7:17:44 AM Google Drive[22481] *** __NSAutoreleaseNoPool(): Object 0x2e2ba80 of class OC_PythonString autoreleased with no pool in place - just leaking

4/25/12 7:17:44 AM Google Drive[22481] *** __NSAutoreleaseNoPool(): Object 0x2e37440 of class OC_PythonString autoreleased with no pool in place - just leaking

4/25/12 7:17:44 AM Google Drive[22481] *** __NSAutoreleaseNoPool(): Object 0x2e332f0 of class NSCFString autoreleased with no pool in place - just leaking

4/25/12 7:17:44 AM Google Drive[22481] *** __NSAutoreleaseNoPool(): Object 0x2e32600 of class NSCFString autoreleased with no pool in place - just leaking

4/25/12 7:17:45 AM [0x0-0x221c21a].com.google.GoogleDrive[22481] 2012-04-25 07:17:45.119 Google Drive Icon Helper[22488:903] Inject result: 0

So, it’s broken. This isn’t the first time a new app has failed horribly on my iMac. If anyone cares, and perhaps someone from Google is reading, this is a standard 2009-2010 iMac running Mac OS X 10.6.8. The only thing different about this particular Mac is that the account has its home on an AFP-connected, Open Directory-domained Apple Xserve. A network home. This causes headaches for Adobe Acrobat Reader, so it’s probably the reason Google Drive collapses on startup.

Since I can’t run the application, and since it wasn’t designed to take into account people who have network-based home directories like mine (unlike Box.com’s sync app or Dropbox’s sync app), I can only state that Google Drive is not ready for prime time. Google Drive is not ready to compete in the marketplace, and Google has to go back to the drawing board and try again.

Abandoning Google Plus

Yesterday I opened my Google Plus page and discovered, to my surprise and initial pleasure, that Google had brought a new interface to their social network system. As I started to explore this new interface I immediately noticed that things had changed not for the better, but for the worse. Google had unilaterally included their chat system on the right side of my browser window; it’s something I rarely ever use, so that system is all wasted space. And the stories in my circles, the things I really care about, are now shuffled off to the left in a column that lost 10% of its space on the leftmost edge and 50% on the rightmost, moved over for some controls at the very top of the page and the dreaded whitespace region that now occupies my Google Plus page.

It’s this whitespace, and the meaningless chat-talker system, that I can’t stand. Facebook attempted a similar move months ago, back when I still used Facebook, by presenting me with a chat-talker screen on the side as well. Those interface changes, along with privacy concerns and workplace issues with social networking, are why I left Facebook. Now it just languishes as an identity marker; if content gets on my Facebook page it’s wholly accidental. Twitter’s web page also underwent this columnar approach, as they reconfigured the entire interface out from underneath their users. I stopped using Twitter because it was more noisy than useful, the people I wanted to engage with were just human billboards, and the interface changes were the straw that broke the camel’s back.

So what is there to do? Complaining about interface changes is really the only channel you have to express how much you dislike it when a service does this to you, but you have no real power. A complaint is one easily ignored tiny voice in the darkness and doesn’t amount to anything at all. The only real power any single user has is the power of choice. In the end, the only choice I have to make is: do I still want to use the system? It’s a matter of abandonment. I abandoned Facebook. I abandoned Twitter. Because Google changed the interface and made it less useful to me, I am facing the idea of abandoning Google Plus. I don’t need these social network systems to give my life meaning. They need me, or rather, they need aggregate me’s, lots of people, to give what they do meaning. The fewer people use a social network, the less appealing that network is to everyone else. Facebook is only compelling because everyone uses it. There is no real value inherent in Facebook itself. This is a lesson the classic business models these companies use can’t take into account: their popularity defines their success. If they make a grossly unpopular change to the interface, people will flee and their success will go tits up.

I don’t care to encourage other people to abandon these systems if they like them. Each of us has to make these kinds of decisions on a wholly personal level. I find it obnoxious that Google, and Facebook, and Twitter for that matter all force interface changes on users without giving the user any control whatsoever. It would be more elegant if there were a batch of controls we could select from and build our own interface. Put the bits and pieces where we want, opt out of things we don’t care for and make the interface work best for us, as the users. None of these sites have done that, they all behave as if they have global fiat to make changes willy-nilly. The end user who has to contend with these changes can’t do anything really except make that singular choice surrounding the issue of abandonment.

So where do I go now? It’s comic, but in many ways I am looking forward to going backwards. There is one system I’ve used, mostly as a category, and the people behind what I currently use I regard as the platonic form of that category: WordPress. Going back to blogging. What does the WordPress infrastructure have that attracts me? It’s got stable themes; the site looks very much like it always has. There are changes, but they aren’t as gross in scope as what these other systems have perpetrated. I can share links on WordPress, I can write long posts and short status updates, and WordPress has a competent comment system already in place.

So I will give Google Plus until May 1st to do something better with their interface, to recognize the value in the stream and give us users the choice of what systems we want to see on our Google Plus page. Google should give us the ability to turn off the whitespace region and the chat-talker region so that we can maximize the stream region. If they fail to correct these glaring human interface deficits, I will do to Google Plus what I did to Facebook: I will abandon it. I will keep the account running but I will no longer actively use it. Things that end up on Google Plus will be the same sort of things that wash up on Twitter, specifically links to content on my WordPress blog. Google’s loss will be WordPress’ gain. WordPress has always done right by me, and I respect them. I do not respect Twitter, nor do I respect Facebook. My respect for Google is quixotic at best. I used to believe in their “Don’t be evil” company mantra, but that belief has been shed as Google has done some very evil acts. They aren’t what they once were, and this sullying of their image makes the pending abandonment easy.

Will my abandonment hurt Google? No, of course not. I’m not so full of myself as to think that my leaving will change anything about the service, or that Google will even notice my absence. However, if I can inspire other people to give WordPress another look, to see that progress can be achieved by regressing to earlier systems when what you get in the trade is interface stability, that would be a worthy pursuit. This single raindrop may encourage others to fall. The raindrop doesn’t believe it is responsible for the flood; I can only hope that I help the flood along. These massive changes that social network sites perpetrate on their usership should be punished! We want it all: we want to use the service and we want to control it as well. We want the interface to be regular, logical, useful, and static. When we want to make a change, we want to be the ones making it. We do not want to be victims of someone’s good intentions, Google! I would say this for Facebook as well, but that’s a lost cause.

So time is ticking away. If Google does not act, then the stream on that service is terminal. If that comes to pass, I will be migrating to my WordPress blog.

I hope to see some of you there.

Flashback Trojan on Mac OSX

Apple makes some marvelous products. In this case, I’m talking about Apple Remote Desktop. With ARD I was able to scan every single one of my client Macs to check to see if any of them were infected with the Flashback Trojan Horse. Before my scan I would have sworn on whatever-you-like that none of my systems that I manage here at WMU were infected. Turns out I was right.

Macs really aren’t very susceptible to viruses; the biggest threat comes from Trojan horses like this one. To scan a Mac for infection, just open up Terminal and run these two commands:

  • defaults read /Applications/Safari.app/Contents/Info LSEnvironment
  • defaults read ~/.MacOSX/environment DYLD_INSERT_LIBRARIES

If you get an error from both of those commands, you are in the clear. It’s quite easy to do: mostly just opening Terminal, copying and pasting, getting the errors, and being satisfied. The removal instructions are straightforward to follow, so even removal of an active infection should be a snap.
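
If you’d rather script the check than eyeball it, here is a small Python sketch wrapping those same two commands; remember that an error (a non-zero exit) from defaults means you’re clean:

    import os
    import subprocess

    # The two Flashback checks from above; defaults exits non-zero if the key is absent.
    CHECKS = [
        ["defaults", "read", "/Applications/Safari.app/Contents/Info", "LSEnvironment"],
        ["defaults", "read", os.path.expanduser("~/.MacOSX/environment"),
         "DYLD_INSERT_LIBRARIES"],
    ]

    flagged = [c for c in CHECKS if subprocess.run(c, capture_output=True).returncode == 0]
    print("possible infection, follow the removal instructions" if flagged else "in the clear")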

If you try these commands and don’t get errors, don’t panic. Just let me know and I’ll find a way to help you out.

Tearing Down

While doing the usual weekend chores last Sunday I bumped the vacuum cleaner into the table where my old desktop computer setup used to be. Ever since I got my iPad and iPhone, the location and nature of much of my computing at home have radically shifted. I no longer spend long hours sitting in front of a huge machine playing online games. Now, for the most part, I just use my phone to tend to email and read.

This change in how I use technology isn’t reflected in this room upstairs in my house, which has for the most part been neglected. So there needs to be a reckoning. I need to sort through this area, pitch what has to go, and generally de-clutter that part of my house. It feels a lot like a callus that has built up over time, a kind of clutter you don’t really see any more except when you run the vacuum cleaner into it. There is a general sense of simplification that appeals to me, and this table full of wasted technology needs to be figured out.

Along with this I have five closet areas that need to be generally gunged out. There is a coat closet on the ground floor that needs to be seriously organized, the guest room closet needs to be exhumed and dealt with, and then all the upstairs closets need to be gone through. There are things I no longer need, want, or can use: clothing, knick-knacks, and various orders of past debris that all need to be evaluated, sorted, and organized.

This weekend I think will be a fantastic opportunity to address these situations, at least for as much as I can do on my own. We’ll see just how much progress I can make.

Spinning Governor

I’ve come up with ways to cope with the network connection throttle that I recently discovered was behind a lot of my network woes here at work. In my regular workaday use of the Internet I usually find myself consuming at least 150 connections, if not more, because everything I use was built on the assumption that establishing multiple connections is free and easy. There is no parsimony when it comes to using the network, and you see this exemplified most of all in the design of browsers like Firefox. When you fetch a page, most modern browsers will also try to prefetch pages you may want next so that they can appear faster. This is fine if you can make an unlimited number of connections to the network. That isn’t the case here.
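
If you’re curious how fast those connections pile up, here is a quick sketch using the psutil library to count what one machine holds open at a given moment; the 150 figure above is the kind of number this spits out:

    import psutil  # pip install psutil; may need elevated privileges on some systems

    # Count the TCP connections this machine currently has established,
    # the number a per-host connection quota would be metering.
    established = [c for c in psutil.net_connections(kind="tcp")
                   if c.status == psutil.CONN_ESTABLISHED]
    print(f"{len(established)} established TCP connections")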

I can live with the throttle. I understand why it’s in place, and knowing that it exists keeps me from questioning my sanity; back when I didn’t know about it, I thought the problem was with me or my computer. It’s neither. So there are some ways to address my problem. Ironically, the route to a better life runs through the same devices that are at the center of the entire ‘running out of IP space’ problem: iOS devices. My iPhone and iPad have apps that give me interfaces to the Internet resources I need, and they free up my computer and help me avoid the connection quota. For example, instead of opening Toodledo in Safari I can open the Toodledo app on my iPhone. Different device, different connection quota. My iPhone doesn’t make nearly so many connections, and if I needed to I could very easily drop wifi and use the 3G data circuit. I can do a lot of other things too, like manipulate Asana or run my email through my iPad, that sort of thing.

So, in a way, the connection throttle has shifted the load from one device to three. At first this was kind of a pain in the ass, but over time I’ve come to see that this could become more efficient. It frees my computer up for the heavier things, like Google Reader and such. We’ll have to see how it goes.