Journal Migration

Just as I had migrated from Day One to Evernote, I got really tired of Evernote and its bloated sluggishness. So I moved my journal again, this time to Bear, an app available on the Mac and linked to the Bear app on iOS. The application is really quite useful and a joy to use on my MacBook. I tried to sync my journal to my iOS devices, but I had less luck with that. I am, however, getting pretty top-notch support from the people who write the app, so in that regard it's working out really well. I can use the platform with hope of an app fix for what ails my journal, as long as that hope lasts.

One of the most compelling parts of the Bear app is its tagging platform. It's almost the perfect thing, but as I once wrote to Evernote in an enhancement request, I would like tags to optionally act as indexes as well. What I mean by this is that when I make a tag, there would be an optional checkbox or slider for making it an index entry. Then, when I create a new index entry, the software scans the content of my journal for that tag and, wherever it finds it, adds the tag to the entry. So far, I haven't found any apps that do that and sync across devices. But so far, Bear is nice to work with. If you are interested in seeing what it is like, you can get it for free from the Mac App Store.
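Until some app grows that feature, the idea is easy enough to fake at the command line once entries live in individual files. A rough sketch of what I mean, where the one-entry-per-file layout and the inline "#travel" tag are purely my own illustrative assumptions:

```shell
# Sketch of the "tag as index" wish: find every entry file containing a
# given tag and write the matching filenames out as an index.
# Assumes one journal entry per .txt file with inline "#tag" text.
tag='#travel'
grep -lF -- "$tag" *.txt > "index-${tag#\#}.list"
```

The resulting index-travel.list is just a list of entry filenames, which is more or less what I wish the checkbox would produce automatically.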

Walking Down Memory Lane

Some notable events from other July 1sts.

2003 – Installed a network-aware fax machine, and then attached it to GroupWise. My god, GroupWise. This is such a walk down memory lane! And this of course was the first occurrence of a repeated meme: shared online mailboxes at work upset people because they aren't "private", in the same way that a regular fax machine is "private" only if you hover over it and muscle out anyone who might try to use it. It of course raises the question: what are you transmitting at work that is so "private" that you shouldn't instead be doing it at, say, a FedEx shop or Office Depot?

2003 – Toppenish, Washington was in the news because a keyword blocker at a library got upset when it found something it didn't approve of in the text of the domain name itself. Nowadays we don't search domains for text fragments; we actually categorize them.

2004 – Again with the fax machine. In this case, not having long distance on the line required the use of an AT&T calling card, with a 60-digit calling sequence just to send a fax far away. And the merry mix-ups when people who work for an institution of higher learning demonstrate no higher learning by being unable to comprehend digits. Ah, those were the days.

2004 – Fahrenheit 9/11 – Hah, those were the days, weren't they? When it only felt like scandals were rare and maybe all the crazy conspiracy theories were just theories. Oh, the memories.

2006 – Sharing the photos of the bathroom rebuild. It was a long while ago that we tore the guts out of that bathroom and updated it.

2007 – At O’Hare, running through security, on my way to visit family in Syracuse.

2008 – Another trip to Syracuse. This time through Detroit.

2009 – The problem with the cloud is poor security and access points everywhere. What happens when people plant incriminating evidence via a route, like junk mail, that you pay very little attention to – and then make an anonymous tip about the evidence? It was an interesting consideration and helps reinforce how important it is to keep everything digital tidy.

2013 – I wrote a lot of things about the security threat that our very own NSA represents. And little did I know that in 2017, the tools they collected and wrote would leak out and turn into the WannaCry ransomware attack. Thanks, NSA!

2015 – Facebook Notifications got an enhancement: they can accept a GPG public key, so all the Facebook notifications sent over email are encrypted. This was a really good proof-of-concept option from one of the world's biggest Internet sites; alas, it won't ever take off, because GPG is an all-or-nothing technology, and since you aren't going to have all, all you get is nothing. It was this day that I also gave a lot more thought to The Golden Rule and started to reshape my life around it as a moral compass.

 

DayOne 2.0 to Evernote Migration

Years ago I started to write a personal journal. Mostly the application I used was Microsoft Word, sometimes other text editors, and I've always been on the search for a better way to conduct my journaling habit. When I started using Apple's Mac, I discovered Bloom Built's DayOne journaling software. DayOne came highly recommended on many pro-Mac websites, so I bought in. Everything was going fine with journaling for a while, and then I noticed that the authors of DayOne were going to release DayOne 2.0. I eagerly jumped onboard with this update and forged ahead. The feature set was welcome: multiple journals, a more refined syncing experience with an online service run by the developer (which turned out to be AWS), and I was rather happy. The new syncing system also came with extensions for one of my favorite online time-savers, IFTTT. What more could I ask for?

Then the updates started. I noticed in my Mac App Store application that there was an update listed for DayOne 2.0; I clicked on it, and the system acted as if no update were present. It was a deviation from the expected behavior that I chalked up to some glitch or bug. So I dove into the problem and searched Google for hints. There were so many options in the Google index that I figured one of them should solve my problem. In the end, I had completely roto-rootered the entire Mac App Store. I tried one last time, and still DayOne 2.0 refused to update. I gave up, figuring it was something that a reinstallation of the operating system might solve, because behavior this unexpected doesn't happen with Apple products in my common experience. Then, resistant to being beaten by a bug, I forced the issue with the App Store and tried to download the DayOne 2.0 update directly. I discovered to my chagrin that the update required the next edition of the Mac OS, El Capitan. I have a vested interest in staying with Yosemite; I'm happy with my MacBook using Yosemite, so why should I upgrade the OS to satisfy an application?

The next injury came shortly after that. While using DayOne 2.0, I was rather miserable because the software acted downright sluggish. I would type, and the application would just pinwheel or pause, and then in a blur all my words would spill into the display at the same rate at which I type. I wasn't getting the instant response to keyboard actions that I was expecting. I verified that other applications behaved properly; TextWrangler, for example, behaves perfectly to my expectations, so this wasn't a system problem, it was a DayOne 2.0 problem. Prior to this, I had sprung for a copy of Little Snitch on my Mac to help me better control my network interfaces. Little Snitch has options to block an application from accessing the network. So on a lark, I figured I would test the sluggish application by blocking DayOne 2.0's network access with Little Snitch. It was like flipping a light switch! The sync component was broken, showing a red exclamation mark, but man, text entry was back to normal, and tag entry was super quick as well. I didn't have to wait for pinwheel after pinwheel to get where I was going. I wanted to journal, to get my text entered into the system for safekeeping and remembering. So for a while, I would use Little Snitch to damage DayOne 2.0 so I could use the application the way I wanted to, the way I expected to. I then wrote to Bloom Built and asked them if they would update the application for the users who didn't want to march forward with El Capitan or Sierra, and they declined. It was a longshot, but I figured it was in their best interest to address their application to the largest group of users, which would presumably include people still on Yosemite. It wasn't to be.

So then after thinking about it for a while, and growing weary of the rather extended procedure to get Little Snitch to help me block DayOne 2.0’s broken sync routines, I made the fateful decision to switch my journaling to Evernote. Why Evernote? Because it was on all my devices, just like DayOne 2.0 (at least Mac devices), and Evernote already had integrations with IFTTT, so that was set. Evernote was something I knew, and the Evernote syncing routines were significantly better than DayOne’s syncing routines. Saying that has to be tempered by the fact that sometimes Evernote’s syncing routines also break, but the one-off hard-to-diagnose sync error is better than a broken sync routine that throws pinwheels when you type text or try to enter tags, as it is with DayOne 2.0. Evernote also has one extra feature, which wasn’t a part of the decision but now that I’ve made the switch, I’m glad for, and that is you can highlight text in Evernote and encrypt it using AES. This is something that DayOne 2.0 had as a promise, but they were by all appearances dragging their heels when it came to journal security.

I then started writing all my new journal entries in Evernote. That was very straightforward. However, I left about 11,000 entries behind in DayOne 2.0, so I started looking at ways to get that data out. There are a few export options: text, PDF, HTML, or JSON. So I started extracting entries out of my DayOne 2.0 journal and trying to import them into Evernote. What I wanted was for individual entries to move over to Evernote and be individual entries there as well. But everything that comes out of the DayOne 2.0 exporter comes out as chunks: one big HTML file, one big PDF file, one big JSON file, and one big text file. There is no easy way to get individual entries out one at a time unless you want to manually slog through every single entry. At 11,000 entries, that wasn't going to happen. I have no patience for that. So then I started to look at ways to hack my DayOne 2.0 exports, since the people who wrote DayOne 2.0 didn't have anything helpful, and all the other tools I found online were written solely for DayOne 1.0, which I couldn't use. I didn't have a Journal.dayone file; I had an AWS-hosted JSON chunk. So the hackathon commenced. HTML was a giant headache, since there isn't any way to easily split HTML up into chunks, syntactically speaking, at least not with the data that DayOne 2.0 exports. The PDF was a mess: one immense PDF with the text in 8-point. It'd be fine if I were 20 years old and didn't mind slogging through a monolithic PDF file for a date. I even tried to hack around JSON in my limited way. I got the JSON out to CSV but then realized that my instinct to make the CSV a data source for a mail merge, and mail-merge my journal out to individual entries, was going to be a bust. Macs don't do mail merge at all. I made peace with that a long while ago, not that I ever had any work that needed mail merge. So there was only one format left, the most basic format: text.
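In hindsight, the JSON route might have yielded to jq. Here is a rough sketch of the kind of thing I mean, with the caveat that the entries, creationDate, and text field names are my assumptions about the export's layout, not gospel:

```shell
# Hypothetical: split a DayOne 2.0 JSON export into one file per entry
# with jq. The "entries", "creationDate", and "text" field names are
# assumptions based on what my export looked like.
jq -c '.entries[]' Journal.json | while read -r entry; do
  date=$(printf '%s' "$entry" | jq -r '.creationDate')
  printf '%s\n' "$entry" | jq -r '.text' > "$date.txt"
done
```

Each pass through the loop writes one entry out to a file named for its creation date.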

DayOne 2.0 spits out a journal into one monolithic text export file, so I had to figure out how to hack this text file up into pieces. I spent a long while at the bash terminal, screwing around with csplit and discovering the subtle differences between Apple's implementation of csplit and GNU's. After a night of blind hacking, I gave up on csplit. Of course, by this time I had also given up on DayOne 2.0; it wasn't the application I wanted anymore. My feelings had soured against the manufacturer for only going so far with their export code and leaving the rest for me to hack out on my own. I was irritated and felt cheated that they didn't just go one step further and include an "export individual entries" checkbox somewhere. But I got over my funk; I had burned that bridge, and there was no reason to keep on complaining about it. I was moving to Evernote, and Bloom Built was pretty much post-fire, all sad ashes soaked with water. After nights of searching and hacking on this monolithic text file, I eventually found the solution. The first step comes with Perl:

#!/usr/bin/perl
# Split a monolithic DayOne 2.0 text export into one file per entry.
# Each entry in the export begins with "Date:" followed by a tab.

undef $/;    # slurp mode: read the whole export in one gulp
$_ = <>;
$n = 0;

for $match (split(/Date:\t/)) {
    next if $match eq '';    # skip the empty chunk before the first entry
    open(O, sprintf('>temp%03d', ++$n)) or die "can't open: $!";
    print O $match;
    close(O);
}

This little script is something I found through Google. I'm far too lazy to hack this out on my own, if I'm brutally honest. The keyword in DayOne 2.0 entries in this monolithic text file is "Date:" followed by a tab character. Every entry starts with this key. So: export my DayOne 2.0 journal to Journal.txt, then run the script against it: ./split.pl Journal.txt. Perl tears the file into perfect chunks ready for action. But the files are named temp001, temp002, temp003, and so on and so forth. Two lines then add the last crowning bits to each file. The first tacks on a txt extension, and the second grabs the first line of each file and makes that line the new filename. In DayOne 2.0, the first line is the date line. So now my entries have their dates as their filenames. This is just a compromise; I would have much preferred to have the dates preserved in the file metadata, but hey, you get what you get:

for f in temp*; do mv "$f" "$f.txt"; done
for f in temp*.txt; do mv "$f" "$(head -n 1 "$f").txt"; done
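
For the record, GNU's csplit can reportedly do this sort of split in one line; here is a sketch, assuming GNU coreutils (Apple's version is the one that fought me):

```shell
# Split Journal.txt at every line beginning with "Date:".
# -z drops the empty piece before the first entry, -f sets the filename
# prefix, -b sets the numbering format, and '{*}' repeats the pattern
# for as many entries as the file contains.
csplit -z -f entry -b '%03d.txt' Journal.txt '/^Date:/' '{*}'
```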

So for my test journal, I exported from DayOne 2.0 into text, chopped it all up using Perl, and used the bash shell to hack the names to where I was happy. Then I lassoed the entire batch of files and dragged them into Evernote. Once I had done this for all my journals, I closed DayOne 2.0 and left it alone. There is no point in trashing it; let it dwell on in a ghostly non-life for all I care. Evernote at least has proper behavior when it comes to text entry and tag entry, and the syncing routines are better. Plus Evernote will never abandon me the way Bloom Built did. They'll never stop updating Evernote for Yosemite, or if they do, it'll be so far down the pike that I'll have a new laptop by then and all of this will be just so much foolish wrangling anyway.

In the end, I won. I won against an annoying choice made by a company I used to love; I won against a file format that seems so dumb; and I was able to shoehorn years of journaling into a new home without having to abandon my past or make it so annoyingly inaccessible that it would be the same as abandoning it.

If you find an interest in switching from DayOne 2.0 to Evernote, this is one way to do it. There may be better ways, clever ways to convert the JSON to the Evernote import file format, perhaps. But I didn't care enough to slog through JSON; this is my way, in all its dumb glory. Of course, my tags from DayOne 2.0 are shot, and tagging in Evernote is a manual affair, so that was another little compromise. Perhaps if I have dull weekends or evenings, I can hack out the tags over time. Having the entries and losing the tags is an acceptable loss. At least I no longer need to force Little Snitch to break DayOne 2.0 just so I can use it. Heh, that's still something that makes me shake my head in disbelief. That you have to do it this way at all is such a mess.

New Year Resolutions

This new year I resolved to be done with Twitter, Facebook, and Reddit. I had abandoned Twitter a long time ago; Reddit was easy, as I was never really invested in that platform anyway; and most recently I left Facebook behind.

This needs a little characterization. I haven't deleted my Facebook account, but what I have done is cease to engage on the platform. I still check in once a week, just to mop up my timeline notifications from people putting my name on their posts and otherwise to establish a heartbeat there, so that the people who are on the service and follow me notice that I still live. I suppose that eventually even bothering with the heartbeat updates will grow tiresome and I'll give up on that as well.

I have instead moved my entire social networking existence to a new service called Imzy. It's at imzy.com, and I encourage everyone to join me there. There are some pretty good AUP rules in place, and communities can also have extended rules building off the core AUP of the site itself. Imzy is a perfect place to have real discussions with people online. There is a culture on Imzy that I haven't found anywhere else. It's this lack of trolling that I witnessed there, and it's what led me to dump Facebook.

I don't know what this means for this blog. Imzy is a great platform all on its own, and when it comes to blogging, my community there has a lot of features that my blog can't match. A sense of community, I think, is what is missing from a lot of services, including my blog. This blog is mostly just a billboard for me to yell at the darkness. There aren't any real conversations going on here, unlike on Imzy.

I figure if I don’t post more blog entries I may just archive all of this stuff and shutter the service completely. Then again, I may just be lazy and let this blog long-tail it to eternity. Only time will tell.

Assert The Win

Sometimes the best thing is to assert the win and walk away from a toxic problem. So far today I've done that quite a bit. What have I abandoned?

I’ve walked away from Facebook. It’s been four days since I even logged into Facebook and since then I haven’t missed it. I’ve been catching up on my news; the Spiceworks Community board consumes a lot of time. Then after that, I turned my attention to my Pocket list. There just isn’t enough time anymore to deal with Facebook. When I logged into it, I had eighteen notifications, and I frowned and realized that I didn’t care that much. I’m writing a lot of my thoughts into my journal after coming to the realization that sharing with others isn’t going to be a positive experience. Now nearly everything on Facebook is an unpleasant experience. So, abandoning toxic things seems to be a good thing for me.

Another toxic system is Office365. Microsoft and I go back for a long while, right along with my almost palpable hate for that company and their products. Going into just how Office365 lets me down is very dull. Nearly every interaction has me wishing I could just close my laptop, put it in my backpack and run away from my life. Everything that has some Microsoft technology associated with it has me frowning in deep disappointment. Alas, there is no way to escape the Great Beast of Redmond, so we gnash our teeth and endure the horrors.

The final horror is WordPress itself. I use a stock theme, Twenty Twelve. It's not a custom theme. It's not slick or responsive. It's just a dumb theme. So while reading my blog, I realized just how much I wanted to change the line spacing of my post entries. This is where my expectations fork: there is an Apple fork and an "Everything Else" fork. The Apple fork has proven time and time again that the answer is simple and shallow and easy to get to; you understand what the change will do, and you make it work. Then there is everything else. Here we have WordPress itself. I wanted to change the line spacing on my theme. So I went to the Dashboard and spent ten minutes blindly stabbing at possible places where this option might be hiding, to no effect. Then I did a Google search, which is the first and last place where most possible solutions are born and die. A good Google search almost always results in the answer you are after. So "WordPress vertical line spacing" led to a place that eventually had the solution in it, but the theme didn't match what I was expecting. This is the core of frustration. So I modified the search to include the theme's name itself, and that helped. I found the setting, and it was in a CSS stylesheet file. I left the WWW when it was still HTML only; CSS irritates me. But anyway, hack CSS, that's the answer. It's a dumb answer, but that's it. So I found about 130 places where line-height is an option. I laughed bitterly at the number. Which section to edit? Are you sure? So I gave it a shot. I set the line-height to 2.0 and then looked at my site. I can't tell if it improved or not. But the most adaptive solution is to assert it did what I wanted. Mark the win as a notch and move on. Do I care? Well, I wanted to do something. I did something. Did it work? Probably not.
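For the curious, the edit amounted to something like this in the theme's stylesheet. The selector is my guess at the relevant rule, since Twenty Twelve wraps post text in an entry-content container; check your own theme's style.css before trusting it:

```css
/* Approximate sketch: double the line spacing for post body text.
   Selector assumed; the real rule may live elsewhere in the theme. */
.entry-content p {
    line-height: 2.0;
}
```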

But then we get back to that first fork. That’s why I love Apple so much. Nearly everything they touch MAKES SENSE. I don’t have to struggle with some labyrinthine mystery. Maybe my edits will work, maybe they will break whatever it is, maybe it won’t matter. Maybe any setting I change will be overridden somewhere else, by something that was never documented. That’s the core design principle of both WordPress and Microsoft. I suppose we should just be happy that the most basic functions work. Much like the Internet itself, the fact that any of this works is a daily miracle.

So instead of writing a huge rant, one that nobody wants to read and nobody cares about, I will assert that I won, psychologically move forward, and forget the conditions that led me to those particular experiences. The blog doesn't work like you want? Don't go there. Facebook a cesspool of ugly humanity? Skip it. Microsoft? Ah, if only it would burn to the ground. But we can't have what we wish for, even if we'd do anything for our dreams to come true.

So! Hooray! A Win! Facebook, WordPress, Office365! Just stop worrying about the bomb. It’s “Someone Else’s Problem®”

Google Schmoogle

Today is a day for Google to let me down. Generally, a lot of technology companies end up in the same dustbin. They always promise some glittering awesomeness, but when you start to engage with them, you discover that the awesome is rather half-baked. In this particular case, the first two Google technologies were their Play Music property and Android.

Google Music, or Google Play Music, whatever it's called, has a lot of the music that I uploaded from iTunes back when I still had music files to play on my iPod. My musical use has migrated to streaming technology, specifically Spotify, with which I am very pleased. I oftentimes miss my old iPod with my music loaded on it. There was something about the shuffle feature on my old iPod Nano that fascinated me. The old shuffler felt almost psychic, or at least sensitive to my environment and conditions. I think it is because the device had its RNG on-device and it was a wearable device. There is something still there, I think, and I think back on it fondly. A lot of my music is on Google Music, and today I thought I might uncork some of it. I opened my Safari browser and discovered that Google Music doesn't work without Adobe Flash. As a general rule of thumb, I don't use Adobe products at all if I can help it, and that is especially true of Adobe Flash. There was a point in the past where you could have selected HTML5 playback on the Google Music site, but Google has since eliminated that option as far as I can tell. So, strike one for Google.

The next strike came when I tried to use my Samsung Galaxy Nook device. This device is loaded with Google's Android operating system, and I've railed against this before. This particular case is related to the dead horse I keep beating in regards to Google Android. I had my Nook open, and I was trying to use it. The interface is sluggish as hell, but I have grown to accept that. There is an app I have on my Nook called "Clean Master", and it's designed to be a system maintainer for Android. My experience, paired up with what the "Clean Master" application claims, is that Android is a hot wet mess. Every time I use the app, it finds 350MB or more of "junk files" and scans for "obsolete APKs". This scan takes an exceptionally long time. So I've fallen down a rabbit hole with the device, trying to get it "cleaned up" because it's "dirty". The application dutifully chugs away, apparently just circling around the same batch of directories for about ten minutes and accomplishing nothing. I tap the big button at the bottom. "STOP". Nothing happens. I then tap it a few more times. "STOP". "STOP". "STOP". In the end it was a comedy, and I started to mumble "STAHP" at the device. At the top of the application is another control that says "Advanced Settings", and I tapped it thinking maybe I could turn the scan for "obsolete APKs" off. Nope. Tap, nothing, tap, nothing. Tap tap tap tap tap tap. The device stops working altogether and then, boop, a new screen, and it's back to working! But the options there are useless. So then I try to use the "Home" button, and the Nook just dwells there, thinking. about. it. Then the Home switcher screen appears, and I make the throwaway gesture to get rid of the "Clean Master" app. There is "nothing" running on the device, but it's still mostly just sluggish as hell.

So that is what informs my opinions about these companies: Google, Samsung, and Apple. I include Apple because I have a lot of Apple devices, and they don't behave like this. Even with two giant corporations working together, Google and Samsung can't even touch what Apple does. My iPhone 6 behaves for me, mostly, and in comparison it is far better than what Samsung and Google bring to the table. My chief issue is the disconnect in the hardware stats: the Samsung is supposed to have more resources than the Apple products, so does it come down to the OS? It may simply be a fight between iOS and Android in the end. To really focus on my issue, it is all about user interrupts. On my iPhone, the events that the user initiates take top priority. The interface is "snappy" and "gets my wishes" and "performs". Whereas in Android, user input seems to be treated like a queued wishlist that the device acts on if it wants to, or not. I know it's not designed to behave this way, or at least it shouldn't be. But the behavior is what informs my opinions. I've got an Apple device that is snappy and responsive to me versus a Samsung/Android Nook that seems to want to do its own thing. There is another company represented here, and that's B&N. Mostly at this point I think of B&N as a bystander. They aren't really involved anymore with Samsung or Android; they're just marketing books through a channel, and they happened to choose this one. For the core function I use the Samsung Galaxy tablet for, which is as an eBook reader, it is satisfactory. For a general-use tablet or a mobile device capable of more than just eBooks, though? No. And I can't understand why people who use Android accept this behavior so blindly. Perhaps that's what being a fan is all about. If you are fond of the underdog, the scrappy alley fighter, then I suppose Android has some romance to it.
You want the sad, somewhat over-concussed street-fighter who sometimes pisses himself and forgets his name to come out on top in the end and win the day.

So with these two starting experiences today, the answer is to lower your expectations. I expected too much of Google and of Samsung. The device is just a simple eBook reader; it really can't be anything else. I will never willfully purchase another Android device, so there isn't any reason to declare that Android is dead to me; it was dead on arrival, after all. The only thing that I can say is that other people seem to enjoy it, and in the end that's all that matters. After seeing what this Samsung Galaxy can do, I don't understand the why behind Android's success, but they are successful, and in that, well, that's good. It's just not for me.

As for the music, I again lower my expectations. Instead of searching for some way to access my Google Music without Adobe Flash, I’m instead going to try an application that can help me migrate my music collection off to a Spotify playlist, maybe. In that, I have very little faith, and I’ll probably just give up and stop thinking about it altogether. I find myself not really fighting about technology anymore. I find that I’m more apt just to turn it off, put it in a drawer and forget about it for a few decades. If I were a technology company, I would really love to find out what kind of technologies people have put in their drawers and forgotten about, and find out why. That would create a great laundry list of things “not to do” when devising new technologies.

Peer to Peer File Transfer, Reep.io

I recently needed to move about ten gigabytes of data to a friend, and we used a new website service called reep.io. It's quite a neat solution. It relies on a technology called WebRTC that exists in many modern browsers, like Chrome, Firefox, and Opera.

The usual way to move such a large set of data from one place to another would probably be mailing a USB memory stick or waiting to get together and just sneaker-netting the files across. The issue with a lot of online services that let people transfer files like this is that many of them are limited. Most of the online offerings cap out at around two gigabytes and then ask you to register, either for a paid or a free account, to transfer more data. Services like Dropbox exist, but you need the storage space to create that public link to hand to your friend so they can download the data, plus it occupies the limited space in your Dropbox. With reep.io, there is no middleman. There are no limits. It's browser to browser, and secured by TLS. Is that a good thing? It's better than nothing. The reason I don't like any of the other services, even the free-to-use-please-register sites, is that there is always this middleman irritation in the way; it's inconvenient. Always having to be careful not to blow the limit on the transfer, or, if it's a large transfer like ten gigabytes, chopping up the data into whatever bite-sized chunks the service arbitrarily demands, is very annoying.

Using the site is dead simple. Visit reep.io, and then either drag in the file you want to share or click on the file-add icon to bring up a file-open dialog and pick the file. Once set, the site generates a link that you can send to anyone you wish to engage in a peer-to-peer file exchange. As long as you leave your browser running, the exchange will keep working with that particular link. You don't need any extra applications, and it works across platforms, so a Windows peer can send a file to a Mac client, for example. That there is no size limit is a huge value right there.

If you have a folder you want to share, you can ZIP it up and share that file. It's easy to use, and because there are no middlemen, there aren't any accounts to create, and thanks to TLS, nobody is peeping over your shoulder.

Shifting Platforms

I go through cycles of having an interest, and then not having an interest, in social media. Twitter and Facebook are the core services that I'm thinking about here. Amongst these services, I've given up on Twitter. I no longer engage with anyone on Twitter, and the leading edge of loud, noisy chatter has carried on without me. If I do run the Twitter application, it's mostly to witness some event as it unfolds, like a news source, or to jump on some shame bandwagon when a public figure makes a terrible mess of their life by saying or doing something stupid.

I am about to give up on Facebook as well. There are many reasons for this renewed effort to leave the system. I am tired of the see-saw polarity between stories. The negative political stories mixed in with the positive reaffirming stories build up a kind of internal mental noise that clouds my day and keeps me from being focused. Another reason to leave is that the interface has become somewhat moribund on its own. You can sometimes comment, sometimes not. The only option for expressing your feelings is "Like", and the entire service has become self-balkanized. I have friends and family on Facebook, but out of all of them, I only follow a few and have muted the rest. I don't really miss the engagement, but always having to think about tailoring my thoughts to the audience has started to give me fatigue.

I think then that it may be time for me to go back to writing blog posts on my WordPress blog. The blog encourages longer format writing, and I expect that engagement will drop as I won’t be using Facebook. In a lot of ways, it is a kind of social addiction and the only way to break it is to wean off of it. Perhaps cold turkey is not right, but rather cool turkey.

I don’t expect anyone to follow me off of Facebook. I will share my blog posts to Facebook so people can still see what I write, but the engagement will drop off. Feel free to comment on my blog if you wish. Otherwise, that will be that.

On a more technical note, I changed how the stories are shared across systems. The original way was to publish a WordPress entry, which would share to Tumblr, which would then share to Twitter and Facebook. I have torn that down and set it so that WordPress itself shares to Facebook, Google Plus, Tumblr, and Twitter. It’s a more direct path that doesn’t require people to slog through my Tumblr.

Thanksgiving 2015

’Tis the season for us to unpack all the holiday crazy that comes with the post-Halloween holiday adventure: Thanksgiving and Christmas. Cooking, planning, setting up, and a lot of decking of the halls!

So we start with Thanksgiving. Weeks ago we took advantage of the 50% discount deal at our local supermarket and made room for the frozen turkey in our basement fridge. Then we slowly accumulated all the other ingredients for our “feeding an army for two people” style of Thanksgiving.

On Monday, November 23rd, I caught a little video from the television and network cooking personality Mr. Alton Brown. He recommended that people defrost and brine a turkey at the same time. I had a frozen turkey, and I had never brined a turkey before, so I didn’t know how it would turn out. Following Mr. Brown’s advice, I hauled out the twenty-pound bird and found that my biggest stock pot fit it like a glove. The directions couldn’t have been more direct and simple: strip the turkey of its webbing and plastic wrap, put a cup of kosher salt in the vessel along with 2L of hot tap water, and stir until the salt is dissolved. Then add 4L more cold water and put the turkey in. I positioned it so the main cavity pointed up at me; that way, as I added enough water to fill all the way around the turkey, none of it ran into the cavity, and then I poured into the cavity until the entire bird was submerged. I wrapped the top in plastic wrap and put it in the basement, behind locked doors. No refrigeration required!

As the turkey defrosted itself, it also brined itself. When I temped out the bird two days later it was at about 45 degrees, so I stowed it in the fridge until we were ready to cook it. When I was set, I poured off the water and rinsed the bird with fresh cold tap water, all the cavities and everything. Then I put it in the roasting pan.
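Out of curiosity, that salt-to-water ratio works out to a fairly standard poultry brine. A quick back-of-the-envelope check, assuming roughly 240 g of salt per cup (Morton kosher; Diamond Crystal runs closer to 140 g) and ignoring the extra topping-off water, which dilutes it further:

```python
# Rough salinity check for the defrost-brine: 1 cup kosher salt in 2L hot + 4L cold water.
# Assumption: ~240 g of salt per cup (Morton kosher; Diamond Crystal is closer to 140 g).
salt_g = 240.0
water_g = (2.0 + 4.0) * 1000.0   # 6 liters of water weighs about 6000 g

brine_pct = salt_g / (water_g + salt_g) * 100.0
print(f"{brine_pct:.1f}% brine")  # roughly 3.8%, near the usual 4-6% range for poultry
```

With the extra water poured in to submerge the bird, the real concentration ends up a bit lower still, which may be part of why the meat came out seasoned rather than salty.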

The oven was set at 350 degrees; however, it was running hot for about twenty minutes, so the first stretch was at about 400 degrees. I knew something wasn’t right because the turkey was making a lot of snap, crackle, and pop noises. When I checked, I noticed the disparity and corrected the dial, which brought the oven back into calibration.

There were two competing schools of thought during the cooking process. The first was that I had accidentally turned our turkey into Lot’s turkey, a solid pillar of salt. The other school held that “it defrosted and it didn’t amount to crap,” and that the salt was pretty much just a silly affectation. I held out hope, mostly because of the sage words of Mr. Brown, whom I trust when it comes to food preparation and cooking.

We were a little taken aback when the temperature probe indicated that every part of the turkey had reached about 170 degrees; it was well and truly done. I asked, “How much juice is in the pan?” and the answer was “Not very much, if any. Only what it was basted with.” We had made enough of our own basting juices with turkey broth concentrate and the sautéed neck. I let the turkey settle for about ten minutes and then carved into it.

The meat was so moist and juicy that it fell apart as I carved into it. The entire dinner was spent marvelling at just how amazing it all was and how we’ll never do a turkey any other way than this. So simple: a saltwater bath for three days changes so much about a turkey! And just like Mr. Brown promised, the brine really shines for leftovers. Leftover turkey is usually tough and dry as cardboard, but the brined turkey is nearly as amazing every time we take a little out of the fridge for dinner.

I can’t understand why everyone doesn’t brine their turkey. We’ll brine ours from now on; moist meat, fresh and leftover, is just the tip of how amazing this is. The turkey was probably fully thawed in a little over a day! The remaining days just added to the brine’s power to make the bird juicy and amazingly flavorful.

Just for the record, the turkey wasn’t related to Lot at all; it wasn’t salty. It was amazing.

Geek Excursions: BitMessage

Along with my curiosity surrounding Bitcoin, there is a similar technology that has been released for public use called BitMessage. It’s a really neat way to communicate securely, in a system that involves absolutely no trust whatsoever. It’s a completely decentralized email infrastructure and has captured a lot of my spare attention. BitMessage works a lot like Bitcoin does: you can create email addresses on the fly, and each one is a long sequence of seemingly random characters that your client can use because it holds both a public key and a private key. In a lot of ways BitMessage deals with the biggest problem surrounding PGP/GPG, which is key management. Nobody really wants to manage keys or use the system because it’s extra work. Plus, even with PGP/GPG, your identity is written on your keys for everyone to see.

Getting started with BitMessage is a snap. First you need to download the BitMessage client, which you can get at bitmessage.org. There are Windows and Mac clients available; start one and you are instantly attached to the BitMessage network, ready to create new “BitMessage Addresses” and throw them away just as easily. So, for example, you could reach me by sending a BitMessage to this address: BM-2cWAk99gBxdAQAKYQGC5Gbskon21GdT29X. When you send a message using BitMessage, it goes to this address from an address that your client makes, so the conversation occurs securely, and since every node has a copy of the data, it’s impossible to tell who is getting what information.

I think an even more secure method would be to cross BitMessage with a PGP/GPG key. The only wrinkle is that classically a PGP/GPG key carries your email address in its user ID, so that you can be identified by a human-readable address when someone looks up your public key, to verify a signature for example. But the PGP/GPG system doesn’t actually require a real email address: you can create a public and private keypair, make the email address up from whole cloth, and instead just let people know the key ID that you want them to use. So if Alice wanted to secretly communicate with me, we could give each other our public keys to start and then use BitMessage as the messaging mule. I don’t see how any eavesdropper could make sense of that data flow. The PGP/GPG encryption keeps the contents of the message secure, and BitMessage itself seriously obfuscates, if not outright eliminates, being able to tell where the messages are ultimately going to or coming from.
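Just to illustrate the layering idea, here is a toy sketch that uses one-time-pad XOR as a stand-in for both the PGP layer and BitMessage’s own encryption. Real systems use public-key cryptography, and none of this resembles the actual OpenPGP or BitMessage wire formats; the point is only that an eavesdropper on the outer envelope never sees the inner one:

```python
import secrets

def otp_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR one-time pad: a toy stand-in for real public-key encryption
    return bytes(k ^ p for k, p in zip(key, plaintext))

otp_decrypt = otp_encrypt  # XOR is its own inverse

message = b"meet me at noon"

# Inner layer: the "PGP" encryption between Alice and me
pgp_key = secrets.token_bytes(len(message))
inner = otp_encrypt(pgp_key, message)

# Outer layer: "BitMessage" encryption to my BM- address
bm_key = secrets.token_bytes(len(inner))
wire = otp_encrypt(bm_key, inner)

# The network only ever carries `wire`; I peel the layers off in reverse order
recovered = otp_decrypt(pgp_key, otp_decrypt(bm_key, wire))
print(recovered)  # b'meet me at noon'
```

Stripping the outer layer only yields another ciphertext, so even a compromised BitMessage node learns nothing about the message itself.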

I have to admit that BitMessage is very user-friendly and very handy to have. My only issue is that I don’t know anyone who uses it, but perhaps this blog post will change that. If you are interested in this bleeding-edge crypto/privacy software, I encourage you to chat me up on BitMessage, for serious matters or for fun.