Crocodile Apologies

The media is starting to process the Cambridge Analytica misuse of Facebook data, and the story is only now getting its legs underneath it. I see it as a reflection of the panic we all felt back in November 2016, dredging all that psychic turbulence back up again.

I want to focus more on Facebook itself. There have been several instances where Facebook has publicly declared its innocence right up until proof is found, usually by journalists or investigators, and then, once the truth comes out, Facebook stops, pauses, and issues an apology for its transgressions or mistakes. This reactivity is what lies at the core of my misgivings about the Facebook platform, and about Facebook as a company.

In my opinion, Facebook is only chastened and contrite when it is caught red-handed doing something improper. I cannot trust a platform or a company that behaves this way. I will honestly admit that I never really expected Facebook to even want to be upright and wholesome; I wanted them to be, but all of this is similar to the feeling I had when Google walked away from its "Don't Be Evil" motto. Facebook cannot be trusted.

It is no shock or surprise that Facebook has no tapeworm function available, nothing to selectively chew through your history; only two options exist: leave everything alone, or blow it all to kingdom come. I know there is a third path, the manual deletion of everything in the Activity Stream, but after ten years of quite regular use that is utterly impractical. Besides, I expect Facebook to be both capable of and invested in retaining my data even if I think I've deleted it. Just because it no longer appears in the interface doesn't mean that it is gone. I thoroughly doubt that even deleted accounts get deleted; I would bet money that they simply get hidden from view. It would not be in Facebook's self-interest to lose any data they can get their hands on. I would also not put it past Facebook to log every keystroke that goes into the text boxes on their site, so even if you don't post anything, I would bet that Facebook has a record of what you typed and that you abandoned it. They could record and store your unshared thoughts, indexing and selling them even though you never shared. Logging into the Facebook site at all is a hazard to personal privacy. I have no proof of this last part, but I would fully expect a company like Facebook to do this very thing.

There is little that quitting Facebook will accomplish, since human personalities are quite fixed and constant constructs. We maintain that iron grip of control, Facebook has monetized it, and now, since Cambridge Analytica, they have lost control of it. Pandora's box is open.

So why stop using Facebook, then? Facebook only stops being evil when it gets caught, which means the intent is a stain that runs right to the core. I've abandoned Facebook because continued use is tacit approval of their offensive behavior; if that behavior makes them money through advertising revenue and I'm a part of it, that's personally unacceptable.

Going West With Facebook

Much like the elves in Tolkien's tales, sometimes the time is right to board the boats and head west. In this particular case, the question is what to do with Facebook.

I've been using Facebook since July 2nd, 2008. In the beginning it was wonderful: the sharing, and everyone seemed kinder, more conscientious. I suppose the world was better back then. Many people were looking for a new platform once LiveJournal collapsed, which, if we are really serious about it, came when Six Apart sold LiveJournal off to the Russians. Americans fled pretty much after that. And so, Facebook was a thing.

Back then it was mostly friends; the service hadn't taken off yet. Many of the later iterations that make Facebook what it is today hadn't even been thought of, and in a lot of ways it was better in the past. But then everyone started to join the service, and we started to learn about the ramifications and consequences of using Facebook. I can remember the feeling of betrayal when Facebook posts were printed out and handed to my workplace management. That really was the first lesson in privacy and the beginning of the end of my involvement with Facebook.

Facebook has been on-again, off-again for a while. In time I realized that I was addicted to the service and the sharing. With more time I realized that Facebook fit the profile of a mental illness more than an addiction. I had to stop, because in a very big way it was the service or my mental health.

So fleeing Facebook is the name of the game. First I downloaded all my content from the service, then I started to move my saved links from Facebook to Pocket for safekeeping. Then I went through and started hacking away at groups, pages, and apps. All of these tasks will be long-tailed; they'll take a while to polish off, because Facebook's tentacles run very deep, and just how deep they actually go is rather remarkable.

So now I'm looking at writing more and sharing more from my blog. This post is a waypoint to that end. I installed a new theme with some new featured images, and the next step is to figure out a "Members Only" area where I can separate the public from my friends. There are some items I intend to write about that use specific names, and I don't want to play the pronoun game with my readers. I also don't want hurt feelings or C&D notices, both of which my writing has created in the past.

I will detail my journey of disposing of Facebook here on this blog. I have turned off publicizing my posts to Twitter and Facebook, but I left G+ on, because G+ is a desert.

So, here we go!

Extracting Cisco Unity 10.5 Voicemails

In my work, I wear many hats. Amongst them is VOIP Manager. It's not really a job or a position, but it fits neatly under the heading of IT Manager, which is my actual title. I oversee the company's Cisco CallManager and Unity systems.

Occasionally, when coworkers of mine leave employment, they leave behind voicemails in their Unity mailbox. I've been searching for a long while for a convenient method to extract these voicemails out of Unity into some other format that could be easily moved around, so that other people could listen to the recordings and act on them.

I've tried a lot of options and run endless Google searches. I eventually discovered a rather involved method to acquire these messages, something I would categorize as "bloody hell," because it involves a lot of questionable hacking in order to procure the audio files.

The hack begins with the Cisco Disaster Recovery System, known as DRS. If you have Unity and CallManager set up, like I do, you have probably already established DRS and pointed it at wherever your backups live. In my case, I have DRS pointed at a share on my primary file server. So that's where you start: make sure DRS is running and that it has generated good backups. This method essentially backdoors the backup system to get at the recordings that Unity takes.

In my Unity folder I have two days' worth of backups, and the files you specifically need are 2018-02-19-20-00-07_CUXN01_drfComponent.xml and 2018-02-19-20-00-07_CUXN01_CONNECTION_MESSAGES_UNITYMBXDB1_MESSAGES.tar. Your filenames may be slightly different depending on what you named your Unity system. When I found these files, I didn't think anything of the XML file, but the tar file attracted my notice. I copied it to my MacBook and attempted to unpack it with bsdtar. It blew up. As it turns out, Cisco made a fundamental change to DRS after Unity 7: they started encrypting the tar files with a randomized key derived from the Cluster Security Password. My cluster is very simple, just Unity and CallManager, and I suppose also Jabber, but Jabber is worthless and so I often forget it exists. It would be too much to ask for them to use .tar.enc; no, it's just .tar, which confuses bystanders. That is pretty much the way of things at Cisco, I've grown to appreciate.
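
For the curious, this is roughly the check that blew up on the MacBook; the filename is the one from my backup set above, so yours will differ:

bsdtar -tvf 2018-02-19-20-00-07_CUXN01_CONNECTION_MESSAGES_UNITYMBXDB1_MESSAGES.tar
# fails outright, because post-Unity-7 DRS output is encrypted despite the plain .tar name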

The next tool you need comes from a site called ADHD Tech. Look for their DRS Backup Decrypter. It's a standalone Windows app, and you need it to scan the backup and extract the unencrypted tar data.

The next utility you will need is the DRS Message Fisher. Download that as well. I will say that this app has some rough edges, and one of them is that you absolutely have to run it in Administrator mode, otherwise it won’t function properly.

Start the DRS Message Fisher, select the decrypted tar file that holds your message archive, and then you can sort by user alias. Click on the right one, and it will present you with a list of all the voicemails that user has in that backup set. You would imagine that selecting all the messages would extract all the voicemails as individual files, but that is not how this application behaves. My experience is that you really should extract one message at a time, because the app dumps its saving folder after every request and cannot handle multiple selections, even though you can make multiple selections. It is also eight years old, so the fact that it functions at all is a miracle.

You start at the top, click the first message, and then "Extract Message Locally," which should open a window and show you the resulting WAV file. I learned that without Administrator mode you never get that folder; the app just opens your Documents folder and does nothing constructive. In case you need help finding it, look here:

C:\Program Files (x86)\Cisco Systems\DRS Message Fisher\TEMPMsgs

With the app in Administrator mode and a message selected, click the button mentioned above. This will open the TEMPMsgs folder and show you the WAV file. Click and drag it anywhere else to actually save it. Then advance to the next message and extract, and so on and so forth until you have all the messages out. There won't be any useful data in the filename, just a UUID, so I suppose we should be happy we are getting the audio at all and count our blessings.

Once you have all the WAV files you need, you can dump the voicemail account and move on.

What a mess. I look at Cisco and marvel at the configurability of both CallManager and Unity, but then watch them trip embarrassingly hard on things like this. Apparently nobody ever cared enough to address voicemail survivability and extraction. As far as I know, this overwrought and wretched procedure is the only way to meet that particular need. It goes without saying that this is wholly and completely unsupported by Cisco TAC, so save yourself the headache of running full-speed into that bulkhead.

In many ways, this solution, if you can call it that, is getting voicemail survivability by dumpster diving. Ah well, it's Cisco; complaining is very much like going down to the river and screaming at it to change course. You'd probably get further with the river.

Social Media Tent Flapping

I seem to vacillate between social media platforms these days. Since the collapse of my beloved Imzy, I've been kind of lurking about Facebook for a while. Facebook is rather unpleasant to use, mostly because the commentary is so awful. The only people to really blame are the people themselves; so many on the platform aren't interested in communication, just trolling. So I've been looking back at Google Plus, and while the posts still flow there and things are more intellectual, there's no audience.

Which brings me to my blog. I damn near forgot it existed, and then I discovered that it had been down for probably the last five months because of file permission errors on the host that I use, iPage.com. Once I corrected the issue, the blog came back, and with it some of the tools I use, like Blogo, which lets me write these posts in a convenient manner.

I also admit that moving to the Bear app has got me writing again in my journal, which I think is a really good thing. It appears that it may have spilled over into more activity for my blog. So if I’m paying for this hosting, I might as well use it.

I’d like to say there will be a steady stream of new articles. HAHAHAHAHAHAHAH. We’ll see about that. Maybe, maybe not.

Giving Chrome Some Pep

I've been using Google Chrome on my MacBook Pro for a long while, and I've noticed that some websites take a while to get moving. In some ways, it feels like the browser is panting, trying to catch its breath. So today, while trying to solve a work problem, I accidentally stumbled over a neat way to give my Chrome browser a bit of a performance boost. It seems to help most on interactive sites, like my work help desk or PNC online banking.

The trick is to create a small RAM drive on the system, copy the Chrome profile over, symlink to that profile so Chrome can find it, and then start using Chrome. As Chrome works, things like settings and cache data go to RAM instead of the disk in my MacBook Pro. Then I use rsync to copy the data into a backup folder, just in case the MacBook Pro suffers a kernel panic or something else that would dump the RAM drive.

There are a few pieces to this, mostly from scripts I copied off the network.

I copied a script called mount_tmp.sh and made only a few small adjustments, specifically changing the maximum RAM drive size to 512MB.
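
I don't have the original to republish, so here is a minimal sketch of what my mount_tmp.sh does, assuming the standard macOS hdiutil/diskutil tooling; the ChromeTmp volume name and the ~/tmp symlink are my own conventions, chosen to line up with the check-in and check-out scripts below.

#!/bin/bash
# mount_tmp.sh (sketch): create or tear down a 512MB RAM disk and expose it at ~/tmp
SIZE_MB=512
SECTORS=$((SIZE_MB * 2048))                    # hdiutil counts 512-byte sectors

if [ "$1" = "umount" ]; then
    diskutil eject /Volumes/ChromeTmp          # detach the RAM disk
    rm -f "$HOME/tmp"                          # drop the convenience symlink
else
    DEV=$(hdiutil attach -nomount ram://$SECTORS | tr -d '[:space:]')
    diskutil erasevolume HFS+ ChromeTmp "$DEV" # format and mount at /Volumes/ChromeTmp
    ln -sfn /Volumes/ChromeTmp "$HOME/tmp"     # so the scripts can refer to ~/tmp
fi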

Then I created two bash scripts: one to check the profile in to the RAM drive, and one to check it back out of the RAM drive and onto the disk. Since I wrote them from scratch, here they are:

check-in.sh


#!/bin/bash
# check-in.sh: mount the RAM drive, move the Chrome profile onto it, and leave a symlink behind
/Users/andy/mount_tmp.sh
mv /Users/andy/Library/Application\ Support/Google/Chrome/Default ~/tmp
ln -s /Users/andy/tmp/Default /Users/andy/Library/Application\ Support/Google/Chrome/Default
# seed a backup copy of the profile back on the disk
rsync -avp --delete /Users/andy/tmp/Default /Users/andy/Library/Application\ Support/Google/Chrome/Default_BACKUP
echo "Complete."

check-out.sh


#!/bin/bash
# check-out.sh: back up the profile, then move it off the RAM drive and back to the disk
rsync -avp --delete /Users/andy/tmp/Default /Users/andy/Library/Application\ Support/Google/Chrome/Default_BACKUP
rm /Users/andy/Library/Application\ Support/Google/Chrome/Default
mv /Users/andy/tmp/Default /Users/andy/Library/Application\ Support/Google/Chrome
# unmount the RAM drive now that the profile is safely back
/Users/andy/mount_tmp.sh umount
echo "Complete."
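
The order of operations matters: check in before Chrome launches, and check out only after Chrome has fully quit. My routine, run by hand, looks something like this (just a sketch, assuming the scripts live in the current directory):

./check-in.sh                # mount the RAM drive and relocate the profile
open -a "Google Chrome"      # browse as usual; settings and cache now live in RAM
# ...quit Chrome when done...
./check-out.sh               # sync a backup and put the profile back on the disk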

If you give this a shot as well, I would love to hear from you about your experiences with this little speed improvement hack! Hope you enjoy!

Moment of Geek: Raspberry Pi as Thermal Canary

A few days ago I ran into a problem at work. The small Mitsubishi air conditioner had decided to take a cooling nap in the middle of the day, so my office, which is also the machine room at work, was up around 85 degrees Fahrenheit. I was used to this sort of thing, summers bringing primary cooling systems to their knees, but this time I had a huge A/C unit in the ceiling that I had elected to leave in place rather than have removed, just in case. So I turned it on, set its thermal controller to 70 degrees, and the room temperature tumbled in about ten minutes. Right after the room temperature was back to normal, and I had service out to look at my little wall-mounted A/C unit, the damn thing started functioning normally again. The tables were turned on IT; this is exactly what happens to our users. They sit there and struggle, and then we arrive and the machines behave like nothing at all was wrong.

So I had the big A/C and its smaller wall-mounted unit both running overnight, and I faced a problem: I wanted to know the temperature in my machine room without having to buy a TempPageR device. I had one long ago, and it was rather expensive. I looked at my desk and noticed my Raspberry Pi just sitting there, doing nothing of consequence. I knew the Raspberry Pi had a CPU temperature interface hidden somewhere, and a brief cursory Google search turned up a website detailing how to use this exact feature from Python to write a temperature log and, optionally, graph it. It was mostly copypasta, adapting things I found online pretty much by copy and paste and hammering them here and there until they worked. I have programming skills, but they are rather dated and rusty, and I had never used Python specifically. Still, my first effort was successful: I got a one-second temperature logger in place. I was happily satisfied with my efforts, but I knew I would not be happy with Celsius, and I knew the reading was colored by the CPU in the Raspberry Pi itself, so the reported temperature was quite a bit higher than the room temperature.
I started to tinker, first searching for the equation to convert Celsius into Fahrenheit. Applying it to the CPU reading gave me 115 degrees. Then I turned on the big A/C unit, and its thermal controller displayed the ambient room temperature in Fahrenheit: 74 degrees. So I did some math and subtracted a constant 44 degrees from the converted CPU temperature, which "calibrated" it into a rough approximation of the room temperature. Some eagle-eyed readers may notice that my math is off, but after I moved the Pi over to the server stack, I had to adjust for a higher CPU temperature because it sits further away from the wall A/C unit. So now I had a one-second temperature logger. I turned on graphing, and the entire program crashed and burned because I wasn't running the application in an X Windows environment, so I tore the graphing library and code out; I was never going to use the graphing feature anyway.

That, of course, was not enough to replace the TempPageR device. I needed some alarm system to alert me to what was going on. I thought of several interfaces: email, SMS, iMessage, email-to-telephone-call cleverness, and each thought brought me up against a different version of the cliffs of insanity. I could probably have smashed and hacked my way to a solution involving some ghastly labyrinth of security settings and passwords hashed with special algorithms that are only available on ENIAC simulators that only run on virtualized Intel 8086 processors with the Slovenian language pack loaded and the Cyrillic character set; an arrangement that would have been an epic pain in the ass. But earlier in the day I had tripped over an app advertisement for Slack accepting incoming data from the Pingometer website. I have a Pingometer account, a free one, because I'm a cheap bastard; its single pinger externally checks my fiber optic connection at work, keeping AT&T on their toes when it comes to outages. The Pingometer site uses incoming Slack webhooks. An incoming Slack webhook comes from some source making a really simple web call over HTTP: it wraps JSON in an HTTP request and sends it to Slack's servers. Slack then does everything needed to make sure the message is pretty and ends up on the right channel, on the right team. This was my alert mechanism.
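
To give a sense of how simple that call is, this is the sort of request an incoming webhook accepts; the message text here is made up, and the URL is just the placeholder pattern, since Slack hands you the real address when you create the webhook:

curl -X POST \
     -H 'Content-Type: application/json' \
     -d '{"text": "Machine room is at 92F"}' \
     https://hooks.slack.com/services/####/#####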

So I did another Google search, found the intersection of Linux, Python, and Slack, did some more copypasta and some tinkering, and I had a Python app that logged the room temperature in degrees Fahrenheit and made my Slack a noisy mess, since it was sending incoming webhook requests every second. One more tweak, a super-simple if-block that set my high-temperature mark at 90 degrees F, and I let it go.


There is something satisfying about being able to hack something together, cobble it really, and have it work without blowing up the terminal, blowing up Slack, or otherwise failing. So now I have a $35 Raspberry Pi running as a rough temperature alarm; it sends alerts to Slack and lets me and my system admin know at the same time. I'm quite happy with how it all worked out. No obnoxious email settings, ports, security frameworks, or awkward and obtuse hashing routines, just a single JSON-formatted HTTP call and BAM, all set. An alarm with a date, a time stamp, and a temperature, delivered right onto my iPhone with automatic notifications from Slack, so it can wake me up if need be.

So anyways, without further ado, here is the code:


from gpiozero import CPUTemperature
from time import sleep, strftime, time
import json
import requests

# Set the webhook_url to the one provided by Slack when you create the webhook
# at https://my.slack.com/services/new/incoming-webhook/
webhook_url = 'https://hooks.slack.com/services/####/#####'

cpu = CPUTemperature()

def write_temp(temp):
    # Append a timestamped reading to the CSV log
    with open("cpu_temp.csv", "a") as log:
        log.write("{0},{1}\n".format(strftime("%Y-%m-%d %H:%M:%S"), str(temp)))
    # Anything over 90 degrees F also goes to Slack via the incoming webhook
    if temp > 90:
        slack_data = {'text': "{0},{1}\n".format(strftime("%Y-%m-%d %H:%M:%S"), str(temp))}
        response = requests.post(
            webhook_url, data=json.dumps(slack_data),
            headers={'Content-Type': 'application/json'}
        )
        if response.status_code != 200:
            raise ValueError(
                'Request to slack returned an error %s, the response is:\n%s'
                % (response.status_code, response.text)
            )

while True:
    temp = cpu.temperature
    temp = (9.0/5.0 * temp + 32) - 44   # C to F, then the rough -44 "calibration" offset
    write_temp(temp)
    sleep(1)
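
To keep it running on the Pi after I close my SSH session, something along these lines works; temp_alarm.py is just my assumed name for the script above, and the package names are the stock Raspbian ones for the two libraries it imports:

sudo apt-get install python3-gpiozero python3-requests   # libraries the script imports
nohup python3 temp_alarm.py >/dev/null 2>&1 &             # keep logging after logout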


It has been forever since I’ve needed to program anything. Once I was done, and I saw it work the way I wanted it to, I was quite happy with myself. I haven’t felt this particular sense of accomplishment since my college years. It was quite a welcome feeling.

Walking Down Memory Lane

Some notable events from other July 1sts:

2003 – Installed a network-aware fax machine and attached it to GroupWise. My god, GroupWise. This is such a walk down memory lane! This was also the first appearance of a recurring theme: shared online mailboxes at work upset people because they aren't "private," in the same way that a regular fax machine is "private" only if you hover over it and muscle out anyone else who might try to use it. It begs the question: what are you transmitting at work that is so "private" you shouldn't be doing it at, say, a FedEx shop or Office Depot?

2003 – Toppenish, Washington was in the news because a keyword blocker at a library got upset when it found something it didn't approve of in the text of a domain name itself. Nowadays we don't search domains for text fragments; we actually categorize them.

2004 – Again with the fax machine. In this case, the line had no long distance, which required an AT&T calling card and a 60-digit dialing sequence just to send a fax far away. And the merry mix-ups when people who work for an institution of higher learning demonstrate no higher learning by being unable to comprehend digits. Ah, those were the days.

2004 – Fahrenheit 9/11. Hah, those were the days, weren't they? When scandals felt rare and maybe all the crazy conspiracy theories were just theories. Oh, the memories.

2006 – Sharing the photos of the bathroom rebuild. It was a long while ago that we tore the guts out of that bathroom and updated it.

2007 – At O’Hare, running through security, on my way to visit family in Syracuse.

2008 – Another trip to Syracuse. This time through Detroit.

2009 – The problem with the cloud is poor security and access points everywhere. What happens when people plant incriminating evidence via a route, like junk mail, that you pay very little attention to – and then make an anonymous tip about the evidence? It was an interesting consideration and helps reinforce how important it is to keep everything digital tidy.

2013 – I wrote a lot of things about the security threat that our very own NSA represents. Little did I know that in 2017 the tools they collected and wrote would leak out and turn into the WannaCry ransomware attack. Thanks, NSA!

2015 – Facebook notifications got an enhancement: they can accept a GPG public key, so all the Facebook notifications sent over email are encrypted. This was a really good proof of concept from one of the world's biggest Internet sites; alas, it won't ever take off, because GPG is an all-or-nothing technology, and since you aren't going to have all, all you get is nothing. It was also the day I gave a lot more thought to The Golden Rule and started to reshape my life around it as a moral compass.


Blogo Test

One of the biggest headaches with my WordPress blog is remembering to write new posts with any frequency. It sometimes comes down to a test of many editors: which one do I like, and how smooth is the path from story to published blog post? Email is a pain, mostly because the instrumentation to add all the extra bits is rather annoying, and I don't really want to revisit and mark up a blog entry after I've written it. After that, I looked at BBEdit, the big brother to TextWrangler. The folks who wrote that application provided a free trial of BBEdit and gamely informed me that TextWrangler was a dead duck. I never really engaged with BBEdit enough to think of it as a blogging tool, and TextWrangler is still pretty good for what I need.

Since I've had this blog for a long while, and I feel a little bad about neglecting it, perhaps it's time to dust off the old blogging tools and hop back into it. Facebook is the Walmart of social media: it's everywhere you look, but you feel dirty using it, because you know every time you walk in you're being measured, indexed, and categorized.

Facebook, like Walmart, brings a ready audience to the party, people who are happy enough to just waddle through and notice what you have to write and maybe drop a like down. Blogging has always been longer-form than Facebook, and way longer than Twitter. Plus since I host my own blog on my own domain, I can write things here that I wouldn’t or can’t write in other places.

So this test will see how well this little app called Blogo works on my MacBook Pro. If it’s good, we’ll likely have more stories appear in the blog in the future.

DayOne 2.0 to Evernote Migration

Years ago I started to write a personal journal. Mostly I used Microsoft Word, sometimes other text editors, and I've always been on the search for a better way to conduct my journaling habit. When I started using Apple's Mac, I discovered Bloom Built's DayOne journaling software. DayOne came highly recommended on many pro-Mac websites, so I bought in. Everything went fine for a while, and then I noticed that the authors were going to release DayOne 2.0. I eagerly jumped on board with the update and forged ahead. The feature set was welcome: multiple journals, a more refined syncing experience with an online service run by the developer (it turned out to be AWS), and I was rather happy. The new syncing system also came with extensions for one of my favorite online time-savers, IFTTT. What more could I ask for?

Then the updates started. I noticed in my Mac App Store application that there was an update listed for DayOne 2.0, and when I clicked on it, the system acted as if there were no update present, a deviation from expected behavior that I took for some glitch or bug. So I dived into the problem and searched Google for hints. There were so many options in the Google index that I figured one of them should solve my problem. By the end, I had completely roto-rootered the entire Mac App Store. I tried one last time, and still DayOne 2.0 refused to update. I gave up, figuring it was something that maybe a reinstallation of the operating system would solve, because this sort of unexpected behavior doesn't happen with Apple products in my common experience. Then, resistant to being beaten by a bug, I forced the issue with the App Store and tried to download the DayOne 2.0 update directly. I discovered to my chagrin that the update required the next edition of the Mac OS, El Capitan. I have a vested interest in staying with Yosemite; I'm happy with my MacBook on Yosemite, so why should I upgrade the OS to satisfy an application?

The next injury came shortly after that. While using DayOne 2.0, I was rather miserable because the software was so downright sluggish. I would type, and the application would pinwheel or pause, and then, in a blur, all my words would spill onto the display at the same rate at which I had typed them. I wasn't getting the instant response to the keyboard that I was expecting. I verified that other applications behaved properly; TextWrangler, for example, behaves perfectly to my expectations, so this wasn't a system problem, it was a DayOne 2.0 problem. Previously I had sprung for a copy of Little Snitch on my Mac to help me better control my network interfaces, and Little Snitch has options to block an application from accessing the network. So on a lark, I tested the sluggish DayOne 2.0 by blocking its network access with Little Snitch. It was like flipping a light switch! The sync component was broken, showing a red exclamation mark, but man, text entry was back to normal, and tag entry was super quick as well. I didn't have to wait for pinwheel after pinwheel to get where I was going. I wanted to journal, to get my text into the system for safekeeping and remembering. So for a while, I would use Little Snitch to hobble DayOne 2.0 so I could use the application the way I wanted to, the way I expected to. I then wrote to Bloom Built and asked if they would update the application for users who didn't want to march forward to El Capitan or Sierra, and they declined. It was a long shot, but I figured it was in their best interest to address the largest group of users, which would presumably include people on Yosemite. It wasn't to be.

So then, after thinking about it for a while and growing weary of the rather extended procedure of getting Little Snitch to block DayOne 2.0's broken sync routines, I made the fateful decision to switch my journaling to Evernote. Why Evernote? Because it was on all my devices, just like DayOne 2.0 (at least the Mac ones), and Evernote already had integrations with IFTTT, so that was set. Evernote was something I knew, and its syncing routines are significantly better than DayOne's. That has to be tempered by the fact that sometimes Evernote's syncing breaks too, but the occasional hard-to-diagnose sync error beats a broken sync routine that throws pinwheels whenever you type text or enter tags, as DayOne 2.0 does. Evernote also has one extra feature that wasn't part of the decision but that I'm glad of now that I've made the switch: you can highlight text in Evernote and encrypt it with AES. This is something DayOne 2.0 had promised, but by all appearances they were dragging their heels on journal security.

I then started writing all my new journal entries in Evernote. That was very straightforward. However, I left about 11,000 entries behind in DayOne 2.0, so I started looking at ways to get that data out. There are a few options: export to text, PDF, HTML, or JSON. So I started extracting entries from my DayOne 2.0 journal and trying to import them into Evernote. What I wanted was for individual entries to move over to Evernote and remain individual entries there. Everything that comes out of the DayOne 2.0 exporter comes out in chunks: one big HTML file, one big PDF file, one big JSON file, one big text file. There is no easy way to get individual entries out one at a time unless you want to manually slog through every single entry, and at 11,000 entries that wasn't going to happen; I have no patience for that. So I started looking at ways to hack apart my DayOne 2.0 exports, since the people who wrote DayOne 2.0 didn't provide anything helpful, and all the other tools I found online were written solely for DayOne 1.0, which I couldn't use; I didn't have a Journal.dayone file, I had an AWS-hosted JSON chunk. So the hackathon commenced. HTML was a giant headache, since there isn't any way to easily split HTML into chunks, syntactically speaking, at least not with the data that DayOne 2.0 exports. The PDF was a mess, one immense PDF with the text in 8-point type; it would be fine if I were 20 years old and didn't mind slogging through a monolithic PDF file looking for a date. I even tried to hack around the JSON in my limited way. I got the JSON out to CSV, but then realized that my instinct to make the CSV a data source for a mail merge, and mail-merge my journal out into individual entries, was going to be a bust: Macs don't do mail merge at all. I made peace with that a long while ago, not that I ever had any work that needed mail merge. So there was only one format left, the most basic format: text.

DayOne 2.0 spits out a journal as one monolithic text export file, so I had to figure out how to hack this text file into pieces. I spent a long while in the bash terminal, screwing around with csplit and discovering the subtle differences between Apple's implementation of csplit and GNU's. After a night of blind hacking, I gave up on csplit. Of course, by this time I had also given up on DayOne 2.0; it wasn't the application I wanted anymore. My feelings had soured against the developer for only going so far with their export code and leaving the rest for me to hack out on my own. I was irritated and felt cheated that they didn't go one step further and include an "export individual entries" checkbox somewhere. But I got over my funk; the bridge was burned and there was no reason to keep complaining about it. I was moving to Evernote, and Bloom Built was pretty much post-fire, all sad ashes soaked with water. Nights of searching and hacking on this monolithic text file, and I eventually found the solution. The first step comes with Perl:

#!/usr/bin/perl
# split a DayOne 2.0 text export into one file per entry, breaking on the "Date:<tab>" key

undef $/;            # slurp the whole file at once
$_ = <>;
$n = 0;

for $match (split(/Date:\t/)) {
    open(O, '>temp' . ++$n);
    print O $match;
    close(O);
}

This little script is something I found through Google; I'm far too lazy to hack it out on my own, if I'm brutally honest. The keyword in this monolithic text file is "Date:" followed by a tab character; every DayOne 2.0 entry starts with this key. So, export my DayOne 2.0 journal to Journal.txt, and then run the script against it: ./split.pl Journal.txt. Perl tears the file into perfect chunks ready for action. But the files are named temp1, temp2, temp3, and so on. Two more lines add the crowning bits to each file: the first tacks on a .txt extension, and the second grabs the first line of each file and makes that line the new filename. In DayOne 2.0, the first line is the date line, so now my entries have their dates as their filenames. This is just a compromise; I would have much preferred to have the dates preserved in the file metadata, but hey, you get what you get:

for f in temp*; do mv "$f" "$f.txt"; done
for f in temp*; do mv "$f" "$(head -n 1 "$f").txt"; done

So for my test journal, I exported from DayOne 2.0 to text, chopped it all up with Perl, and used the bash shell to hack the names into something I was happy with. Then I lassoed the entire batch of files and dragged them into Evernote. Once I had done this for all my journals, I closed DayOne 2.0 and left it alone. There is no point in trashing it; let it dwell on in a ghostly non-life for all I care. Evernote at least behaves properly when it comes to text entry and tag entry, and the syncing routines are better. Plus, Evernote will never abandon me the way Bloom Built did. They'll never stop updating Evernote for Yosemite, or if they do, it'll be so far down the pike that I'll have a new laptop by then, and all of this will be just so much foolish wrangling anyway.

In the end, I won. I won against an annoying choice made by a company I used to love; I won against a file format that seems so dumb; and I was able to shoehorn years of journaling into a new home without abandoning my past or making it so annoyingly inaccessible that it would amount to the same thing.

If you find yourself interested in switching from DayOne 2.0 to Evernote, this is one way to do it. There may be better ways, clever ways to convert the JSON to the Evernote import file format, perhaps, but I didn't care enough to slog through JSON; this is my way, in all its dumb glory. Of course, my tags from DayOne 2.0 are shot, and tagging in Evernote is a manual affair, so that was another little compromise. Perhaps if I have dull weekends or evenings, I can hack the tags back in over time. Having the entries and losing the tags is an acceptable loss. At least I no longer need to force Little Snitch to break DayOne 2.0 just so I can use it. Heh, that still makes me shake my head in disbelief. That you have to do it this way at all is such a mess.

New Year Resolutions

This new year I resolved to be done with Twitter, Facebook, and Reddit. I had abandoned Twitter a long time ago, Reddit was easy since I was never really invested in that platform anyway, and most recently I left Facebook behind.

That needs a little characterization. I haven't deleted my Facebook account, but I have ceased to engage on the platform. I still check in once a week, just to mop up my timeline recommendations from people putting my name on their posts and otherwise to establish a heartbeat there, so that the people who follow me on the service notice that I still live. I suppose that eventually even bothering with the heartbeat updates will grow tiresome and I'll give up on that as well.

I have instead moved my entire social networking existence to a new service called Imzy. It's at imzy.com, and I encourage everyone to join me there. There are some pretty good AUP rules in place, and communities can add extended rules built on top of the site's core AUP. Imzy is a perfect place to have real discussions with people online. There is a culture on Imzy that I haven't found anywhere else, a real lack of trolling, and witnessing it is what led me to dump Facebook.

I don't know what this means for this blog. Imzy is a great platform all on its own, and when it comes to blogging, my community there has a lot of features that my blog can't match. That sense of community is what's missing from a lot of services, including my blog. This blog is mostly a billboard for me to yell at the darkness; there aren't any real conversations going on here, unlike on Imzy.

I figure if I don’t post more blog entries I may just archive all of this stuff and shutter the service completely. Then again, I may just be lazy and let this blog long-tail it to eternity. Only time will tell.