Extracting Cisco Unity 10.5 Voicemails
In my work, I wear many hats. Amongst these is VoIP Manager. It’s not really a job, or even a position, but it fits neatly under the heading of IT Manager, which is my actual title. I oversee the company’s Cisco CallManager and Unity systems.
Occasionally, coworkers of mine leave employment, and they sometimes leave behind voicemails in their Unity mailbox. I’ve been searching for a long while for a convenient method to extract these voicemails out of Unity into any other format that could be easily moved around, so that other people could listen to the recordings and get somewhere with them.
I’ve tried a lot of options and run endless Google searches, and I eventually discovered a rather involved method to acquire these messages. It is a method I would categorize as “bloody hell” because it involves a lot of questionable hacking in order to procure the audio files.
The hack begins with the Cisco Disaster Recovery System, known as DRS. If you have a Unity and CallManager system set up, like I do, you have probably already established the DRS and pointed it somewhere your backups live. In my case, I have the DRS pointed at a share that lives on my primary file server. So that’s where you start: make sure that DRS is running and that it has generated good backups. This method essentially backdoors the backup system to get at the recordings that Unity takes.
In my Unity folder, I have two days’ worth of backups, and the files you need specifically are 2018-02-19-20-00-07_CUXN01_drfComponent.xml and 2018-02-19-20-00-07_CUXN01_CONNECTION_MESSAGES_UNITYMBXDB1_MESSAGES.tar. Your filenames may be slightly different depending on what you named your Unity system. When I found these files, I didn’t think anything of the XML file, but the tar file attracted my notice. I copied it to my MacBook and attempted to unpack it with bsdtar. It blew up. As it turns out, Cisco made a fundamental change to DRS after Unity 7: they started encrypting the tar files with a randomized key derived from the Cluster Security Password. My cluster is very simple, just Unity and CM, and I suppose also Jabber, but Jabber is worthless and so I oftentimes forget it exists. They couldn’t just use .tar.enc, no, it had to be plain .tar, which confuses bystanders. That is pretty much the way of things at Cisco, I’ve grown to appreciate.
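You don’t have to take my word on the blowing up; just try to list the archive. A minimal sketch, with the exact error text varying by tar version:

tar -tvf 2018-02-19-20-00-07_CUXN01_CONNECTION_MESSAGES_UNITYMBXDB1_MESSAGES.tar
# bsdtar bails out with an error instead of a file listing,
# because what looks like a tar archive is actually encrypted data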
The next tool you need comes from a site called ADHD Tech. Look for their DRS Backup Decrypter. It’s a standalone Windows app, and you need it to scan the backup and extract the unencrypted tar data.
The next utility you will need is the DRS Message Fisher. Download that as well. I will say that this app has some rough edges, and one of them is that you absolutely have to run it in Administrator mode, otherwise it won’t function properly.
Start the DRS Message Fisher, select the decrypted tar file that holds your message archive, and then you can sort by your users’ aliases. Click on the right one, and it will present you with a list of all the voicemails that user has in the backup set. You would imagine that selecting all the messages would extract each voicemail to its own file, but that is not how this application behaves. My experience is that you really should extract one message at a time, because the app empties its saving folder after every request and cannot handle multiple selections, even though it lets you make them. It is also eight years old, so the fact that it functions at all is a miracle.
You start at the top, click the first message, and then “Extract Message Locally,” which should open a window and show you the resulting WAV file you need. I learned that without Administrator mode you never get that folder; it just opens your Documents folder and does nothing constructive. In case you need help finding it, look here:
C:\Program Files (x86)\Cisco Systems\DRS Message Fisher\TEMPMsgs
With the app in Administrator mode and a message selected, click the button mentioned above. This will open the TEMPMsgs folder and show you the WAV file. Click and drag this anywhere else to actually save it. Then advance to the next message and extract, and so on and so forth until you have all the messages out. There won’t be any useful data in the filename, just a UUID, so I suppose we should be happy we are getting the audio at all and count our blessings.
Once you have all the WAV files you need, then you can dump the voicemail account and move on.
What a mess. I look at Cisco and marvel at the configurability of both CallManager and Unity, then watch them trip embarrassingly hard on things like this. Apparently nobody ever cared enough to address voicemail survivability and extraction. As far as I know, this overwrought and wretched procedure is the only way to meet that particular need. It goes without saying that this is wholly and completely unsupported by Cisco TAC, so save yourself the headache of running full-speed into that bulkhead.
In many ways, this solution, if you can call it that, is getting voicemail survivability by dumpster diving. Ah well, it’s Cisco, complaining is very much like going down to the river and screaming at it to change course. You’d probably get further with the river.
Percentile Taxation and Citizenship
The media is awash in talk about socialized healthcare, taxation, and immigration. I do not claim to be an expert in any of this, and something like this would probably not work out, but on a lark I started a kind of brainstorming session about how I might solve the taxation problem and the issues surrounding citizenship.
The opening gambit is socialized healthcare, also known as Single Payer. Let’s just call it healthcare moving forward as a shortcut for what we’re really talking about. The next series of moves are crafted as a kind of chess game, with different pieces being different cliched arguments:
- We don’t want the poorest to suffer and die, it doesn’t conform to the moral standard of the three faiths, so we must act.
- It is very expensive for some, and thoughtlessly free for others. The old are expensive, the young are not, and there are outliers everywhere.
- The government is already up to its neck in debt, how can we saddle ourselves with more?
- Nobody can escape any of the above points; otherwise, they will appear to be hypocrites.
The challenge of healthcare is how to pay for it. Healthcare is rather expensive at the start, but in the long term it is actually cheaper than what we have right now. How can we afford such a thing as a society and also keep many of the other services that we have come to expect from our government? The best answer, and the most common one, is to reformulate taxation.
Taxation
It seems as if the tax code is the most complicated subject in all of government. We keep making attempts to address what is fair and just, and depending on the political winds, it changes from generation to generation. I am not going to make any claims of practicality; this is brainstorming, not policy.
There have been many plans over the years: flat taxes, graduated taxes, and economic theories such as the trickle-down economics that has featured for my entire life, since 1975. This plan is just another possibility, and I don’t know whether it would actually work out, but it was the first thing I thought of, which launched this blog post.
How about a taxation plan based on percentiles? You take all citizens who are not disabled and list out all their incomes, data the IRS already has. Then you order everyone from smallest income to largest. We dispense with all the tax loopholes and resolve to simplify everything down to raw income. For businesses, we do the same, with their profits ranked from smallest to largest. From there, we calculate the percentile rank across the entire gamut for both classes of entity, people and corporations. Those at the bottom pay next to no tax, while those at the very top pay almost all of the tax. The percentile rank makes calculating where you sit rather easy. The IRS can calculate this value and send out a postcard letting you know. Since there are no more loopholes, there is no more need for complicated forms and instructions. Withholding is done by employers, the IRS settles all accounts, and every April 15th you either get a bill or a check.
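The rank math really is the easy part. A toy sketch, assuming a hypothetical file incomes.txt with one raw income per line:

sort -n incomes.txt | awk '{ income[NR] = $0 }
END { for (i = 1; i <= NR; i++) printf "%s\t%.1f\n", income[i], 100 * i / NR }'

Each income comes back paired with its percentile rank, and the tax rate would then be some function of that second column.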
Everything with taxation is wrapped up in politics, and because of that, there are arguments for and against any sort of change. The most common retort to a change like this, where the rich would have to pay an exceptionally high tax, is that they would just leave the country to avoid it. So then we come to the next section…
Citizenship
We are all citizens of the United States of America. Many of us acquired that citizenship by nativity. We didn’t do anything to earn or deserve it other than have the luck to be born in the right place at the right time. Currently, citizenship and immigration are hot-button issues. Many people want to come to the United States, and so over time we have started to reduce and control immigration to our country. Very recently, I have noticed a rather unpleasant nationalistic nativism adding new discrimination to this process. We aren’t holding the lamp beside the golden door so much as screening a line-up and cherry-picking the very best to join our country.
Citizenship provides rights, privileges, and abilities that people without US Citizenship may not have. We have started to covet this citizenship both economically and culturally. It is something we have, and something we want to keep to ourselves. What lies at the heart of citizenship? We are all part of a greater whole, the dream of America, and we get immense benefits from that, and so we must meet the cost by paying taxes. Taxes pay for citizenship and civilization. If you want to play, you have to pay. Also, if you want to pay, you can play.
Those that wish to immigrate to the United States should be willing to agree to our taxation. Those that do not agree with our taxation should be excluded from citizenship. They are unwilling to pay for it, so why should they take advantage of it? So, when it comes to the Dreamers, they are all paying taxes so they can be citizens. If someone who is exceptionally rich doesn’t want to pay taxes, they can abdicate their responsibilities to society at the cost of their citizenship. They can, of course, re-acquire the citizenship as easily as anyone else, by agreeing to pay taxes based on their income.
Final Thoughts
I don’t really suppose any of this would be actually practical, but amidst all the arguments currently being discussed, why not at least touch on these ideas? In the current political climate, there is an exceptional number of interested parties, and the quality of discourse is more varied than it has ever been before. I’m sure if anyone reads this post, they will have strong responses, and I welcome the commentary, but I reserve the right not to respond if there is no point to it. As I said before, this is not policy, this is brainstorming. Please keep that in mind if you are upset.
Social Media Tent Flapping
I seem to vacillate between social media platforms these days. Since the collapse of my beloved Imzy, I’ve been kind of lurking about Facebook for a while. Facebook is rather unpleasant to use, mostly because the commentary is so awful. The only people to really blame are the folks on the platform, who aren’t really interested in communication, just trolling. So I’ve been looking back to Google Plus, and while the posts are still flowing there and things are more intellectual, there’s no audience.
Which brings me to my blog. I damn near forgot it existed, and then I discovered that it had been down for probably the last five months because of file permission errors on the host that I use, iPage.com. Once I was able to correct the issue, the blog came back, and with it some of the tools I use, like Blogo, which lets me write these posts in a convenient manner.
I also admit that moving to the Bear app has got me writing again in my journal, which I think is a really good thing. It appears that it may have spilled over into more activity for my blog. So if I’m paying for this hosting, I might as well use it.
I’d like to say there will be a steady stream of new articles. HAHAHAHAHAHAHAH. We’ll see about that. Maybe, maybe not.
Journal Migration
Just as I had migrated from Day One to Evernote, I got really tired of Evernote and its bloated sluggishness. So I moved my journal again, this time to the Bear app. The application is really quite useful and a joy to use on my MacBook. I tried to sync my journal to my iOS devices, but I had less luck with that. I am, however, getting pretty top-notch support from the people who write the app, so in that regard it’s working out really well. I can keep using the platform with hope of an app fix for what ails my journal, as long as that hope lasts.
One of the most compelling parts of the Bear app is its tagging system. It’s almost the perfect thing, but as I once wrote to Evernote in an enhancement request, I would like tags to optionally act as indexes as well. What I mean is that when I make a tag, there would be an optional checkbox or slider for making it an index entry. Then, when I create a new index entry, the software scans the content of my journal for that tag and, wherever it finds it, adds the tag to the entry, as sketched below. So far, I haven’t found any apps that do that and sync across devices. But so far, Bear is nice to work with. If you are interested in seeing what it is like, you can get it for free from the Mac App Store.
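You can fake the index idea outside of any journaling app with a shell one-liner. A rough sketch, assuming plain-text notes in the current folder and “travel” as a hypothetical index tag:

for f in *.txt; do grep -q -i "travel" "$f" && echo "#travel" >> "$f"; done

Every note that mentions travel picks up a #travel tag appended to the end, which is roughly what I wish the journal software would do for me automatically and then sync.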
Giving Chrome Some Pep
I’ve been using Google Chrome on my MacBook Pro for a long while, and I’ve noticed that some websites take their time getting moving. In some ways, it feels like the browser is panting, trying to catch its breath. So today, while trying to solve a work problem, I accidentally stumbled over a neat way to give my Chrome browser a little boost in performance. It seems to help most on sites that are interactive, like my work help desk site or PNC online banking.
The trick is to create a small RAM drive on the system, copy the Chrome profile over, symlink to that profile so Chrome can find it, and then start using Chrome. As Chrome works, things like settings and cache data go to RAM instead of the HD in my MacBook Pro. I then use rsync to copy the data into a backup folder, just in case my MacBook Pro suffers a kernel panic or anything else that would dump the RAM drive.
There are a few pieces to this, mostly from scripts I copied off the network.
I copied the script called mount_tmp.sh and made only a few small adjustments, specifically changing the maximum RAM drive size to 512MB.
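I won’t reproduce the whole script here, but the heart of it looks something like this sketch, with the 512MB size expressed as 1,048,576 512-byte sectors and ~/tmp as the mount point (my assumptions for illustration, not the original script verbatim):

#!/bin/bash
# mount_tmp.sh -- rough sketch of a macOS RAM drive helper, not the original script
SECTORS=1048576            # 512MB expressed in 512-byte sectors
MOUNTPOINT="$HOME/tmp"

if [ "$1" = "umount" ]; then
    # find the ram device backing the mount, unmount it, and free the memory
    DEV=$(mount | grep "$MOUNTPOINT" | awk '{ print $1 }')
    umount "$MOUNTPOINT"
    hdiutil detach "$DEV"
else
    # attach a raw RAM device, put a filesystem on it, and mount it at ~/tmp
    mkdir -p "$MOUNTPOINT"
    DEV=$(hdiutil attach -nomount ram://$SECTORS)
    newfs_hfs -v tmp $DEV
    mount -t hfs $DEV "$MOUNTPOINT"    # may need sudo depending on your setup
fi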
Then I created two different bash scripts, one to check the profile in to the RAM drive and one to check it back out to the HD. Since I wrote them from scratch, here they are:
check-in.sh
#!/bin/bash
# mount the RAM drive at ~/tmp
/Users/andy/mount_tmp.sh
# move the Chrome profile onto the RAM drive and leave a symlink behind
mv /Users/andy/Library/Application\ Support/Google/Chrome/Default ~/tmp
ln -s /Users/andy/tmp/Default /Users/andy/Library/Application\ Support/Google/Chrome/Default
# take an initial backup of the profile on the HD
rsync -avp --delete /Users/andy/tmp/Default /Users/andy/Library/Application\ Support/Google/Chrome/Default_BACKUP
echo "Complete."
check-out.sh
#!/bin/bash
# take a final backup of the profile
rsync -avp --delete /Users/andy/tmp/Default /Users/andy/Library/Application\ Support/Google/Chrome/Default_BACKUP
# remove the symlink and move the profile back to the HD
rm /Users/andy/Library/Application\ Support/Google/Chrome/Default
mv /Users/andy/tmp/Default /Users/andy/Library/Application\ Support/Google/Chrome
# unmount and free the RAM drive
/Users/andy/mount_tmp.sh umount
echo "Complete."
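Usage is then just bookends around a browsing session, something like this, with Chrome quit before either script runs:

./check-in.sh      # before starting Chrome: profile moves to the RAM drive
./check-out.sh     # after quitting Chrome: profile moves back to the HD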
If you give this a shot as well, I would love to hear from you about your experiences with this little speed improvement hack! Hope you enjoy!
Moment of Geek: Raspberry Pi as Thermal Canary
A few days ago I ran into a problem at work. The small Mitsubishi air conditioner had decided to take a cooling nap in the middle of the day, so my office, which is also the machine room at work, was up around 85 degrees Fahrenheit. I was used to this sort of thing, summers bringing primary cooling systems to their knees, but this time I had a huge A/C unit in the ceiling that I had elected to leave in place, just in case. So I turned it on, set its thermal controller to 70 degrees, and the room temperature tumbled in about ten minutes. Right after the room temperature was normal, and I had service out to look at my little wall-mounted A/C unit, the damn thing started functioning normally again. The tables had turned on IT; this is exactly what happens to our users. They sit there and struggle, then we arrive, and the machines behave as if nothing at all was wrong.
So I had the big A/C and its smaller wall-mounted sibling both running overnight, and I faced a problem: I wanted to know the temperature in my machine room without having to buy a TempPageR device. I had one long ago, and it was rather expensive. I looked at my desk and noticed my Raspberry Pi, just sitting there, doing nothing of consequence. I knew from a brief Google search that the Raspberry Pi had a CPU temperature interface hidden somewhere, and I was happily surprised to find a website detailing how to use this exact feature from the Python programming language to write a temperature log, and optionally graph it. My approach was mostly copypasta, adapting things I found online by copy and paste and hammering them here and there until they worked. I have programming skills, but they are rather dated and rusty, and I had never used Python specifically. My first effort was a success: a 1-second temperature logger. I was rather satisfied, but I knew I would not be happy with Celsius, and I also knew the reading was colored by the CPU in the Raspberry Pi itself, so the reported temperature was quite a bit higher than the room temperature.
I started to tinker, first searching for the equation to convert Celsius to Fahrenheit. With that in place, the Pi read 115 degrees. Then I turned on the big A/C unit, whose thermal controller displays the ambient room temperature in Fahrenheit: 74 degrees. So I did some math and subtracted a constant 44 degrees from the CPU temperature, which “calibrated” it to a rough approximation of the room temperature. Some eagle-eyed readers may notice that my math is off, but after I moved the Pi over to the server stack, I had to adjust for a higher CPU temperature because it sits further away from the wall A/C unit. So now I had a 1-second temperature logger in Fahrenheit. Then I turned on graphing, and the entire program crashed and burned because I wasn’t running the application in an X Windows environment, so I tore out the graphing library and code; I was never going to use that feature anyway.
That, of course, was not enough to replace the TempPageR device. I needed some alarm system to alert me to what was going on. I thought through the interfaces: email, SMS, iMessage, email-to-telephone-call cleverness, and each thought brought me up against a different version of the cliffs of insanity. I could probably have smashed and hacked my way to a solution involving some ghastly labyrinth of security settings, with passwords hashed by special algorithms that are only available on ENIAC simulators that only run on virtualized Intel 8086 processors with the Slovenian language pack loaded and the Cyrillic character set, an arrangement that would be an epic pain in the ass. But earlier in the day I had tripped over an advertisement for a Slack app that could accept incoming data from the Pingometer website. I have a Pingometer account, a free one, because I’m a cheap bastard; its single pinger externally checks my fiber optic connection at work, keeping AT&T on their toes when it comes to outages. Pingometer talks to Slack through incoming webhooks. An incoming Slack webhook comes from some source that makes a really simple web request over HTTP: it wraps JSON in an HTTP POST and sends the request to Slack’s servers. Slack then does everything needed to make sure the message is pretty and ends up on the right channel, on the right team. This was my alert mechanism.
So I did another Google search, found the intersection between Linux, Python, and Slack, and after some more copypasta and tinkering I had a Python app that logged the room temperature in degrees Fahrenheit and made my Slack a noisy mess, since it was sending incoming webhook requests every second. One more tweak, a super-simple IF-THEN block: set the high-temperature mark at 90 degrees F and let it go.
There is something satisfying about being able to hack something together, cobble it really, and have it work without blowing up the terminal, blowing up Slack, or otherwise failing. So now I have a $35 Raspberry Pi running as a rough temperature alarm that sends alerts to Slack, letting my system admin and me know at the same time. I’m quite happy with how it all worked out. No obnoxious email settings, ports, security frameworks, or awkward and obtuse hashing routines, just a single JSON-formatted HTTP call and BAM, all set: an alarm with a date, a time stamp, and a temperature, delivered right to my iPhone by Slack’s automatic notifications, so it can wake me up if need be.
So anyways, without further ado, here is the code:
from gpiozero import CPUTemperature
from time import sleep, strftime, time
import json
import requests

# Set the webhook_url to the one provided by Slack when you create the webhook
# at https://my.slack.com/services/new/incoming-webhook/
webhook_url = 'https://hooks.slack.com/services/####/#####'

cpu = CPUTemperature()

def write_temp(temp):
    # append a timestamped reading to the log
    with open("cpu_temp.csv", "a") as log:
        log.write("{0},{1}\n".format(strftime("%Y-%m-%d %H:%M:%S"), str(temp)))
    # past 90 degrees F, raise the alarm in Slack
    if temp > 90:
        slack_data = {'text': "{0},{1}\n".format(strftime("%Y-%m-%d %H:%M:%S"), str(temp))}
        response = requests.post(
            webhook_url, data=json.dumps(slack_data),
            headers={'Content-Type': 'application/json'}
        )
        if response.status_code != 200:
            raise ValueError(
                'Request to slack returned an error %s, the response is:\n%s'
                % (response.status_code, response.text)
            )

while True:
    temp = cpu.temperature
    # convert C to F, then subtract the 44-degree "calibration" constant
    temp = (9.0/5.0 * temp + 32) - 44
    write_temp(temp)
    sleep(1)
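To keep it running after I log out of the Pi, something as blunt as nohup does the job; temp_alarm.py is just my hypothetical name here for the script above:

nohup python3 temp_alarm.py &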
It has been forever since I’ve needed to program anything. Once I was done, and I saw it work the way I wanted it to, I was quite happy with myself. I haven’t felt this particular sense of accomplishment since my college years. It was quite a welcome feeling.
Walking Down Memory Lane
Some notable events from other July 1st’s
2003 – Installed a network-aware fax machine and attached it to GroupWise. My god, GroupWise. This is such a walk down memory lane! This was also the first appearance of a recurring meme: shared online mailboxes at work upset people because they aren’t “private,” in the same way a regular fax machine is “private” when you hover over it and muscle out anyone else who might try to use it. It raises the question of what you are transmitting at work that is so “private” that you shouldn’t instead be doing it at, say, a FedEx shop or Office Depot.
2003 – Toppenish, Washington was in the news because a keyword blocker at a library got upset because it found something it didn’t approve of in the text of the domain name itself. Nowadays we don’t search domains for text fragments, we actually categorize them.
2004 – Again with the fax machine. In this case, the line had no long distance service, requiring an AT&T calling card and a 60-digit dialing sequence just to send a fax far away. And the merry mix-ups when people who work for an institution of higher learning demonstrate no higher learning by being unable to comprehend digits. Ah, those were the days.
2004 – Fahrenheit 9/11. Hah, those were the days, weren’t they? When it felt like scandals were rare and maybe all the crazy conspiracy theories were just theories. Oh, the memories.
2006 – Sharing the photos of the bathroom rebuild. It was a long while ago that we tore the guts out of that bathroom and updated it.
2007 – At O’Hare, running through security, on my way to visit family in Syracuse.
2008 – Another trip to Syracuse. This time through Detroit.
2009 – The problem with the cloud is poor security and access points everywhere. What happens when someone plants incriminating evidence via a route you pay very little attention to, like junk mail, and then makes an anonymous tip about the evidence? It was an interesting consideration, and it helps reinforce how important it is to keep everything digital tidy.
2013 – I wrote a lot of things about the security threat that our very own NSA represents. Little did I know that in 2017, the tools they collected and wrote would leak out and turn into the WannaCry ransomware attack. Thanks, NSA!
2015 – Facebook notifications got an enhancement: they can accept a GPG public key, so all the Facebook notifications sent over email are encrypted. This was a really good proof-of-concept from one of the world’s biggest Internet sites; alas, it won’t ever take off, because GPG is an all-or-nothing technology, and since you aren’t going to have all, all you get is nothing. It was also the day I gave a lot more thought to The Golden Rule and started to reshape my life around it as a moral compass.
Blogo Test
One of the biggest headaches with my WordPress blog is remembering to write new posts with any frequency. It sometimes comes down to a test of many editors: which one do I like, and how smooth is the learning curve for getting my story up onto the blog? Email is a pain, mostly because the instrumentation to add all the extra bits is rather annoying, and I don’t really want to revisit and mark up a blog entry after I’ve written it. Then I looked at BBEdit, which is the big brother to TextWrangler. The folks who wrote that application provided a free trial of BBEdit and gamely informed me that TextWrangler was a dead duck. I never really engaged with BBEdit enough to think of it as a blogging tool, and TextWrangler is still pretty good for what I need.
Since I’ve had this blog for a long while, and I feel a little bad about neglecting it, perhaps it’s time to dust off the old blogging tools and hop back into it. Facebook is the Walmart of social media: it’s everywhere you look, but you feel dirty using it, because you know that every time you walk in you’re being measured and indexed and categorized.
Facebook, like Walmart, brings a ready audience to the party, people who are happy enough to just waddle through and notice what you have to write and maybe drop a like down. Blogging has always been longer-form than Facebook, and way longer than Twitter. Plus since I host my own blog on my own domain, I can write things here that I wouldn’t or can’t write in other places.
So this test will see how well this little app called Blogo works on my MacBook Pro. If it’s good, we’ll likely have more stories appear in the blog in the future.
DayOne 2.0 to Evernote Migration
Years ago I started to write a personal journal. Mostly the application I used was Microsoft Word, sometimes other text editors, and I’ve always been on the search for a better way to conduct my journaling habit. When I started using Apple’s Mac, I discovered Bloom Built’s DayOne journaling software. DayOne came highly recommended on many pro-Mac websites, so I bought in. Everything was going fine with journaling for a while, and then I noticed that the authors of DayOne were going to release DayOne 2.0. I eagerly jumped onboard with this update and forged ahead. The feature set was welcome, multiple journals, a more refined syncing experience with an online service run by the manufacturer (turned out to be AWS), and I was rather happy. The new syncing system also came with extensions for one of my favorite online time-savers, IFTTT. What more could I ask for?
Then the updates started. I noticed in my Mac App Store application that there was a listed update for DayOne 2.0, and when I clicked on it, the system acted as if there was no update present, a deviation from expected behavior that I took for some glitch or bug. So I dived into the problem and searched Google for hints. There were so many options in the Google index that I figured one of them should solve my problem. In the end, I had completely roto-rootered the entire Mac App Store. I tried one last time, and still DayOne 2.0 refused to update. I gave up, figuring it was something that maybe a reinstallation of the operating system would solve, because the behavior was unexpected and this sort of thing doesn’t happen with Apple products in my experience. Then, resistant to being beaten by a bug, I forced the issue with the App Store and tried to download the DayOne 2.0 update directly. I discovered to my chagrin that the update required the next edition of the Mac OS, El Capitan. I have a vested interest in staying with Yosemite; I’m happy with my MacBook using Yosemite, so why should I upgrade the OS to satisfy an application?
The next injury came shortly after that. While using DayOne 2.0, I was rather miserable because the software acted so downright sluggish. I would type, and the application would just pinwheel or pause, and then, in a blur, all my words would spill into the display at the same rate at which I had typed them. I wasn’t getting the instant response to keyboard actions that I was expecting. I verified that other applications behaved properly; TextWrangler, for example, behaves perfectly to my expectations, so this wasn’t a system problem, it was a DayOne 2.0 problem. Prior to this, I had sprung for a copy of Little Snitch on my Mac to help me better control my network interfaces, and Little Snitch has options to block an application from accessing the network. So, on a lark, I tested the sluggish DayOne 2.0 by blocking its network access with Little Snitch. It was like flipping a light switch! The sync component was broken, showing a red exclamation mark, but man, text entry was back to normal, and tag entry was super quick as well. I didn’t have to wait for pinwheel after pinwheel to get where I was going. I wanted to journal, to get my text entered into the system for safekeeping and remembering. So for a while, I would use Little Snitch to hobble DayOne 2.0 so that I could use the application the way I wanted to, the way I expected to. I then wrote to Bloom Built and asked if they would update the application for users who didn’t want to march forward to El Capitan or Sierra, and they declined. It was a longshot, but I figured it was in their best interest to address their application to the largest group of users, which would presumably include people on Yosemite. It wasn’t to be.
So then after thinking about it for a while, and growing weary of the rather extended procedure to get Little Snitch to help me block DayOne 2.0’s broken sync routines, I made the fateful decision to switch my journaling to Evernote. Why Evernote? Because it was on all my devices, just like DayOne 2.0 (at least Mac devices), and Evernote already had integrations with IFTTT, so that was set. Evernote was something I knew, and the Evernote syncing routines were significantly better than DayOne’s syncing routines. Saying that has to be tempered by the fact that sometimes Evernote’s syncing routines also break, but the one-off hard-to-diagnose sync error is better than a broken sync routine that throws pinwheels when you type text or try to enter tags, as it is with DayOne 2.0. Evernote also has one extra feature, which wasn’t a part of the decision but now that I’ve made the switch, I’m glad for, and that is you can highlight text in Evernote and encrypt it using AES. This is something that DayOne 2.0 had as a promise, but they were by all appearances dragging their heels when it came to journal security.
I then started writing all my new journal entries in Evernote. That was very straightforward. However, I had left about 11,000 entries behind in DayOne 2.0, so I started looking at ways to get that data out. There are a few export options: text, PDF, HTML, or JSON. So I started extracting entries out of my DayOne 2.0 journal and trying to import them into Evernote. What I wanted was for individual entries to move over to Evernote and remain individual entries there. Instead, everything that comes out of the DayOne 2.0 exporter comes out in monolithic chunks: one big HTML file, one big PDF file, one big JSON file, one big text file. There is no easy way to get individual entries out one at a time unless you want to manually slog through every single entry, and at 11,000 entries, that wasn’t going to happen. I have no patience for that. So I started looking at ways to hack apart my DayOne 2.0 exports, since the people who wrote DayOne 2.0 didn’t provide anything helpful, and all the other tools I found online were written solely for DayOne 1.0, which I couldn’t use; I didn’t have a Journal.dayone file, I had an AWS-hosted JSON chunk. So the hackathon commenced. HTML was a giant headache, since there isn’t any way to easily split HTML up into chunks, syntactically speaking, at least not with the data that DayOne 2.0 exports. The PDF was a mess: one immense PDF with the text in 8-point. It’d be fine if I were 20 years old and didn’t mind slogging through a monolithic PDF file for a date. I even tried to hack around the JSON in my limited way. I got the JSON out to CSV, but then realized that my instinct, to make the CSV a data source for a mail merge and mail-merge my journal out to individual entries, was going to be a bust. Macs don’t do mail merge at all. I made peace with that a long while ago, not that I ever had any work that needed mail merge. So there was only one format left, the most basic: text.
DayOne 2.0 spits out a journal as one monolithic text export file, so I had to figure out how to hack this text file into pieces. I spent a long while in the bash terminal screwing around with csplit, discovering the subtle differences between Apple’s implementation of csplit and GNU’s. After a night of blind hacking, I gave up on csplit. Of course, by this time I had also given up on DayOne 2.0; it wasn’t the application I wanted anymore. My feelings had soured against the manufacturer for only going so far with their export code and leaving the rest for me to hack out on my own. I was irritated and felt cheated that they didn’t go one step further and include an “export individual entries” checkbox somewhere. But I got over my funk; I burned that bridge, and there was no reason to keep complaining about it. I was moving to Evernote, and Bloom Built was pretty much post-fire, all sad ashes soaked with water. Nights of searching and hacking on this monolithic text file eventually turned up the solution. The first step comes with Perl:
#!/usr/bin/perl
# slurp the entire export file in one go
undef $/;
$_ = <>;
$n = 0;
# split on the "Date:<tab>" key that begins every entry,
# writing each chunk out to its own tempN file
for $match (split(/Date:\t/)) {
    open(O, '>temp' . ++$n);
    print O $match;
    close(O);
}
This little script is something I found through Google; I’m far too lazy to hack it out on my own, if I’m brutally honest. The key marker for DayOne 2.0 entries in this monolithic text file is “Date:” followed by a tab character; every entry starts with it. So, export my DayOne 2.0 journal to Journal.txt, and then run this script against it: ./split.pl Journal.txt. Perl tears the file into perfect chunks, ready for action. But the files are named temp1, temp2, temp3, and so forth. Two more lines add the crowning bits to each file: the first tacks on a txt extension, and the second grabs the first line of each file, which in DayOne 2.0 is the date line, and makes that line the new filename. So now my entries have their dates as their filenames. This is just a compromise; I would have much preferred to preserve the dates in the file metadata, but hey, you get what you get:
for f in temp*; do mv "$f" "$f.txt"; done
for f in temp*; do mv "$f" "$(head -n 1 "$f").txt"; done
So for my test journal, I exported from DayOne 2.0 into Text, chopped it all up using Perl, and used the bash shell to hack the names to where I was happy. Then lasso the entire batch of files and drag them into Evernote. Once I had this done for all my journals, I closed DayOne 2.0 and left it alone. There is no point in trashing it, let it dwell on in a ghostly non-life for all I care. Evernote at least has proper behavior when it comes to text entry, tag entry, and the syncing routines are better. Plus Evernote will never abandon me the way Bloom Built did. They’ll never stop updating Evernote for Yosemite, or if they do, it’ll be so far down the pike that I get a new laptop and all of this is just so much foolish wrangling anyways.
In the end, I won. I won against an annoying choice made by a company I used to love; I won against a file format that seems so dumb; and I was able to shoehorn years of journaling into Evernote without having to abandon my past or make it so annoyingly inaccessible that it would be the same as abandoning it.
If you find an interest in switching from DayOne 2.0 to Evernote, this is one way to do it. There may be better ways, clever ways to convert the JSON to the Evernote import file format, perhaps. But I didn’t care enough to slog through JSON, this is my way, in all its dumb glory. Of course, my tags in DayOne 2.0 are shot, and the tagging in Evernote is a manual affair, so that was another little compromise. Perhaps if I have dull weekends or evenings, I can hack out the tags over time. Having the entries and losing the tags is an acceptable loss. At least I no longer need to force Little Snitch to break DayOne 2.0 so I can use it. Heh, that’s still something that makes me shake my head in disbelief. That you have to do it this way is such a mess.