Haiku Autosuggest: Keeley

A while ago I was laughing about the sort of silly output that you can expect from Keyboard Autosuggest on iPhones when you give it a subject word to start with. This turned into a free-range idea and I left it alone for a while.

Several nights ago, waking from a dream I no longer recollect, it occurred to me quite suddenly that I could merge Keyboard Autosuggest with haiku, the Japanese 5-7-5 poetry form.

Obviously this is gobbledegook, but I suppose on some level it is rather funny.

Haiku Autosuggest: Keeley

Keeley thinks it will / take me about an hour more / than a week ago

Going West With Facebook

Much like the elves in Tolkien’s tales, sometimes the time is right to board the boats and head west. In this particular case, the question is what to do with Facebook.

I’ve been using Facebook since July 2nd, 2008. In the beginning it was wonderful: the sharing felt genuine, everyone seemed kinder and more conscientious, and I suppose the world was better back then. Many people were looking for a new platform after LiveJournal collapsed, which, if we are really serious about it, happened when SixApart was sold to the Russians. Americans fled pretty much after that. And so, Facebook was a thing.

It was mostly friends; the service hadn’t taken off yet. Many of the later iterations that make Facebook what it is today hadn’t even been thought of back then, and in a lot of ways it was better in the past. But then everyone started to join, and we began to learn about the ramifications and consequences of using Facebook. I can remember the feeling of betrayal when my Facebook posts were printed out and handed to my workplace management. That really was my first lesson in privacy, and the beginning of the end of my involvement with Facebook.

Facebook and I have been on-again, off-again for a while. In time I realized that I was addicted to the service and the sharing. With more time I realized that Facebook fit the profile of a mental illness more than an addiction. I had to stop, because in a very big way, it was the service or my mental health.

So fleeing Facebook is the name of the game. First I downloaded all my content from the service, then I moved the saved links from Facebook to Pocket for safekeeping. Then I went through and started hacking away at groups, pages, and apps. All of these tasks are long-tailed; they will take a while to polish off, because Facebook’s tentacles run deep, and just how deep they actually go is remarkable.

So now I’m looking at writing more and sharing more from my Blog. This post is kind of a waypoint to this end. I installed a new theme with some new images featured, and the next step is to figure out a “Members Only” area where I can separate out the public from my friends. There are some items that I intend to write about that use specific names and I don’t want to play the pronoun game with my readers. I also don’t want hurt feelings or C&D notices, both of which some of my writing has created in the past.

I will detail my journey of disposing of Facebook here on this blog. I have turned off automatic publicizing to Twitter and Facebook, but I left G+ on, because G+ is a desert.

So, here we go!

Extracting Cisco Unity 10.5 Voicemails

In my work, I wear many hats. Amongst these is VOIP Manager. It’s not really a job, or even a position, but it fits neatly under the heading of IT Manager, which is my actual title. I oversee the company’s Cisco CallManager and Unity systems.

Occasionally, coworkers of mine leave employment and leave behind voicemails in their Unity mailboxes. I have been searching for a long while for a convenient method to extract these voicemails out of Unity into a format that could be easily moved around, so that other people could listen to the recordings and get somewhere with them.

I’ve tried a lot of options and run endless Google searches. I eventually discovered a rather involved method to acquire these messages, one I would categorize as “bloody hell,” because it involves a lot of questionable hacking to procure the audio files.

The hack begins with the Cisco Disaster Recovery System, known as DRS. If you have Unity and CallManager set up, like I do, you have probably already established DRS and pointed it somewhere your backups can live. In my case, DRS points to a share on my primary file server. That is where you start: make sure DRS is running and that it has generated good backups. This method essentially backdoors the backup system to get at the recordings that Unity takes.

In my Unity folder, I have two days’ worth of backups, and the files you need specifically are 2018-02-19-20-00-07_CUXN01_drfComponent.xml and 2018-02-19-20-00-07_CUXN01_CONNECTION_MESSAGES_UNITYMBXDB1_MESSAGES.tar. Your filenames may be slightly different, depending on what you named your Unity system. When I found these files, I didn’t think anything of the XML file, but the tar file attracted my notice. I copied it to my MacBook and attempted to unpack it with bsdtar. It blew up. As it turns out, Cisco made a fundamental change to DRS after Unity 7: they started encrypting the tar files with a randomized key derived from the Cluster Security Password. My cluster is very simple, just Unity and CallManager, and I suppose also Jabber, but Jabber is worthless and I often forget it exists. And of course it wouldn’t be .tar.enc, no, just .tar, which confuses bystanders. That is pretty much the way of things at Cisco, I’ve grown to appreciate.
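
If you end up doing this more than once, a tiny script can at least find the newest message archive in the backup share for you. This is a hypothetical helper of my own, not a Cisco tool; the share path is made up, and the glob pattern just assumes filenames shaped like the ones above.

# find_messages_tar.py -- hypothetical helper, not part of Cisco DRS.
# Locates the newest Unity CONNECTION MESSAGES archive in a backup share.
from pathlib import Path

BACKUP_SHARE = Path("/Volumes/backups/unity")  # assumption: your DRS share

def newest_messages_tar(share):
    candidates = sorted(
        share.glob("*_CONNECTION_MESSAGES_*_MESSAGES.tar"),
        key=lambda p: p.stat().st_mtime,  # sort by modification time
    )
    if not candidates:
        raise FileNotFoundError("no DRS message archives in {0}".format(share))
    return candidates[-1]  # the most recent backup

if __name__ == "__main__":
    print(newest_messages_tar(BACKUP_SHARE))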

The next tool you need comes from a site called ADHD Tech. Look for their DRS Backup Decrypter. It’s a standalone Windows app, and you need it to scan the backup and extract the unencrypted tar data.

The next utility you will need is the DRS Message Fisher. Download that as well. I will say that this app has some rough edges, and one of them is that you absolutely have to run it in Administrator mode, otherwise it won’t function properly.

Start the DRS Message Fisher, select the decrypted tar file that holds your message archive, and sort by user alias. Click the right one, and it will present you with a list of all the voicemails that user has in the backup set. You would imagine that selecting all the messages would extract each voicemail to its own file, but that is not how this application behaves. In my experience you really should extract one message at a time, because the app dumps its saving folder after every request and cannot understand multiple selections, even though it lets you make them. It is also eight years old, so the fact that it functions at all is a miracle.

You start at the top, click the first message, and then “Extract Message Locally,” which should open a window and show you the resulting WAV file. I learned that without Administrator mode you never get that folder; it just opens your Documents folder and does nothing constructive. In case you need help finding it, look here:

C:\Program Files (x86)\Cisco Systems\DRS Message Fisher\TEMPMsgs

With the app in Administrator mode and a message selected, click the button mentioned above. This will open the TEMPMsgs folder and show you the WAV file. Click and drag the file anywhere else to actually save it, then advance to the next message and extract, and so on until you have them all. There won’t be any useful data in the filename, just a UUID, so I suppose we should be happy we are getting the audio at all and count our blessings.
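
Since the filenames are bare UUIDs, one small courtesy you can do yourself afterwards is to stamp each saved WAV with its modification time so the files stay distinguishable. A quick sketch, assuming you have dragged the files into a folder of your own (the folder name here is invented):

# rename_wavs.py -- hypothetical cleanup step, not part of DRS Message Fisher.
# Prefixes each UUID-named WAV with its modification timestamp.
import time
from pathlib import Path

SAVED_WAVS = Path.home() / "extracted_voicemails"  # assumption: wherever you saved them

for wav in sorted(SAVED_WAVS.glob("*.wav")):
    stamp = time.strftime("%Y%m%d-%H%M%S", time.localtime(wav.stat().st_mtime))
    wav.rename(wav.with_name("{0}_{1}".format(stamp, wav.name)))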

Once you have all the WAV files you need, then you can dump the voicemail account and move on.

What a mess. I look at Cisco and marvel at the configurability of both CallManager and Unity, then watch them trip embarrassingly hard on things like this. Apparently nobody ever cared enough to address voicemail survivability and extraction, and as far as I know, this overwrought and wretched procedure is what it takes to meet that particular need. It goes without saying that this is wholly and completely unsupported by Cisco TAC, so save yourself the headache of running full-speed into that bulkhead.

In many ways, this solution, if you can call it that, is voicemail survivability by dumpster diving. Ah well, it’s Cisco; complaining is very much like going down to the river and screaming at it to change course. You’d probably get further with the river.

Percentile Taxation and Citizenship

The media is awash in talk about socialized healthcare, taxation, and immigration. I do not claim to be an expert in any of this, and something like this probably would not work out, but on a lark I started casting about, brainstorming how I might solve the taxation problem and the issues surrounding citizenship.

The opening gambit is socialized healthcare, also known as Single Payer; let’s just call it healthcare moving forward, as a shortcut for what we’re really talking about. The next series of moves plays out like a chess game, with the different pieces being different clichéd arguments:

  • We don’t want the poorest to suffer and die, it doesn’t conform to the moral standard of the three faiths, so we must act.
  • It is very expensive for some, and thoughtlessly free for others. The old are expensive, the young are not, and there are outliers everywhere.
  • The government is already up to its neck in debt, how can we saddle ourselves with more?
  • Nobody can escape any of the above points; otherwise, they appear to be hypocrites.

The challenge of healthcare is how to pay for it. Healthcare is rather expensive at the start, but in the long term it is actually cheaper than what we have right now. How can we afford such a thing as a society while keeping the other services we have come to expect from our government? The best answer, and the most common one, is to reformulate taxation.

Taxation

It seems as if the tax code is the most complicated subject in all of government. We keep making attempts to address what is fair and just, and depending on the political winds, the answer changes from generation to generation. I am not going to make any claims of practicality; this is brainstorming, not policy.

There have been many plans over the years: flat taxes, graduated taxes, and economic theories such as the trickle-down economics that has featured for as long as I have been alive, since 1975. This plan is just another possibility, and I don’t know whether it would actually work out, but it was the first thing I thought of, and it launched this blog post.

How about a taxation plan based on percentiles? Take all citizens who are not disabled and list out their incomes, data the IRS already has. Then order everyone from smallest income to largest. We dispense with all the tax loopholes and resolve to simplify everything down to raw income. For businesses we do the same, ranking profits from smallest to largest. From there, we calculate the percentile rank across the entire gamut for both classes of entity, people and corporations. Those at the bottom pay next to no tax, while those at the very top pay nearly all of it. The percentile rank makes calculating where you sit rather easy: the IRS can compute the value and send out a postcard letting you know. Since there are no more loopholes, there is no more need for complicated forms and instructions. Withholding is done by employers, the IRS settles all accounts, and every April 15th you either get a bill or a check.
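
To make the arithmetic concrete, here is a toy sketch of the mechanics. The incomes are invented, and the mapping from percentile to rate (a straight line topping out at 45%) is an arbitrary illustration, not a proposal.

# percentile_tax.py -- toy sketch of percentile-based taxation.
def tax_rate(percentile):
    # Arbitrary illustration: the rate climbs linearly from 0% to 45%.
    return 0.45 * (percentile / 100.0)

def assess(incomes):
    ordered = sorted(incomes)
    n = len(ordered)
    bills = []
    for income in incomes:
        # Percentile rank: the share of filers earning strictly less than you.
        below = sum(1 for x in ordered if x < income)
        pct = 100.0 * below / n
        bills.append((income, pct, income * tax_rate(pct)))
    return bills

for income, pct, tax in assess([18000, 42000, 95000, 310000, 4000000]):
    print("income ${0:>10,}  percentile {1:5.1f}  tax ${2:>12,.2f}".format(income, pct, tax))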

Everything about taxation is wrapped up in politics, and because of that there are counter-arguments for and against any sort of change. The most common retort to a change like this one, where the rich would have to pay an exceptionally high tax, is that they would simply leave the country to avoid it. So then we come to the next section…

Citizenship

We are all citizens of the United States of America. Many of us acquired that citizenship by nativity; we didn’t do anything to earn or deserve it other than having the luck to be born in the right place at the right time. Currently, citizenship and immigration are hot-button issues. Many people want to come to the United States, and so over time we have started to reduce and control immigration to our country. Very recently, I have noticed a rather unpleasant nationalistic nativism adding new discrimination to this process. We aren’t holding the lamp beside the golden door so much as searching a line-up to cherry-pick the very best to join our country.

Citizenship provides rights, privileges, and abilities that non-citizens may not have. We have started to covet this citizenship both economically and culturally; it is something we have, and something we want to keep to ourselves. What lies at the heart of citizenship? We are all part of a greater whole, the dream of America, and we get immense benefits from that, so we must meet the cost by paying taxes. Taxes pay for citizenship and civilization. If you want to play, you have to pay. And if you are willing to pay, you get to play.

Those who wish to immigrate to the United States should be willing to agree to our taxation, and those who do not agree should be excluded from citizenship. They are unwilling to pay for it, so why should they take advantage of it? When it comes to the Dreamers, they are all paying taxes, so they can be citizens. If someone exceptionally rich doesn’t want to pay taxes, they can abdicate their responsibilities to society at the cost of their citizenship. They can, of course, re-acquire that citizenship as easily as anyone else: by agreeing to pay taxes based on their income.

Final Thoughts

I don’t really suppose any of this would be actually practical, but amidst all the arguments currently being discussed, why not at least touch on these ideas? In the current political climate, there is an exceptional number of interested parties, and the quality of discourse is more varied than it has ever been before. I’m sure if anyone reads this post, they will have strong responses, and I welcome the commentary, but I reserve the right not to respond if there is no point to it. As I said before, this is not policy, this is brainstorming. Please keep that in mind if you are upset.

Social Media Tent Flapping

I seem to vacillate between social media platforms these days. Since the collapse of my beloved Imzy, I’ve been lurking about Facebook for a while. Facebook is rather unpleasant to use, mostly because the commentary is so awful. The only people to blame are the folks on the platform, who aren’t really interested in communication, just trolling. So I’ve been looking back at Google Plus, and while the posts still flow there and things are more intellectual, there’s no audience.

Which brings me to my blog. I damn near forgot it existed, and then I discovered that it had been down for probably the last five months because of file permission errors on the host that I use, iPage.com. Once I was able to correct the issue, the blog came back, and with it some of the tools I use, like Blogo, which lets me write these posts in a convenient manner.

I also admit that moving to the Bear app has got me writing again in my journal, which I think is a really good thing. It appears that it may have spilled over into more activity for my blog. So if I’m paying for this hosting, I might as well use it.

I’d like to say there will be a steady stream of new articles. HAHAHAHAHAHAHAH. We’ll see about that. Maybe, maybe not.

Journal Migration

Just as I had migrated from Day One to Evernote, I got really tired of Evernote and its bloated sluggishness. So I moved my journal again, this time to the Bear app. The application is really quite useful and a joy to use on my MacBook. I tried to sync my journal to my iOS devices, but I had less luck with that. I am, however, getting top-notch support from the people who write the app, so for that it’s working out really well. I can use the platform with hope of an app fix for what ails my journal, for as long as that hope lasts.

One of the most compelling parts of the Bear app is its tagging system. It’s almost the perfect thing, but as I once wrote to Evernote in an enhancement request, I would like tags to optionally be indexes as well. What I mean is that when I make a tag, there would be an optional checkbox or slider for making it an index entry. Then, when I create a new index entry, the software would scan the content of my journal for that tag and, wherever it finds the text, add the tag to the entry (see the sketch below). So far, I haven’t found any apps that do that and sync across devices. But so far, Bear is nice to work with. If you are interested in seeing what it is like, you can get it for free from the Mac App Store.
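
To make the wish concrete, here is a rough sketch of the behavior I mean, written against plain text files rather than any real Bear or Evernote API; the folder layout, the .txt extension, and the #tag syntax are all assumptions for illustration.

# index_tags.py -- sketch of the "tag as index" feature I keep wishing for.
from pathlib import Path

NOTES_DIR = Path.home() / "journal"   # assumption: one note per .txt file
INDEX_TAGS = {"boston", "family"}     # tags the user marked as index entries

def apply_index_tags(note):
    text = note.read_text(encoding="utf-8")
    for tag in INDEX_TAGS:
        # If the note mentions the word but lacks the tag, append the tag.
        if tag.lower() in text.lower() and "#" + tag not in text:
            text += "\n#" + tag
    note.write_text(text, encoding="utf-8")

for note in NOTES_DIR.glob("*.txt"):
    apply_index_tags(note)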

Giving Chrome Some Pep

I’ve been using Google Chrome on my MacBook Pro for a long while, and I’ve noticed that some websites take some time to get moving along. In some ways, it feels like the browser is panting, trying to catch its breath. So today, while trying to solve a work problem, I accidentally stumbled over a neat way to give my Chrome browser a little boost in performance. It seems to help most on interactive sites, like my work help desk or PNC online banking.

The trick is: create a small RAM drive on the system, copy the Chrome profile over, symlink to that profile so Chrome can find it, and then start using Chrome. As Chrome works, things like settings and cache data go to RAM instead of the HD on my MacBook Pro. I then use rsync to copy the data into a backup folder, just in case my MacBook Pro suffers a kernel panic or anything else that would dump the RAM drive.

There are a few pieces to this, mostly from scripts I copied off the network.

I copied the script called mount_tmp.sh and made only a few small adjustments, specifically changing the maximum RAM drive size to 512MB.

Then I created two bash scripts: one to check the profile in to the RAM drive, and one to check it back out to the HD. Since I wrote them from scratch, here they are:

check-in.sh


#!/bin/bash
# Mount the RAM drive, move the Chrome profile onto it, leave a symlink behind,
# and seed the on-disk backup copy.
/Users/andy/mount_tmp.sh
mv /Users/andy/Library/Application\ Support/Google/Chrome/Default ~/tmp
ln -s /Users/andy/tmp/Default /Users/andy/Library/Application\ Support/Google/Chrome/Default
rsync -avp --delete /Users/andy/tmp/Default /Users/andy/Library/Application\ Support/Google/Chrome/Default_BACKUP
echo "Complete."

check-out.sh


#!/bin/bash
# Refresh the on-disk backup, remove the symlink, move the profile back to the HD,
# and unmount the RAM drive.
rsync -avp --delete /Users/andy/tmp/Default /Users/andy/Library/Application\ Support/Google/Chrome/Default_BACKUP
rm /Users/andy/Library/Application\ Support/Google/Chrome/Default
mv /Users/andy/tmp/Default /Users/andy/Library/Application\ Support/Google/Chrome
/Users/andy/mount_tmp.sh umount
echo "Complete."

If you give this a shot as well, I would love to hear from you about your experiences with this little speed improvement hack! Hope you enjoy!

Moment of Geek: Raspberry Pi as Thermal Canary

A few days ago I ran into a problem at work. The small Mitsubishi air conditioner had decided to take a cooling nap in the middle of the day, so my office, which is also the machine room at work, was up around 85 degrees Fahrenheit. I was used to this sort of thing, summers bringing primary cooling systems to their knees, but this time I had a huge A/C unit in the ceiling that I had elected to leave in place, just in case. So I turned it on, set its thermal controller to 70 degrees, and the room temperature tumbled in about ten minutes. Right after the room temperature was back to normal, and I had service out to visit me about the little wall-mounted unit, the damn thing started functioning normally again. The tables had turned on IT: this is exactly what happens to our users. They sit there and struggle, then we arrive, and the machines behave as if nothing at all was wrong.

So I had the big A/C and its smaller wall-mounted sibling both running overnight, and I faced a problem: I wanted to know the temperature in my machine room without having to buy a TempPageR device. I had one long ago, and it was rather expensive. Then I looked at my desk and noticed my Raspberry Pi, just sitting there, doing nothing of consequence. I knew from a brief, cursory Google search that the Raspberry Pi had a CPU temperature interface hidden somewhere, and I was happily surprised to find a website detailing how to use this exact feature from the Python programming language to write a temperature log, and optionally graph it. What followed was mostly copypasta: adapting things I found online by copy and paste, and hammering them here and there until they worked. I have programming skills, but they are rather dated and rusty, and I had never used Python specifically. Still, my first effort was successful: I got a 1-second temperature logger in place. I was rather happily satisfied, but I knew I would not be happy with Celsius, and I knew the reading was colored by the CPU in the Raspberry Pi itself, so the reported temperature was quite a bit higher than the room temperature.

So I started to tinker. First I searched for the equation to convert Celsius to Fahrenheit, and got my answer: 115 degrees. Then I turned on the big A/C unit, whose thermal controller displays the ambient room temperature in Fahrenheit: 74 degrees. So I did some math and subtracted a constant 44 degrees from the CPU temperature, which “calibrated” it to a rough approximation of the room temperature. Some eagle-eyed readers may notice that my math is off, but after I moved the Pi over to the server stack, I had to adjust for a higher CPU temperature, since it sits farther from the wall A/C unit. So now I had a 1-second temperature logger reporting in Fahrenheit. Then I turned on graphing, and the entire program crashed and burned; I wasn’t running the application in an X Windows environment, so I tore out the graphing library and code, since I was never going to use that feature anyway.

That, of course, was not enough to replace the TempPageR device. I needed an alarm system to alert me to what was going on. I thought through the possible interfaces, email, SMS, iMessage, email-to-telephone-call cleverness, and each one brought me up against a different version of the cliffs of insanity. I could probably have smashed and hacked my way to a solution involving some ghastly labyrinth of security settings, with passwords hashed by special algorithms that are only available on ENIAC simulators that only run on virtualized Intel 8086 processors with the Slovenian language pack loaded and the Cyrillic character set; in short, an epic pain in the ass. But earlier in the day, I had tripped over an app advertisement for Slack accepting incoming data from the Pingometer website. I have a Pingometer account, a free one, because I’m a cheap bastard; its single pinger externally checks my fiber optic connection at work, keeping AT&T on their toes when it comes to outages. Pingometer talks to Slack using incoming webhooks. An incoming Slack webhook is a really simple web call: the source wraps JSON in an HTTP request and sends it to Slack’s servers, and Slack does everything needed to make sure the message is pretty and ends up on the right channel, on the right team. This was my alert mechanism.

So I did another Google search, found the intersection between Linux, Python, and Slack, and after some more copypasta and tinkering I had a Python app that displayed the room temperature in degrees F and made my Slack a noisy mess, since it was sending an incoming webhook request every second. One more tweak, a super-simple IF-THEN block, set my high-temperature mark at 90 degrees F, and I let it go.

There is something satisfying about being able to hack something together, cobble it really, and have it work without blowing up the terminal, blowing up Slack, or otherwise failing. So now I have a $35 Raspberry Pi running as a rough temperature alarm; it sends alerts that let me and my system admin know at the same time over Slack. I’m quite happy with how it all worked out. No obnoxious email settings, ports, security frameworks, or awkward and obtuse hashing routines; just a single JSON-formatted HTTP call and BAM, all set. An alarm with a date stamp, a time stamp, and a temperature, delivered right to my iPhone via Slack notifications, so it can wake me up if I need it to.

So anyways, without further ado, here is the code:


from gpiozero import CPUTemperature
from time import sleep, strftime
import json
import requests

# Set webhook_url to the one provided by Slack when you create the webhook
# at https://my.slack.com/services/new/incoming-webhook/
webhook_url = 'https://hooks.slack.com/services/####/#####'

cpu = CPUTemperature()

def write_temp(temp):
    # Append a timestamped reading to the log file.
    with open("cpu_temp.csv", "a") as log:
        log.write("{0},{1}\n".format(strftime("%Y-%m-%d %H:%M:%S"), str(temp)))
    # Past 90 degrees F, post the same line to Slack as an alert.
    if temp > 90:
        slack_data = {'text': "{0},{1}\n".format(strftime("%Y-%m-%d %H:%M:%S"), str(temp))}
        response = requests.post(
            webhook_url, data=json.dumps(slack_data),
            headers={'Content-Type': 'application/json'}
        )
        if response.status_code != 200:
            raise ValueError(
                'Request to slack returned an error %s, the response is:\n%s'
                % (response.status_code, response.text)
            )

while True:
    temp = cpu.temperature
    # Convert Celsius to Fahrenheit, then subtract the 44-degree offset that
    # roughly calibrates the CPU reading to room temperature.
    temp = (9.0 / 5.0 * temp + 32) - 44
    write_temp(temp)
    sleep(1)


It has been forever since I’ve needed to program anything. Once I was done, and I saw it work the way I wanted it to, I was quite happy with myself. I haven’t felt this particular sense of accomplishment since my college years. It was quite a welcome feeling.