Vexatious Microsoft

Microsoft never ceases to bring the SMH. Today I attempted to update a driver for a Canon 6055 copier here at the office. The driver I had was a dead duck, so I went out to get the “handy dandy UFR II driver”. I downloaded it, noted that it was for 64-bit Windows Server 2012 R2, and selected it. Then I went to save it, and this is the error that greeted me:

“Printer Properties – Printer settings could not be saved. This operation is not supported.”

So, what the hell does this mean? Suddenly the best and brightest that Microsoft has to offer cannot save printer settings, and saving printer settings is an operation that is not supported. Now step back and think about that for a second: saving your settings is not supported.

The error is not wrong, but it is massively misleading. It doesn’t come from the print driver system but rather from the print sharing system, and the fact that there is no indication of that is just salt in the wound. What’s the fix? You have to unshare the printer on the server, then update the driver, then reshare the printer. The path is quick: uncheck the sharing option on the neighboring tab, go back, set your new driver, then turn sharing back on. It’s an easy fix, but because the error is so poorly worded, you don’t know where to go to address it. A more elegant system would either tell you to disable sharing before changing drivers or, since you are already sharing and trying to install a new driver, programmatically unshare, save the driver, then reshare, hiding all of that from the administrator, as you do. That’s not what Microsoft does; Microsoft gives you awkward, poorly stated errors that lead you on a wild goose chase.

But now I know, so that’s half the battle right there. Dumb, Microsoft. So Dumb.

Network Monitoring

I’m in the middle of a rather protracted evaluation of network infrastructure monitoring software. I’ve started looking at Paessler’s PRTG and SolarWinds’ Orion, and in January I’ll be looking at Ipswitch’s products.

I also started looking at Nagios and Cacti. That’s where the fun-house mirrors start. The first big hurdle is cost vs. no cost. The commercial products mentioned above are rather pricey, while Nagios and Cacti are GPL-licensed, open source, and principally available at no cost.

The PRTG evaluation was engaging; however, I ran into one of the first catch-22s of network monitoring software: Symantec Endpoint Protection considers network scanning provocative, so the uneducated SEP client blocks the poller because it believes it to be a network scanner. I also ran into a bit of a headache with PRTG’s web client, which didn’t register changes as I expected. One of the things I have come to understand about the commercial network products is that each one appears to have a custom approach to licensing. PRTG is licensed by individual sensor, Orion by buckets, and I can’t readily recall Ipswitch’s design, but I think it was based on nodes.

Many of these vendors seem to throw darts at the wall with their products, sometimes hitting and sometimes missing. PRTG was okay but created a bumper crop of useless alarms, SolarWinds Orion has an exceptionally annoying network discovery routine, and I haven’t uncorked Ipswitch’s product yet.

I don’t know if I want to pay for this sort of product. It also seems to be one of those arrangements where, if I bite on a particular product, I’ll be on a per-year budget treadmill for as long as I use it, unless I go with the no-cost options.

This project may launch a new blog series, or not, depending on how things turn out. Looking online didn’t pan out very well. There is something of a religious holy war surrounding these products: some people champion the GPL products; other people push whatever solution they went with when they first decided on a product. It’s funny, but now that I care about the network, I’m coming to the party rather late. At least I don’t have to worry about the hot slag of “alpha revision software,” and much of the provider space seems quite mature.

If you work in the IT industry, I would really like you to comment with your thoughts and feelings about this category, especially if you have any recommendations or experiences. I’m keenly aware of what I call “show-stopper” issues.

Archiving and Learning New Things

As a part of the computing overhaul at my company, each workstation that we overhauled had its user profile extracted. The profile contains documents, downloaded files, anything on the Desktop, that sort of information. There never really was any centralized storage until I brought it to life later on, so many of these profiles are rather heavy with user data, ranging all the way up to about 144 gigabytes each. This user data primarily serves as a backup, so while it’s not essential to the operation of the company, I want to keep as much as I can in long-term storage and compress it as much as possible.

The process started with setting up an Ubuntu server on my new VMware host and giving it a lot of RAM to use. Once the Ubuntu server was established, which on its own took a whole five minutes to install, I found a version of the self-professed “best compression software around,” 7zip, and got it installed on the virtual Ubuntu server. Then I did some light reading on 7zip, and the general rule of thumb appears to be “throw as much as you can at it and it will compress better,” so I maxed out the application with word size, dictionary size, the works. Then I started compressing folders containing all the profile data that I had backed up earlier. Throwing 144 gigabytes of data at 7zip when it’s maxed out takes a really long time. Then I noticed the older VMware cluster, realized that nothing was running on it, and for its swan song set up another Ubuntu server with the same settings as the first one and pressed that into service as well.
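For reference, this is roughly the kind of invocation I ended up with; the dictionary and word sizes below are placeholders from memory, and the archive name and source path are just examples, so adjust everything to whatever your RAM and layout will tolerate:

  # install the 7zip command-line tools on Ubuntu
  sudo apt-get install p7zip-full

  # maximum-effort LZMA2 compression: -mx=9 is the top preset,
  # -md is the dictionary size and -mfb the word size, both cranked up,
  # and -ms=on keeps the archive solid for better ratios
  7z a -t7z -m0=lzma2 -mx=9 -md=1024m -mfb=273 -ms=on profiles.7z /backup/profiles/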

I then thought about getting a notification on my phone when the compression routine was done, but by the time I had thought about it, I had already started the 7zip compressor on both servers, and both were far enough along that I didn’t want to cancel either operation and lose the progress I had made compressing all these user profiles. I am not a Bash shell expert, so it took a little digging around to find that there already is a way to temporarily freeze an application and queue more commands behind it, so that when the first application completes, the next one goes immediately into operation. You press Control-Z, which freezes the application, and then issue the command “bg %1 ; wait %1 ; extra command”. Then I thought about how I’d like to be notified and dug around for some sort of email method. None of the servers I put together had anything at all in the way of an email server, and I really wasn’t keen on screwing around with postfix or sendmail. I discovered a utility called ssmtp, which did the trick. Once I configured it for use with my workplace Office 365 account and did some testing, I had just the thing I was looking for. I suspended the compression on both servers and queued the email utility to run as soon as it finishes. When the compression is done, I will be emailed.
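Here is a rough sketch of the whole trick; the addresses and password are placeholders, and the ssmtp settings are the ones that generally work for an Office 365 relay:

  # /etc/ssmtp/ssmtp.conf (Office 365 relay; user and password are examples)
  mailhub=smtp.office365.com:587
  AuthUser=me@example.com
  AuthPass=app-password-here
  UseSTARTTLS=YES

  # with the 7z job already running in the foreground:
  # press Ctrl-Z to suspend it, then resume it in the background,
  # wait for it to finish, and fire off the notification email
  bg %1 ; wait %1 ; printf "To: me@example.com\nSubject: compression finished\n\nAll done.\n" | ssmtp me@example.com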

All in all, quite nifty, and it only took a few minutes to set up. Once I’m done with this particular task, I can eliminate the “junky” Ubuntu server on the old VMware host altogether and trim back the Ubuntu server running on my new VMware host. I quite love Ubuntu. It’s quick and easy: set up what you want, tear it down when you don’t need it anymore, or put the VMware guest on ice as an appliance until you need it again sometime later. Very handy. Not having to worry about paying for it or licensing it is about as refreshing as it gets. I just need something to work a temporary job, not a permanent solution. Although, considering how much malware is out there, the difficulty Linux poses for end users may eventually be outweighed by the remarkable safety of using Linux as a primary workstation operating system. There is still a long while before Linux is ready for end-user primetime. I sometimes wonder what it will take for the endless vulnerabilities of Windows to break Microsoft. Hope springs eternal!

Trials

A major Fortune 500 company has a world-renowned hiring trial for its new IT staff. There are all the usual steps, the resumes, the interviews, but there is also a fully funded practical trial as part of the job application process. The job itself is cherry: practically autonomous, with real challenges and true financial backing, so the winner can dig in and achieve serious results.

The trial is rather straightforward: given a property address, you must approach, perform an intake procedure to discover what is required, and then plan and execute whatever is needed to solve the IT need.

The property has one person, a newly hired young woman sitting at a central desk on the ground floor. She has a folder and, within it, a script that she reads to each candidate:

“Welcome to your trial, this building has everything required to run a branch of our company. Every computer, networking component, and server component is placed and wired properly. Your task is to configure all the equipment throughout the branch properly. You will find all the resources you need to complete this task within the building. You have one week to complete this task. Good Luck.”

The young woman then folds her hands together and waits.

Several candidates engage with the trial, hoping to land the cherry job, and all of them have learned about the young lady at the reception desk. They pass all the requirements, and they eagerly arrive to try their hand at the trial. They impatiently sit through her canned speech and quickly head off to the basement to start in the server room.

Candidates come and go; some pass and some fail. The trial is to get the branch fully operational. On the last day of the week the branch becomes staffed, and the candidate must ensure that all the preparations are in place and that everyone can work without a technological failure. The trial is winnable, but very arduous.

The young lady sitting at the central desk on the ground floor has a secret. She has a shoebox locked in a drawer of her desk, and around her neck is a key on a golden necklace. She has specific instructions: if a candidate approaches her, engages pleasantly, and shows sincere interest in her role in the branch, and not as the target of a last-ditch effort, she is to pause the conversation, unlock the desk, and produce the shoebox for the candidate. Within the shoebox is the answer to the trial: every specific requirement written in clear, actionable text, along with a memory stick containing every proper configuration and a full procedure list that will bring the branch to full operation without a single hiccup. Everything from networking configurations to the copier codes for the janitorial staff is covered, and once executed it virtually guarantees a win.

How many people would simply ignore the receptionist and get cracking on the trial, and how many would take their time to get to know everyone and their roles in that particular branch? Either kind of candidate can win, whether through a sheer act of will or simply by being kind, careful, and honestly interested in the welfare of each of their coworkers. Nobody knows about the secret key, but sometimes the answer you need comes from a place you would never expect.

Peer to Peer File Transfer, Reep.io

I recently needed to move about ten gigabytes of data to a friend, and we used a new website service called reep.io. It’s quite a neat solution. It relies on a technology called WebRTC that exists in many modern browsers, like Chrome, Firefox, and Opera.

The usual way to move such a large set of data from one place to another would probably be mailing a USB memory stick or waiting to get together and sneaker-netting the files across. The issue with a lot of online services that let people transfer files like this is that many of them are limited. Most of the online offerings cap out at around two gigabytes and then ask you to register, either for a paid or a free account, to transfer more data. Services like Dropbox exist, but you need the storage space to create the public link to hand to your friend so they can download the data, plus it occupies the limited space in your Dropbox. With reep.io, there is no middleman and there are no limits. It’s browser to browser and secured by TLS. Is that a good thing? It’s better than nothing. The reason I don’t like any of the other services, even the free-to-use-please-register sites, is that there is always this middleman irritation in the way; it’s inconvenient. Always having to be careful not to blow the limit on the transfer, or, if it’s a large transfer like ten gigabytes, chopping up the data into whatever bite-sized chunks the service arbitrarily demands, is very annoying.

Using the site is dead simple. Visit reep.io, then either drag in the file you want to share or click on the file-add icon to bring up a file-open dialog and find the file you want to share. Once set, the site generates a link that you can send to anyone you wish to engage in a peer-to-peer file exchange. As long as you leave your browser running, the exchange will keep working with that particular link. You don’t need any extra applications, and it works across platforms, so a Windows peer can send a file to a Mac client, for example. That there is no size limit is a huge value right there.

If you have a folder you want to share, you can ZIP it up and share that file. It’s easy to use, and because there are no middlemen, there aren’t any accounts to create, and thanks to TLS, there is nobody peeping over your shoulder.
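Bundling a folder for the transfer is a one-liner on a Mac or Linux box (the folder name here is just an example):

  # roll a whole folder into a single file before handing reep.io the result
  zip -r vacation-photos.zip vacation-photos/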

Shifting Platforms

I go through cycles of having an interest, and then not having an interest, in social media. Twitter and Facebook are the core services I’m thinking about here. Of the two, I’ve given up on Twitter. I no longer engage with anyone on Twitter, and the leading edge of loud, noisy chatter has carried on without me. If I do run the Twitter application, it’s mostly to witness some event as it unfolds, like a news source, or to jump on some shame bandwagon when a public figure makes a terrible mess of their life by saying or doing something stupid.

I am about to give up on Facebook as well. There are many reasons for this renewed effort to leave the system. I am tired of the see-saw polarity between stories: the negative political stories mixed in with the positive, reaffirming stories build up a kind of internal mental noise that clouds my day and keeps me from being focused. Another reason to leave is that the interface has become somewhat moribund on its own. You can sometimes comment, sometimes not. The only option for expressing a feeling is “Like,” and the entire service has become self-balkanized. I have friends and family on Facebook, but out of all of them I only follow a few, and I’ve muted the rest. I don’t really miss the engagement, but always having to think about tailoring my thoughts to the audience has started to give me fatigue.

I think then that it may be time for me to go back to writing blog posts on my WordPress blog. The blog encourages longer format writing, and I expect that engagement will drop as I won’t be using Facebook. In a lot of ways, it is a kind of social addiction and the only way to break it is to wean off of it. Perhaps cold turkey is not right, but rather cool turkey.

I don’t expect anyone to follow me off of Facebook. I will share my blog posts to Facebook so people can still see what I write, but the engagement will drop off. Feel free to comment on my blog if you wish. Otherwise, that will be that.

On a more technical note, I changed how the stories are shared across systems. The original path was to publish a WordPress entry, which would share to Tumblr, which would then share to Twitter and Facebook. I have torn that down and set it so that WordPress itself shares to Facebook, Google Plus, Tumblr, and Twitter. It’s a more direct path that doesn’t require people to slog through my Tumblr.

Thanksgiving 2015

’Tis the season for us to unpack all the holiday crazy that comes with the post-Halloween holiday adventure: Thanksgiving and Christmas. Cooking, planning, setting up, and a lot of decking of the halls!

So we start with Thanksgiving. Weeks ago we took advantage of the 50% discount deal at our local supermarket and made room for the frozen turkey in our basement fridge. Then we slowly accumulated all the other ingredients for our “feeding an army for two people” style of Thanksgiving. On Monday, November 23rd, I caught a little video from a television and network cooking personality, Mr. Alton Brown. He recommended that people could defrost and brine a turkey at the same time. I had a frozen turkey in my freezer, and I had never brined a turkey before, so I didn’t know how it would turn out. Following Mr. Brown’s advice, I hauled out the twenty-pound bird and found that my biggest stock pot fit it like a glove. The directions couldn’t have been more direct and simple: strip the turkey of its webbing and plastic wrap, put a cup of kosher salt in the vessel along with 2L of hot tap water, and stir until the salt is dissolved. Then add 4L more cold water and put the turkey in. I placed it so that the main cavity pointed up at me, so that as I added more water to fill in all the way around the turkey it wasn’t going into the cavity, and then I poured into the cavity until the entire bird was submerged. Then I wrapped the top in plastic wrap and put it in the basement, behind locked doors. No refrigeration required! As the turkey defrosted itself, it also brined itself. When I temped out the bird two days later it was at about 45 degrees, and then I stowed it in the fridge until we were ready to cook it. When I was set, I poured off the water and rinsed the bird with fresh cold tap water, all the cavities and everything. Then I put it in the roasting pan.

The oven was set at 350 degrees; however, it ran hot for about the first twenty minutes, so the first shot was at about 400 degrees. I knew something wasn’t right because the turkey was making a lot of snap, crackle, and pop noises. When I checked the temperature, I noticed the disparity and corrected the dial, which brought the oven back into calibration.

There were two competing schools of thought during the cooking process. The first was that I had accidentally turned our turkey into Lot’s turkey, a solid pillar of salt. The other school held that “it defrosted and it didn’t amount to crap,” and that the salt was pretty much just a silly affectation. I held out hope, mostly because of the sage words of Mr. Brown, whom I trust when it comes to food preparation and cooking.

We were a little taken aback when the temperature probe indicated that every part of the turkey had reached about 170 degrees; it was well and truly done. I asked, “How much juice is in the pan?” and the answer was, “Not very much, if any. Only what it was basted with.” We had made enough of our own basting juices with turkey broth concentrate and the sautéed neck. I let the turkey settle for about ten minutes and then carved into it.

The meat was so moist and juicy that it fell apart as I carved into it. The entire dinner was spent marveling at just how amazing it all was and how we’ll never do a turkey any other way again. So simple: a saltwater bath for three days changes so much about a turkey! And just as Mr. Brown promised, the brine really shines in the leftovers. Leftover turkey is usually tough and dry as cardboard, but the brined turkey is nearly as amazing every time we take a little out of the fridge for dinner.

I can’t understand why everyone doesn’t brine their turkey. We’ll brine ours from now on; the fresh bird and the leftovers are just the tip of how amazing this is. The turkey was probably fully thawed in a little over a day! The three days just added to the brine’s power to make the bird juicy and amazingly flavorful.

Just for the record, the turkey wasn’t related to Lot at all; it wasn’t salty. It was amazing.

Weak Certificates

I’ve got an odd little problem at work. I’ve got a Ricoh copier in the Traverse City office that I apparently can no longer manage remotely due to an SSL error. The error that Firefox throws is ssl_error_weak_server_cert_key, and in Google Chrome it’s ERR_SSL_WEAK_SERVER_EPHEMERAL_DH_KEY. In both cases I understand what the issue is: the SSL layer is weak because the Diffie-Hellman key is not big enough.

I’ve run into this issue before, mostly with self-signed certs, and the browsers have usually allowed me to click on an exception and get on with my day. With Firefox and Chrome, that is no longer the case: the browsers just refuse to display the page. I understand the logic behind it, everyone wants a more secure web, but sometimes what we are really after isn’t privacy or security, but rather just getting our work done.
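If you want to see exactly what the browsers are objecting to, openssl can show the ephemeral key the device negotiates. This is a sketch with a placeholder hostname, and it needs OpenSSL 1.0.2 or newer for the “Server Temp Key” line:

  # ask only for DHE cipher suites and report the server's temporary key size
  openssl s_client -connect copier.example.local:443 -cipher "DHE" < /dev/null 2>/dev/null | grep "Server Temp Key"
  # output like "Server Temp Key: DH, 512 bits" (or 768/1024) is what trips the weak-DH error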

I still need to connect to this copier and manage it, and frankly, my dear, I don’t really care that much whether the transactions are secure. In a way, this security is irrelevant: the traffic on our WAN flows over a Meraki site-to-site VPN link, so it’s already secure. This is security on top of security, and it’s in the way.

So I thought about using the awful Internet Explorer for this, but I chafe at even considering one more wretched bit of Microsoft technology; there has to be a better solution. When you run into little bits like this, the best way forward is to pursue my favorite solution, heterogeneous computing! There’s more than one way to get what you are after. So if Firefox and Chrome won’t work, and Internet Explorer is unthinkable, how about Opera?

So I downloaded Opera and installed it, then browsed to my copier in Traverse City. Opera told me about the error, but it also provided an exception button, and once I clicked that, the error was bypassed and my copier’s remote management screen appeared.

So now I’ll add Opera to all the other browsers I have on my computers. The answer is competition. I sometimes wonder if there isn’t a special browser out there for IT people like me, one that would render anything and ignore any “privacy or security” errors, all so people like me can get our jobs done. For now, Opera seems to lead the pack, at least for this. Thank you, Opera!

Killing SpotifyWebHelper

I’ve had a problem with Spotify on my Mac for a while now. The damn program opens up all by itself, unbidden. What’s really annoying is that it also frequently auto-starts and auto-plays tracks I didn’t want to play.

I found out that when I start my Mac, or start Spotify itself, another application called SpotifyWebHelper is automatically started.

I’ve noticed that when I go into Activity Monitor and kill this app, the unwanted automatic start and play problem goes away. That’s good, but it’s not really the answer. The answer is to murder SpotifyWebHelper.

So I turned to the CLI: you can issue the command killall SpotifyWebHelper and press enter. That does kill it, but what I want is to prevent it from ever running. So I unloaded it from launchctl and deleted its LaunchAgent .plist file. When Spotify starts, it puts it all back.

Then I went to where SpotifyWebHelper is located and renamed it. Spotify repairs this as well. Then I tried stripping the SpotifyWebHelper application in ~/Library/Application Support/Spotify of all POSIX rights with chmod a-rwx SpotifyWebHelper. The next time you run Spotify, it fixes that all by itself too.

This is less of a feature and more of a virus. A zombie virus: you just can’t kill it.

But I have killed it for good, and here is how to be free of SpotifyWebHelper:

  1. Quit Spotify
  2. Open Terminal, killall SpotifyWebHelper
  3. cd ~/Library/"Application Support"/Spotify (the quotes matter; the path has a space in it)
  4. rm SpotifyWebHelper
  5. cd ..
  6. chmod a-w Spotify
  7. Close Terminal, done!
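
For convenience, here is the same procedure as one Terminal session, assuming Spotify keeps its helper in the default per-user location:

  killall SpotifyWebHelper                     # stop the running helper (quit Spotify first)
  cd ~/Library/"Application Support"/Spotify   # quote the path; it contains a space
  rm SpotifyWebHelper                          # delete the helper binary
  cd ..
  chmod a-w Spotify                            # make the folder read-only so Spotify can't recreate it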

After that, you will be free of the horrible SpotifyWebHelper bullshit and Spotify won’t automatically run and play things you don’t want it to.

HP Pavilion Boot Loop Problem

Yesterday I ran into a devil of a time with an HP Pavilion Slimline workstation at work. The machine was beyond its warranty with HP, so no help from them. It presented these symptoms:

  • Computer powers up normally.
  • All BIOS-level diagnostics pass.
  • No error codes or beep codes whatsoever.
  • Once the HP BIOS splash screen fades, the computer should boot into Windows, in this case Windows 7. It does not; it reboots back to the HP BIOS splash screen, ad infinitum.
  • You can enter BIOS Setup and access the Boot Menu to select other boot sources; however, the F11 key to start System Restore is unresponsive.
  • All first-tier efforts to clear the error were taken: BIOS reset to factory defaults, and holding down the power button to clear the power supply controller. None of these resolved the issue.

I then plugged in a copy of Knoppix that I had downloaded and installed on a USB memory stick. I could have also burned the ISO file to a DVD and used that, but the USB stick was handy. When I use Knoppix this way, I like to enter the “Knoppix startup cheatcode” “knoppix 2” (without quotes, of course) at the boot prompt; this starts the Knoppix system at runlevel 2, which is a text-only console. I don’t need X-Windows, and in this case it just gets in the way.

Once at the Knoppix CLI, I figured the boot flag, the boot manager, or the MBR was shot for the primary partition on the machine’s hard drive. Diagnostics indicated that the primary hard drive was fine, so it wasn’t a physical failure in the HD. I knew that the first (and only) hard drive in systems like these was most likely /dev/sda; you can search the “dmesg” log if you have any doubt about where in /dev the primary hard drive lives. Knoppix has the “fdisk” command, so that was my next stop. I knew that this particular HP machine had a Windows recovery partition stuffed in it, so when I started “fdisk” I displayed the partition map, and there were three partitions: /dev/sda1, /dev/sda2, and /dev/sda4. I looked at the sizes and figured that the biggest one was the damaged partition, the middle one was probably for swap or scratch or something, and the last one seemed sized properly for the recovery partition. Honestly, it was a guess. I turned the bootable flag on for /dev/sda4 and off for /dev/sda1, wrote the partition map to disk, and then issued “shutdown -r now” to reboot out of Knoppix. Technically I could have just unplugged the machine, but I’m a big fan of orderly shutdowns even when the consequences are irrelevant; it’s a good habit to have.
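From memory, the fdisk session looked roughly like this; the partition numbers are specific to this machine, so check the table before toggling anything:

  fdisk /dev/sda       # run from the Knoppix root prompt
    p                  # print the partition table and size up the partitions
    a                  # toggle the bootable flag; when prompted, enter 4 for the recovery partition
    a                  # toggle again; this time enter 1 to clear the flag on the Windows partition
    w                  # write the new partition table and exit
  shutdown -r now      # orderly reboot out of Knoppix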

The machine booted to the HP BIOS splash screen, and then Windows Recovery started. Once the recovery partition got going, a cutesy HP menu appeared offering a selection of options. I started with the simplest option, something like “Microsoft Windows Boot Recovery”; it ran for maybe a second and then offered to reboot. I went for the reboot, and that fixed the issue. Windows started, but instead of a regular startup it went to the recovery menu, which was fine by me since that was where I was going to go anyway by pounding the F8 key like a madman. I selected “Safe Mode with Networking,” then plugged in my USB memory stick containing TRON and got TRON working on the system.

Once TRON was done, I rebooted and let chkdsk naturally freak out about the structure of the NTFS partition in /dev/sda1. Chkdsk did what it had to do, and the system booted normally. I then set it for redeployment.

I figure if anyone else has this issue, this blog post might be helpful. If it helped you out and you’re willing, a wee tip in Bitcoin or Dogecoin would definitely be appreciated.