WIL WHEATON dot TUMBLR: “So any journalist passing through London’s Heathrow has now been warned: do not take any documents with you. Britain is now a police state when it comes to journalists, just like Russia is.”

This post by Wil Wheaton is a really great reminder for when you are traveling, and I wouldn’t limit it to international trips to Britain; it applies even when you’re visiting the next town over or crossing state lines. Rights are being trampled everywhere you go. Whether it’s an out-of-control cop, a bloodthirsty sheriff’s deputy, or a sticky-fingered TSA agent, there is no lack of potential thugs, enemies, and thieves in your midst.

There are ways to secure your data and keep it handy as well. Store everything in an encrypted disk image or TrueCrypt archive on a cloud service like Dropbox or Google Drive, and keep a duplicate on your memory sticks. If the thugs take your devices, you can rest assured that all you lost was the hardware, not the content.
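If you’re on a Mac and don’t want to bother with TrueCrypt, hdiutil can make an encrypted disk image for this; just a sketch, with the size, volume name, and Dropbox path being placeholder choices of mine:

# create a 500 MB AES-256 encrypted, journaled HFS+ image (prompts for a passphrase)
hdiutil create -size 500m -fs HFS+J -encryption AES-256 -volname Private ~/Dropbox/private.dmg
# mount it when you need it, eject it when you're done
hdiutil attach ~/Dropbox/private.dmg
hdiutil detach /Volumes/Private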

I’m surprised that journalists and people who know journalists don’t all use GPG to secure their communications. I would think that if you were a whistleblower or had contact with a whistleblower that these little checkboxes would be foremost on your mind and already checked off.

You can’t trust any government, any cop, or any Vampire to keep their word. This goes for everyone as well, including your carrier and service providers. What should Verizon know? Shit. How about Dropbox? The same. Trust nobody and you’ll be safer than someone who trusted someone else. Trust is earned and right now, very very few people have it.

Encrypt Everything

Lavabit and Silent Circle have given up on providing encrypted email communications. Mega plans on providing something to cover the gap, but in general the only real way to deal with privacy in email is end-to-end encryption. There was talk that at some point email might give way to writing letters and using the US Postal Service, but there as well you’ve got postmasters taping instructions to the mail about how everything has to be photocopied and stored, so even the US Postal Service is full of spies; the only thing it can be trusted to carry is junk mail.

What is the answer? Pretty Good Privacy. PGP, or rather the non-Symantec, GNU version of it: GPG. If you really want to keep what you write private when you send it to someone else, the only way to do that is for everyone to have GPG installed on their email system. You write the email using the recipient’s public key, which converts your message to ciphertext secure from even the NSA’s prying eyes, and only the recipient can unlock it with the secret key that they alone hold.

I’ve been playing with PGP and GPG for a very long time now, and I decided I would at least make a route available if anyone wants to contact me with privacy intact. My public keys are on my blog and on all the keyservers, including the one hosted and run by MIT as well as the GPG keyserver. To send me a private message via email, all you need to do is get GPG, set it up, create your secret and public keys, get my public key, and use it to write me an email; only I will be able to read it. The NSA will just flag the encrypted contents for later analysis, and thanks to AES-256 they’ll be hard pressed to get to the plaintext of your message.
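If you’ve never done it, the whole round trip looks roughly like this from the command line; the key ID, email address, and file names here are placeholders, not my actual details:

# create your own secret and public keypair
gpg --gen-key
# fetch the recipient's public key from a keyserver (placeholder key ID)
gpg --keyserver pgp.mit.edu --recv-keys 0xDEADBEEF
# encrypt a message to them; the armored .asc output is safe to paste into an email
gpg --armor --encrypt --recipient them@example.com message.txt
# only the holder of the matching secret key can read it
gpg --decrypt message.txt.asc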

That’s the way around all of this. GPG for everything. GPG public keys for email, for chat, for VPN, for files, and HTTP-in-GPG. Everything pumped through GPG. Since the government won’t stop spying on us, it’s our duty as citizens to secure our own effects against illegal search and seizure, and the technology exists to do so.

Encrypt everything.

Private Dancer

Several days ago, while pondering an issue we’ve had at work, an epiphany struck me. The problem we ran into was that our local network is a box of question marks. We don’t really know how it’s assembled or what the rules are for using it; we just plug cables into wall jacks, and if things work, they work. Until they don’t.

Enter NetInstall and NetRestore. These are the two imaging technologies for the Macintosh, and I’ve assigned my coworker to explore them and develop images. Frankly, he self-started it and I encouraged his exploration. We tried it and found that both operations use a lot of bandwidth on the network, and we eventually ran into a lot of problems. Not only did the machine we were imaging take forever, but it bogged down the server and caused huge headaches for everyone. We came to the conclusion that our local network just isn’t designed to carry any payload of appreciable size. It’s not really a complaint, more of a characterization: it’s kind of fragile and wimpy.

So, was there a way we could still use ethernet technology without having to depend on our “provided” fragile and weak network? I sat in my chair pondering it, knocking some options out of the park instantly because of the machines we have. We can’t really depend on IP-over-FireWire because we have plain-jane MacBooks in the mix; they don’t have FireWire ports, just ethernet ones. As I looked across the way at all the server technology in the rack, it struck me: each machine, including the lowly Drobo, had two ethernet ports. Huh. Two. Only one was being used to connect each machine to the network, so each one had a secondary port available.

I started to root around in my junk bin and found an old unused Netgear ethernet switch, a five-port model, no fuss, no muss. I grabbed a gaggle of short ethernet cables and started hooking all my servers and such to this little spare switch. Everything worked out magnificently well. On each server I configured these ports for the 192.168.0.* range and assigned a manual IP address to each of them. Then I found an unused Apple AirPort Express, plugged it in, set it for bridge mode, and now I can extend this private network over 802.11n Wi-Fi, which is nice and fast. Just like that, cake and eating it too!

What’s great about this setup is that my coworker and I can move large batches of data between these machines without having to worry about clogging up the network for all the other users who are trying to use these servers for their real work. Their files are small and their use sporadic; our use is large and nearly (sometimes) constant. The parts are just a few more blinking lights in the rack and a little more spaghetti wiring hither and yon, but I don’t care: it works and it was free with parts I already had on hand. The only part of this that upsets me is that I didn’t think to do it sooner. I suppose I should take some solace that it’s better late than never. Having this private access to all the systems makes both of our lives much better. We don’t have to complain to central networking anymore because we’ve abandoned their fragile, wimpy thing for a far better solution in-house, and because it’s unroutable, we didn’t break a single rule. Mind you, we don’t know what the rules are, but still. 🙂
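If anyone wants to replicate this, the per-server setup is nothing more than giving the spare port a manual address in the private range; a quick sketch, assuming the spare port turns out to be en1 (check which one it really is first):

# figure out which device (en0, en1, ...) is the spare ethernet port
networksetup -listallhardwareports
# quick, non-persistent test; set it manually in the Network preference pane to make it stick
sudo ifconfig en1 inet 192.168.0.10 netmask 255.255.255.0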

It’s a good Friday.

Tag Painting in Day One Journal

I’ve been really enjoying Day One, and they recently updated their apps so that the iPad, iPhone, and Mac versions can all create and manage tags. What’s been missing is a way to blast in tags based on keywords.

In this example, every time I have a journal entry with “Scott” in it, I want it to be tagged “Scott”.

Here’s how I did it:

1) Open Terminal, go to ~/Dropbox/Apps/Day One/Journal.dayone/entries

2) For files that contain “Scott” and already have a Tags section but not the “Scott” tag, paint in the tag:

find . -print0 | xargs -0 grep -L "<string>Scott</string>" | xargs grep -l "Scott" | xargs grep -l "<key>Tags</key>" | xargs -I file /usr/libexec/PlistBuddy -c "add Tags:Key string 'Scott'" file

3) For files that contain “Scott” but don’t have a Tags section at all, create the Tags array first:

find . -print0 | xargs -0 grep -L "<string>Scott</string>" | xargs grep -l "Scott" | xargs grep -L "<key>Tags</key>" | xargs -I file /usr/libexec/PlistBuddy -c "add Tags array" file

4) Then go back to step 2 and re-run it. Every entry that contains your text should now be tagged with the text you chose.

Swanky! The only thing you have to watch out for here is that the little l (lowercase ell) looks a lot like a capital I (capital eye); it might be best to copy this somewhere and set the font to Courier just to make sure before you run it. Also, the last xargs is what actually makes the changes, so leaving it off for a first pass might be smart (see the dry run below). I can’t make any guarantees that it’ll work, but as far as I can tell, it works great!
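For that dry run, the same pipeline minus the final PlistBuddy xargs just lists the entries that would be modified, which is a nice sanity check:

find . -print0 | xargs -0 grep -L "<string>Scott</string>" | xargs grep -l "Scott" | xargs grep -l "<key>Tags</key>"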

YMMV. Careful.

Slogger

Every once in a while I run across something I’ve seen before but accidentally ignored, until I see it again in great big neon headlines, stop to pay attention, and discover that it does something I really, really want. This particular afternoon it was the product Slogger from Brett Terpstra. The software is a Ruby script, and Ruby is a delightful programming language that I’ve had the pleasure of dabbling in. Nowhere near the level of Brett and the people who help him, but here and there, little things.

The need came from a simple Google query: IFTTT and Day One. I was looking for some way to bridge the divide between IFTTT, the automation web service that I’ve fallen in love with, and Day One, the journaling software that works quite well and renders Dropbox a “killer app”. Dropbox is the glue that keeps my Day One system together on my laptop, my desktop, and all my mobile devices. When I found Slogger it was a definite eureka moment, the answer all in one place. I downloaded the code as the author describes and tried to set it up.

Monumental fail. Pieces everywhere, error codes puking onto the screen faster than I could read them, pages and pages of interpreter and compiler errors, all surrounding one Ruby gem called hpricot. I knew why this was fail-town for me: I had installed the Xcode CLI tools in order to get the mac_google_authenticator PAM module built, and that CLI package left my system hopeless when it came to processing gem requests. In the Ruby world there is a system for distributing software written in Ruby called ‘gem’, and you run it much like apt-get in Ubuntu; it’s really quite straightforward and has never given me fits, until now. Everything was complicated by the fact that I couldn’t really find where Xcode was on my machine. All the likely places to search had nothing relevant, and my find command returned pages of errors; I didn’t have the patience to pick through a thousand lines of “Permission Denied” to find the one spot where the file was hidden.

I didn’t need to complain, as I knew the solution: download Xcode for real. So off to Apple, download the monster, and install it. That satisfied hpricot, and everything else installed quite nicely. I set Slogger up, pointed it at my Dropbox, and configured the plugins that I wanted. The initial run crashed and burned, but I figured out why: it was an errant space in the line that points to the Day One folder. A symbolic link fixed that (there’s a sketch of it just after the list below) and I was off to the races again. Of all the plugins that I configured, these were successful:

  1. BlogLogger
  2. facebookifttt
  3. goodreadslogger
  4. lastfmlogger
  5. pocketlogger
  6. rsslogger
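
The symbolic link workaround mentioned above amounts to one line; the ~/DayOneJournal name is just something made up for illustration, pointing at the journal path from the tag-painting post:

# a space-free path for Slogger's Day One folder setting to point at
ln -s "$HOME/Dropbox/Apps/Day One/Journal.dayone" "$HOME/DayOneJournal"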

Then there were the plugins I tried to configure but couldn’t:

  1. fitbit
  2. flickrlogger
  3. getgluelogger

The primary problem with the fitbit plugin was that fitgem, the Ruby gem you have to install for it, is a phantom. You install it, it’s successful, and then it’s gone. No trace of it exists. You try again: poof, nowhere. Plus, for the plugin setup there are API codes, user codes, and OAuth codes. I get the reasoning behind all of them, and getting most of them was not an issue. I felt a little awkward creating an “Application” for just myself; it seems kind of a waste of effort to ferret all these bits and pieces into a semiformal request procedure, but doing it wasn’t hard and didn’t cost anything, so what the hell. The part where it all falls apart for fitbit is where you have to get the OAuth token: since fitgem never worked, its invocation from Slogger, which should have opened a web browser and asked for my approval, never happened. I tried to be sporting and do the heavy lifting myself, but all I did was irritate the fitbit API, and I figured, what the hell, I got most of what I was after. I moved the fitbit plugin into the “unused” folder and forgot all about it. Abandon ship, y’arrrr!

Flickr is a pain in the ass. It’s Yahoo, and as such it’s kind of an Internet leper. You need your Flickr number; there’s a site that makes that easy, except it doesn’t work. Flickr username? Feh, neither the one in Flickr nor your linked Yahoo ID gets you anywhere. I half figured it was in the URL anyway, but then I thought about it: I don’t really use Flickr all that much beyond a solitary IFTTT rule, and that’s precarious as it is. The only attractive part of Flickr is that they gave out 1TB of storage. Still lepers, though. So, abandon ship! Y’arrrrr!

GetGlue was the last great effort. Much like Klout, it’s a site that sort of makes sense, but the name is utterly silly. GetGlue. What the hell? Why? Glue has nothing to do with TV or movies; the only connection I could think of was a celluloid and horses-processed-into-glue sort of connection. They give away stickers, what a wonderful bit of pollution that is, and as a gimmick it seems dumb. The plugin needs an RSS feed for the GetGlue activity stream. It appears as though the GetGlue folks have moved away from RSS and towards “widgets”, which seems stupid since in this application RSS is the answer and widgets are worthless. Alas, searching Google for the RSS feed method was fruitless. I was half hoping for something like http://getglue.com/user/bluedepth/feed.rss, where I could just craft it up and be on my merry way. No. You have to “View Source” to find it, which is stupid because the source is so full of CSS flotsam and jetsam as to be utterly incomprehensible. Again, my ardor for that particular service evaporated like fog on a hot day. I don’t need it. I don’t use it. Whatever! Abandon ship! Y’arrrrr!

So I tried the Slogger script, it failed, I tore out the fitbit goop, and then it worked. Then I went into my Day One app and mopped up all the mess that testing had made. The only oddity I noticed was that BlogLogger completely missed the text on my WordPress site that was between pre tags. Meh. Not really a reason to kick the entire thing to the curb, just something to honestly stop using. HTML is a right bastard almost all of the time. CSS is a filthy abomination, but we won’t go there.

I would say that tonight everything will work as it should for Slogger, but I have to race to work tonight to turn everything off because work is going to exit-stage-left when it comes to the Internet. They are turning the entire thing off, at least for a few hours. I can’t wait for tomorrow, there will be lulz.

So, to Mr. Terpstra, thank you for Slogger. I’m sorry the plugins didn’t work and that fitgem was a phantom, but at least most of what I wanted worked. So we sound a victory cheer, sort of. Yaaay!

Google Authenticator

Over the long Fourth of July holiday weekend I received an email from WordPress.com detailing news that they are now fully compatible with the Google Authenticator two-factor security system. I hadn’t thought about two-factor in a long while and decided to look into how Google had come to dominate this particular corner of the security market.

First, a little background. When you want to prove who you are to some service, a process called authentication, you usually present two pieces of information: a username and a password. This combination identifies who you are and proves your identity through the shared secret of the password, while allowing systems to remain as open as possible to all clients who want to connect, assuming everyone is playing by the rules and nobody is trying to be sneaky or clever. Passwords are notoriously wimpy things. Most people give up on complexity because they can’t readily remember a complex password and it isn’t convenient, so they pick simple passwords like “12345”, “password”, or “secret” and leave it at that. The problem is that the people who make up passwords are either lazy or don’t care about entropy, and since a lot of your work and identity is controlled by these systems, using simple passwords is begging for disaster. Another issue that plagues a lot of people, and goes hand in hand with how naturally lazy many of us are, is reusing one poor password on every site, with the same username to match. The risk here is that when one service is compromised, all the other services are compromised as well, and it’s a huge uphill climb to get out of that mess if you find yourself trapped in it.

Cleverness works both against people in general, in the form of thieves, phishers, and hackers, and for people in general, with things like hashapass or applications like 1Password. Hashapass is a free service that combines the web address of a service with one single complicated master password to generate a hash, which is to say a value that is easily calculated from the combination of the master password and the web address, but done in such a way that going backwards is computationally very difficult. If any piece of the puzzle is missing, it’s effectively unsolvable. As an alternative there is 1Password, an application that I have become very fond of, which takes a similar approach: one master password unlocks a database of all your sites and their individual passwords, so you don’t have to remember a constellation of passwords; all you need to remember is one very good, secure password and you are all set. There are a few other features of 1Password that I like. Being able to generate very long random passwords and have them stored for me lets me establish plausible deniability when it comes to my online identities. Because 1Password randomly selected a 32-character password for Facebook, I cannot be compelled, even under torture, to reveal that password to anyone else. I simply don’t know it. I know my 1Password master password, but that’s not the right question, so my account remains secure.
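To give a flavor of the idea (this is a sketch of the general HMAC approach, not a claim about hashapass’s exact recipe), you can derive a per-site password from one master password and the site’s name right on the command line:

site="example.com"
read -r -s -p "Master password: " master; echo
# keyed hash of the site name: easy to compute forwards, impractical to reverse back to the master password
printf '%s' "$site" | openssl dgst -sha1 -binary -hmac "$master" | openssl base64 | cut -c1-8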

All of this I have collected, and I use it everywhere: on my MacBook Pro, my iMac at work, my iPad, and my iPhone. 1Password makes it very easy to manage the security database, and I’m quite sure that it’s secure. In my life, any more security is rather like putting more padlocks on a firmly locked jail cell; it’s rather silly and feels a lot like overkill. Then again, more security is always better, especially if it’s really clever and somewhat convenient.

Two-factor security adds another component to the process of authentication. It augments the username and password combination. A password is something I know (or store using 1Password), and the second factor is something I have, which produces a Time-based One-Time Password (TOTP). This is where the free iPhone app called Google Authenticator comes in. The app records a secret key from a site I wish to prove my identity to in the future, for example Google itself. I set up two-factor, request a security token for Google Authenticator, and set it up in the app. The key is transmitted by QR code, which means you can quickly acquire the long, complicated, random (hard to type) secret key using the camera in your phone. Once this process is complete, the Google Authenticator app displays a six-digit number that proves your identity to the site associated with that particular entry, and each number is only valid for 30 seconds. A given six-digit password exists only in its 30-second window, and there is no way to divine it without the Google Authenticator application and its stored secret key.
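If you want to see that it really is just math, the open-source oathtool can generate the same numbers from the shared secret; the base32 string below is a placeholder, not a real key:

# prints the current six-digit code for this 30-second window
oathtool --totp -b "JBSWY3DPEHPK3PXP"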

Having two-factor enabled this way means that my username and password are no longer as important as they once were. Even if my username and password are revealed or compromised without my knowledge, the secret key in my Google Authenticator app remains with me, and so do the 30-second one-time passwords it generates. What I know may be compromised, but what I have (the Google Authenticator) most likely won’t be, unless someone steals my phone and finds a way to best the security on that device before I have a chance to wipe it remotely. And if my Google Authenticator is somehow compromised, my passwords likely won’t be, because they are strong, so I am still secure.

Practically, how does this work? When I want to log into Google Mail using two-factor, this is what I do. I open a web browser, type in the address “gmail.com”, and press enter. I enter my username and my password, and then in the third field under the password there is a box labeled “Google Authenticator Token”. I grab my phone, start my Google Authenticator application, read the six-digit number from my phone, and type it in. The service logs me right on, and after a few seconds that six-digit password is no longer valid and is meaningless. I’m authenticated, and the system did as it was designed to do. One of the nice parts of Google Authenticator is that the entire app is a mathematical operation; it doesn’t require the network at all to generate these numbers, so it’s a good solution for people who may not have a reliable connection or who have a data quota on their phone.

Of course, online authentication is just the beginning. Yesterday I found a way to embed the Google Authenticator system into my OS X Mountain Lion installation, so that when I want to log in to my computer at work or my laptop I have to type in my username and my password and read the six-digit code from my Google Authenticator application. The setup isn’t difficult. You need a compiled PAM module, which I have (just ask if you want a copy), and an application you use to create the secret key on your computer. With it all set up, and a slight adjustment to a settings file, even if I were to lose control of my password at work, nobody could log in to my account without my username, password, and GA token.
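For anyone wanting to try the same thing, the heart of it is a single PAM rule; a sketch only, since the module has to be built for OS X first and the exact file under /etc/pam.d (screensaver, authorization, sshd) depends on which login path you want to protect:

# added to, for example, /etc/pam.d/screensaver once the module is installed
auth required pam_google_authenticator.so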

This arrangement works quite well and I’ve set it up for my Google accounts, my WordPress.com and .org blogs, Facebook, Evernote, and Dropbox accounts as well. Everything is secure, obnoxiously secure. 🙂

photo by: MoneyBlogNewz

Doo

As I was looking through my Pocket saved items I noticed an article from MacSparky about a new service called Doo. It’s supposed to index all your documents and tag them so you can find them faster than with Spotlight. So I downloaded the app and set it up. It pegged my MBP for hours and hours with fan-churning processing and made the entire laptop unusable. Then I noticed it had tried to index pictures, something I wish there were a way to turn off, but I couldn’t find one. I then noticed a section in Preferences with a place to define rules, but I didn’t know what some of the rules meant, so I went online to search for documentation. All I could find was a hopelessly poor support site and absolutely no documentation whatsoever. So, that pretty much does that. They also hand out 25GB of storage, but it’s unclear what they are storing: the index, or copies of the files? Anyway, it was a mistake, so I emptied the app, revoked all the links to my other cloud services, and asked the company to dispose of my account. I won’t ever be back. What a mess.

If you see Doo.net advertised, just skip it.

G-RAID, Time Machine, and Spotlight Headache

A few days ago, of course right before the “Money Back Guarantee” expired on our G-RAID 8TB Time Machine drive at work, both my S3 and I were battling a rather nasty, pernicious bug that was plaguing this device on our fancy new Mac Pro server running OS X Mountain Lion.

The problem was this: you plug the drive in using FireWire 800, Time Machine sees it and starts backing up files, and that works just fine. After, say, 1TB of files gets backed up, Time Machine works gamely for about three or four hours and then the drive suddenly goes deaf. What I mean is that the drive is still connected, the icon is on the Desktop, but you can’t do anything with it. It gives you a fusillade of meaningless, vague errors like “Unspecified error with file system” and the like, and Time Machine is stuck and can’t do anything at all with the drive. It’s not really a headache for us currently because the server is brand-spanking-new, but still, it’s a concern. You have to eject the drive, and not a plain eject either, but a Force Eject. When you move it to another computer, plug it in, and run fsck on the drive, everything pans out fine. Everything is hunky-dory, the journal is fine, the structures are peachy, the works. So annoying.
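The check itself is nothing exotic; on the second Mac it amounts to something like this, assuming the volume mounts as /Volumes/G-RAID:

# reports the filesystem as clean, which is what makes the deafness so odd
diskutil verifyVolume /Volumes/G-RAID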

So off to Google we go. It turns out there MIGHT be a bug related to the “Put hard disks to sleep when possible” setting in the Energy Saver preference pane in System Preferences. This strikes me as a wee bit of bullshit, since the drive should go to sleep and wake up elegantly like anything connected to a Mac should (and almost always does!), but fine, turn that off. Testing. Ah, failed. The next stop was to try to irritate the drive with constant activity. To that end I created a script:

#!/bin/bash

while true
do
    touch /Volumes/G-RAID/keepalive
    sleep 60
done

What this script does is run touch, a Unix command on the Mac that simply goes out and accesses a file, creating it at zero size if it doesn’t exist; it’s the most basic file operation you can perform on a drive. If you touch a file on a sleeping drive, it should wake it up, and if the drive is counting down until it goes to sleep, this operation should reset that counter. Then the whole thing takes a nap for a minute and does it again, over and over, forever.

We tried that and still ended up with a failed Time Machine backup and a drive that had gone deaf. The exact error you get in Time Machine is “com.apple.backupd: Error: (22) setxattr for key:com.apple.backupd.HostUUID …” So, still no solution to our problems. We finally figured out what the silver bullet was, and it came from an unexpected source: we added the G-RAID drive to the Privacy pane of Spotlight in System Preferences on the server and voilà! Magical solution!
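If you’d rather do it from the command line, mdutil gets you roughly the same result as the Privacy pane by switching indexing off for that one volume:

# stop Spotlight from indexing the backup drive
sudo mdutil -i off /Volumes/G-RAID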

The drive has been working happily since I made the change, about a week now. My working theory is that mds (which runs the Spotlight service) either locks a file or does something sneaky with this extended attribute on the HostUUID object, and that somehow ruins access to the entire file system on that drive. It’s not that the file system is damaged; it’s just not working.

So, where’s the bug? Is it in mdworker, mds itself, backupd (Time Machine), FireWire 800, the FireWire 800 cable, or the G-RAID drive? The answer is a definitive YES. Somewhere. Something is causing it, and the only solution seems to be keeping mds’s muddy hands off the drive and pestering it every minute with a meaningless file operation via touch.

The upside is the damn thing works, so we’ll keep going with it until it stops working. I wish there were something clearer than this Error 22 from backupd to go on, but alas, this seems to be a valid workaround, and frankly I don’t really need Spotlight to go futzing about on the drive anyway. There won’t be any searching done on it anyhow, just the indexing that Time Machine needs, and that’s it.

I guess in the end, all’s well that ends well.

Byword 2.0

Byword, one of my favorite apps for the Mac and for my iOS devices, was just upgraded to version 2.0. They have added publishing to blog platforms as a Premium feature, distributing the added functionality through the Mac App Store or the iOS App Store for $4.99. So far I love this app, and this was one of those features that I’ve been dying for, so I’m quite pleased. I can do all my writing in Byword without having to worry about distractions or anything on the screen getting in the way of my writing. It’s all clear, clean, and simple.

The last post to my WordPress blog, about Invention, was written using Byword 2.0 and I’m quite impressed with it. I could suggest some other enhancements, like enumerating the category list and suggesting possible tags when posting to WordPress, but I will take what I can get from the get-go. One thing that was a little dismaying, though not a show stopper, is that the Premium add-on only works for the App Store matching the platform you buy it on: the add-on from the Mac App Store is separate from the one in the iOS App Store. Their support was very clear about this, and I had pretty much assumed so even before I wrote to them; I just wanted to be sure. Frankly, I could take or leave the extra features on my iPad or my iPhone, as Drafts works brilliantly there alongside the Poster app, and Drafts hands off to Poster well enough without my having to buy Byword 2.0 Premium again on iOS. I bought the add-on for the Mac App Store because when I blog on my laptop or my iMac, this will be the app I use.

The only irking thing, and it’s not really anything overwrought, is the lack of pick lists and tag suggestions for WordPress, but I have faith that they might eventually take their software in that direction. Only time will tell, and developers. 🙂

Time To Die

I’ve refined how I kill weeds around the house. I recently got a small jug of 25x RoundUp, and I had been wearing blue nitrile gloves and using a paint brush to paint leaves with the RoundUp. I was in the middle of killing some nasty “ornamental grass”, which is actually a weed, when it struck me that I had these gloves on and all I had to do was stick my finger in the RoundUp solution, and then I could reach out and touch the plants I wanted to die.

That sped things up immensely. I wandered through my property, and anything I judged “this has to die” got a loving RoundUp-soaked stroke of death on its leaves. There is one nasty weed, amongst many other species on my property, that grows very tall, has slender long leaves, and carries a central cluster that always looks like it’s on the verge of blossoming but never does. This weed chokes out all the other plants, so I selected it to die. There is also a nasty vine-weed running throughout the garden in my back yard; it has wrapped itself around all the trees and it, too, is choking out all the other plants. It has smallish, maple-leaf-shaped bronze leaves, and it has likewise been selected for death. Finally, there is our enemy, cypress spurge. I have found clumps of it all over the property, and I have caressed it to death wherever I was sure I wouldn’t touch any other plants.

When my mosey of death is finished I cap the RoundUp and peel off the nitrile gloves. Then I go inside and wash my hands vigorously in hot soapy water just to make sure that there isn’t any RoundUp on my hands that somehow made it through the nitrile.

In a few days to a month we should see systemic shutdown of growth and subsequent death in all the selected-to-die plants that are trying to grow on my property. It kind of made me feel like a grim reaper. Walking along, just tousling the tops of noxious plants with my hands and softly whispering “time to die” as I moved along.

It certainly beats stooping, trying to pull the damn things up, not getting the whole plant, or watching it just pop up somewhere else because a runner or rhizome decided it would try life as a pioneer. This way the entire plant, its leaves, stalk, and root system, all dies.

photo by: Timothy Tolle