Mars One Park

Posted: August 28, 2012 in fiction, mars

Neil sat on the park bench, chin and hands resting on the top of his cane. His sparse white hair waved in the breeze, keeping time with the arctic grass planted around the bench, the statues, and the dark red stone walking path through the park.

This was Mars One Park. “Built to honor the first successful human exploration of the red planet and inspire children and adults alike”, they said some time ago. They had invited him to come for the commemoration. He came, but did not speak. It was too painful.

And yet Neil was drawn back to it not long afterward and discovered that there was healing here. So he had made it here regularly for the last year. He came sometimes specifically to remember the early times. He came sometimes to forget the later times. This was the place where both happened.

The arctic grass and the scrub trees, the deep red paving bricks of the path, the statues of the team, the other monuments and museum pieces of those early days mattered little to his memories. His eyes, once so blue, could still see the barren, pale marscape that greeted them on their landing and in those first sols before they began shaping the planet to fit human activity. And so, being here where it started, seeing with his mind’s eye the barren place of his past overlaid with the richness of the present park, he could follow the trails of memories wherever he wanted.

The people sometimes distracted him. Today appeared to be one of those days, because the crowd was great and the buzz of conversation was strong in the air. Not packed, just busy with lots of walking and talking family groups. Perhaps a holiday. Neil considered the crowd and the length of the sol, and thought that this might be the one holiday on which he should not have come. It might be Mars Explorer Day, the day honoring their landing. Yes, the more he considered it the more likely it seemed.

The crowds tended to move fast like a river and weave around him whether he was walking or sitting on his bench. They left him alone in his bubble of memories as if he had passed beyond their time. It wasn’t the crowd that was bothering him. It was one person. The young blond man standing across the square was staring at Neil thoughtfully. He had obviously recognized Neil, or was on the verge of it.

It appeared that he had. He was now walking purposefully across the grass and the sidewalks, directly toward Neil.

He stopped a respectful distance in front of the bench and waited until Neil gave him his attention. Neil relaxed, thinking that this would be a more graceful encounter, one he could more easily bear.

“Dr. Fellowes,” the newcomer began, “You are Dr. Neil Fellowes of the Mars One team, correct?”

He paused, respectfully waiting for an answer from Neil as permission to continue. A man with a family walking past jerked his head and slowed to look at them both.

Neil acknowledged with a nod that he was, and the young blond man continued, “Forgive me for interrupting you at this beautiful park, which must bring you such joy, and sadness. I really admire you, and the whole team, and wish they were still here with us, as I am sure you do. May I sit with you? I would love to talk with you for a moment, or for as long as you are willing.”

Neil shifted on the bench, leaned back, and gestured for the man to sit beside him. “I can’t promise a good conversation, but please, sit.”

The young man sat and both turned so that they were looking more towards each other. He proffered his hand to Neil and introduced himself. “I am Kurt Persson. I am a first-generation Mars-son. I am in my second year of university and my heart still loves Mars history; I eat the classes and books up. Both my father and mother were early settlers from Sweden when they were young, and they had the honor of working with you for a time, on wind management. It was those stories that gave me my love of Mars history and enabled me to recognize you.”

Kurt’s voice took on a slight Swedish accent as he began talking about his parents. Neil put two fingers on his temple as he concentrated on sifting through his memories. He smiled, “Ah, was it Harald and Kerstin Persson?”

“Yes!” Kurt said with a huge smile and obvious excitement.

Neil nodded, “Very brilliant engineers, individually, but as a team, phenomenal. They saved time, resources, and lives too, I know.”

Neil reached over and grabbed Kurt’s hand and squeezed it. “That time of life was one of my greatest times, oh, the landing, the setup, that was exhilarating. But the things the teams like those with your parents did were remarkable. Are they still living?”

“Very much so,” Kurt answered, “Though I fear they are on the far side of the planet and I don’t get over to them often enough. They are working on a new, higher dome structure for larger cities, to house more above ground, but to change the planet less destructively in doing so. I think that was something you helped them see.”

They paused. Neil’s perspective was shifting and he was looking through the present to the barren past. The crowd disappeared out of his peripheral vision and the buzz of their conversation melted away. It was him and the beginning. And he just started sharing.

“This crater was absolutely beautiful in its barren starkness when we touched down. The stone and regolith had been practically untouched forever; just a rover half a century before we arrived, and some deliveries of machinery and raw supplies in the few years leading up to our arrival. One over there,” he said, pointing off his right shoulder, “around the southern ridgeline, another on the north, and one just west of our touchdown spot where we are sitting.

“The planet was raw, powerful hostility to life, but it was at the same time raw loneliness, calling for me, for all of us, to make it home.” He paused, and the loneliness was tangible despite the crowd pressing in and moving around their little bubble of the park. The man and his family stood there facing them both, and yet neither Neil nor Kurt noticed. The father leaned in and whispered to his wife and their children and gently beckoned another man and his family to come over.

“When we landed I was looking out the porthole window in just this direction.” He motioned with his arm in a straight line in the direction the bench was facing. “I saw the ridge that we felt was the most likely place to tunnel down for living quarters. This crater, Mars, it was going to be our home. It called us, it called all of us. But it didn’t want us yet. Not yet.”

Neil paused, lost in memories. As Kurt looked slightly away from Neil he noticed the people around them. The large milling crowd had changed. The buzzing, deafening conversation had dropped a degree in their vicinity. People had stopped and were listening. Just a few close by at first, but it was spreading as more and more people whispered, pointed, and then squeezed in close.

“The sixteen of us were itching to get out of the lander immediately,” he began again. “But we forced ourselves to take the proper time to run through the safety checks on our suits, the pressure locks, the radios. Everything went by the book, not just once, not even just twice. We were determined to have no mistakes. That was our mantra and we stuck with it. This was our life, and we could not go back if we did not succeed.”

He looked up to point again, straight ahead to the drill site, now the entrance to the museum, and noticed the dozen people gathered around listening to his story.

“Oh, hello,” Neil mumbled, momentarily distracted.

“Please, Dr. Fellowes,” a dark-skinned man, the father of the family, said. “Please continue, if you don’t mind us listening too.”

Neil nodded, and the man promptly sat on the grass at his feet, gesturing for his family and the others to sit also. They sat and squeezed in, surrounding the bench; cameras and devices came out for pictures and recordings. Neil glanced back to Kurt and tried to pick up where he had left off.

“We ran the safety checks over and over, taking our time for two whole sols. I even had time to connect up the wireless controllers to the equipment,” he motioned towards the south drop he had pointed out earlier, “and programmed them to move and meet us at the drill point there,” he finished, pointing straight ahead over the sitting crowd.

“We finally began debarking on the third sol, two by two, as it happened, like Noah’s animals. Nobody said anything about that at the time; it was just the limit of what we could squeeze into the airlock fully suited. I think it was Rachel who first made the joke about us being Noah’s pets, but that was much later.

“I wasn’t the first one out; it was not Neil on the moon and Neil on Mars, though I jockeyed hard for that distinction — Neil Armstrong and Neil Fellowes, the first men on our first expeditions setting feet down off Earth.” Neil was smiling big, remembering it all. “I was second wave with Robbie. It was Rachel and Anton, then me and Robbie. We were coming out as fast as we could. Nobody was saying anything. We just got out, moved off enough to make room for the others, and stood here,” Neil gestured around them, “in the red silent wasteland, absorbed by the silence. Silence out of respect, in part. But mostly silence out of shock, I think.

“I didn’t even break the silence. I wanted to, so badly I wanted to. I had dreamed of it all my life, of mimicking the steps and words of my hero Neil Armstrong, but on Mars. And here,” Neil stomped his foot, “and here I was.

“But we were still earthlings then. Earthlings in shock at being on Mars. We had all been completely enraptured as we came out of the airlock and saw the beautiful pale red, barren view, and two moons! Two moons– Phobos,” Neil raised his left hand, pointing to the northwest, “was moving fast, coming over that ridgeline. And Deimos just hanging high up over there…” He raised his right hand to the northeast.

“We were just silent, soaking it in. I think I know why: Neil, Neil had the blue marble called Earth that was home right there,” he said, pointing to a blank sky.

“It was his anchor, keeping him focused. This,” he gestured around him, “this red land, was not ready for humans and yet it was our home. We were alone. We were earthlings. On the moon there was a blue and green planet to call home. Earthlings standing on Mars had almost nothing but the suits on our backs.

“So we kept coming out of the lander– eight now, and still no words had been spoken. I knew the moment was here and I thought perhaps I was destined for it after all. I tried to gather my thoughts and focus. I had written a dozen sayings that I thought passable, though I never thought they compared to Armstrong’s ‘One small step’. I had prepared for it– in fact we all had, we admitted later. But no one spoke. No one could make thoughts into words.”

Neil paused, a smile tugging at the corners of his mouth.

“Nine and ten were just coming out, Miranda and Sophie. All of a sudden, Robbie… Robbie just opened his mouth to say precisely the wrong thing, as he had a knack for doing. At this singular event, an auspicious event, the most auspicious event in a century, an event that required a spectacular saying, Robbie just spoke. Words said that cannot be unsaid.”

Neil leaned back, closing his eyes, the slight smile growing larger. “Still, it was pretty funny, later. We were shocked at first. Disbelief. Then angry… He killed our history-making moment, never to be repeated. The death of a moment… It was like we went through many of the stages of grief all in an instant for that moment we had been waiting for. Oh, but we did finish with laughter, laughs for a lifetime at his expense…”

Neil leaned forward again, resting his chin on his hands on the cane. His eyes were sparkling now, not the bright blue of his youth but the wise, aged grey of a lifetime of exploring and seeing new things. The crowd was nearly thirty strong and enraptured, silently experiencing Neil’s story. Neil glanced around the crowd, spied the children from the first family that had stopped, and motioned at them.

“Do you know what Robbie said?” he asked them. “We tried to change the recordings, to get something more magnificent for posterity, but I’m afraid it was too late. Tell me, do you know what he said?” Neil asked the boy, who looked to be about ten.

The boy looked shy, frozen in place. Neil smiled and gestured to the girl next to him, slightly younger, he thought. “Or you?”

The big-eyed, raven-haired girl smiled shyly and nodded that she did. “Go ahead, tell us,” Neil encouraged her.

She blushed, but loudly and proudly said, “Dang! I forgot to pee!”


The night was sultry

Posted: August 26, 2012 in fiction

Neil Fellowes felt pain in his hands and forearms as he pushed the outer Armstrong Gate closed. He had stayed out too long, and even the four layers of insulation, shielding, and life-support paraphernalia of his suit could not protect him for the length of time he loved to stay outside the bunker.

The gate connected, latched, and sealed off the Martian winter with a whimper of a sound. The atmosphere was too thin to carry as much noise as one expected. But the physical jarring was not hampered by the lack of atmosphere, and the impact sent searing jolts of pain up his arms and seemingly directly into his eyes.

Neil paused just long enough to let the pain subside before proceeding down the slope towards the next gate, the garage, and eventually the underground bunker they called “home”. This entrance was the first one carved into their bunker, made wide enough to bring the vehicles inside for safety and maintenance. Neil was the only one on the team who still used it for non-vehicular excursions, because it was more challenging to use. But he couldn’t help it. He loved the name. Armstrong Gate. It was strong, powerful, and so emotive. Especially for Neil, since it was a connection to his namesake, Neil Armstrong, the first man to step onto another world. And here Neil Fellowes was, following in his footsteps on Mars.

He followed the driveway down into the dimly lit tunnel toward the next gate, already warming up with the activity now that he was out of the direct cold. This large chamber was still cold, since it was neither heated nor pressurized, but it was a good deal better than the surface right now. Neil operated the man-sized pressure-chamber door when he reached the end of the drive. The second door and chamber were large enough for some of their vehicles but required more resources. Resources that were very, very precious, since they maintained life on a planet that would take their lives in an instant.

The effort and the impact of the door caused less pain this time. With a hum and a swoosh the atmosphere returned to the chamber, and Neil smiled as always at the return of definite sounds instead of the faint echoes one experienced outside in the Martian atmosphere. The sounds were even more complete as he opened his mask and turned off the breathing machine, closing his eyes and again smiling as another sense returned: the smell of Mars.

When Neil was outside he never knew he was missing these senses. There was too much glory in being on Mars, too much data to collect with his eyes, to notice that his nose and ears had been taken from him. There was always that transition period of suiting up and waiting to step out. But there was so much energetic anticipation of stepping out onto the surface again that Neil didn’t mind the suit’s short-lived muteness and lack of olfactory sensations.

The smell of Mars. Here in the garage, behind the two doors and the airlock of the Armstrong Gate, it was the strongest. The garage and the whole bunker were dug underground to protect them from the solar radiation. Earth had a massive atmosphere to filter and protect, but Mars’ was thin and exposed, so living underground was the best alternative. In the garage the walls and floor were pure bedrock and Martian soil, and the smell was uniquely Martian.

Neil walked past their exploratory vehicles and various other machinery to the doors that led into their living and common quarters. These were also carved out of rock, and so the smell of Mars was there with them in this space where they lived, and worked, and played, and ate; it soon became just a smell of people. Not quite Mars, but Martians. Something a little different.

Neil entered through the strong doors and into the definitely warmer living quarters. He paused in the “mud room”, removing the layers of the suit, hanging them up, and connecting them to recharge, shedding thirty pounds in a few minutes. Still chilled, he left his balaclava pulled up over his ears and head and walked down the hall into the large common area, passing the humidifier that kept everyone’s skin from cracking and bleeding. That was another thing he loved about staying outside so long: Neil found he really appreciated the comfort of their living space– the heat, the air, the moisture control. Little things that could be taken for granted until one was deprived of them.

A sudden, very distinctive, earthly smell hit him like a wall as he stepped into the common room. He saw most of the group gathered in the corner watching the video screen, laughing, talking, and eating. It was warm, it was inviting.

“The night was wet and hot, hot and wet, wet and hot; that’s humid. The night was humid.”

Neil stopped at the back of the room, smelling the air. This was something else besides the reintroduction of his deprived senses. What was that smell? It was an earthly smell. It had been so long.

“Hey, Neil, you’re late for movie night!” Anatoly shouted to him, waving him forward.

The others turned around, smiling and beckoning him forward. “We almost don’t have any popcorn left!”

Popcorn. Neil smiled as he removed his balaclava and started forward to the group. That was the earthly smell. Popcorn.

“The night was sultry,” said Mrs. Lift.

This is my first attempt to roast a bean from Burundi, and the latest in several recent exploratory coffees from Africa. This is my two hundred and thirty-second roast in my RK drum and Fiesta grill.

My grill is set up with two burners, marked “Left B” and “Right B” in the graph. I keep the drum over the right burner for controllable direct heat and use the left burner to boost the chamber temp. The burner measurement is a percentage value, using 1,000 as full heat. The roasting chamber temps are in Fahrenheit.

The goal is to use the left burner to slowly raise the temp for the drying phase and then raise the right burner (the direct heat) for the actual roasting. Once the beans are into first snap I like to bring all burners to the lowest setting to draw it out a bit and prevent the roast from rushing into second snap.

I took 1,080 samples with my cheap temp probe, one per second, for a roast of 18 minutes.

The black markers at the bottom are markers for the following notes:

  • sample 852, 14:20, first snap is heard
  • sample 876, 14:45, snaps are in groups
  • immediately into slow roll
  • sample 895, 14:55, rolling strong/fast
  • sample 928, 15:30, smokey, still rolling
  • sample 967, 16:07, mostly done with first snap
  • sample 1000, 16:30, open lid for 05
  • sample 1067, 17:40, a second snap or two is heard
  • sample 1072, flame out
  • sample 1080, into the roast cooler

I can’t wait to see how it tastes!

Safari Preview Swamps SSO Server

Posted: May 17, 2012 in tech

Sometimes when I logged in to my SSO console I would see a lot of sessions for a single user. I can understand three, four, even six, between restarting the browser and using multiple browsers. I do that a lot, since I am responsible for the SSO system and have to check out the multiple ways of authenticating. But the excessive sessions for a few users bothered me. I mean excessive like 20 or 40. And for ME!

It’s not a big deal — the SSO server (CAMS by CafeSoft) could handle it. But I needed to know what was going on. What if this was something that needed to be fixed? A hole that needed to be plugged? Besides that, I try to save money where I can, and sometimes we hit our licensing limit for development and integration/testing; I don’t want to have to buy more seats if we do not need them.

By process of elimination I was able to narrow it down to either one particular browser or the Kerberos authentication it was configured to do. I had been watching the logfiles to see the requests that were made and redirected to the authentication links, and the unique session id assigned to each… and then nothing. That session id was never used. There was no success message logged with that session ticket.

Now it was time to get sneaky and figure out what the browser was doing. I already suspected it was related to the browser pre-fetching things to try to be faster. So I killed my browser and had another admin remove all my sessions from the system. With Wireshark running and capturing everything for my server subnet, I launched Safari again and followed my Usual Methods.

  1. Launch browser, type over the address bar with the address of my dev server
  2. Admin in the other room constantly refreshes and reports no new sessions as I am typing the address. That shoots down my initial thought of pre-fetching.
  3. I hit enter and observe the usual redirects and invisible kerberos login
  4. My assistant in the other room reports I now have one session. Everything is perfectly normal
  5. I begin typing in the address bar for a slightly different space on the same server but a different Apache virtual host and eventually hit enter. My assistant reports still only one.
  6. Launch a new tab– Eureka! My assistant reports I now have sixteen sessions.

So… Safari is configured to display the “Top Sites” thumbnails in a new tab. And those “Top Sites” are built not from cache but with new requests. Whaaaa????
Safari Tabs setting

It gets worse.

Remember that I had Wireshark capturing in the background while I was doing this? I examined the packets and was able to determine what was going on. I could see Safari making the request and the server responding with an HTTP 302 code to redirect to the kerberos login page.

GET /CamsConsole/sessions/Sessions.do HTTP/1.1
Host: *****.********.nasa.gov
X-Purpose: preview
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) AppleWebKit/534.55.3 (KHTML, like Gecko) Version/5.1.5 Safari/534.55.3
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us
Accept-Encoding: gzip, deflate
Connection: keep-alive

HTTP/1.1 302 Moved Temporarily
Server: Apache-Coyote/1.1
Location: http://*****.*****.nasa.gov/kerberos/login?cams_security_domain=*******&cams_original_url=https%3A%2F%2F*****.******.nasa.gov%2FCamsConsole%2Fsessions%2FSessions.do&cams_login_config=kerberos
Transfer-Encoding: chunked
Date: Thu, 17 May 2012 16:04:36 GMT
Set-Cookie: BigIPDensitySecure=604176650.37663.0000; path=/
Vary: Accept-Encoding, User-Agent

There are several interesting things to note in this request. First, there is a special HTTP header being used by Safari: X-Purpose: preview. Second, there is a very notable lack of other HTTP headers. In fact, you could say it is just the basics: compression, the user agent, and keep-alive. That last one, by the way, is important for figuring out what is going on.

There is one important clue in the reply from the webserver, and it is not really obvious until you look at the later packets.

When you filter a conversation in Wireshark it makes a new display filter for that stream to show you everything in that conversation. With most HTTP servers that stream can stay open for 100 requests, five minutes, or something in between; it’s all negotiated between client and server. This is part of the function of the “Connection: keep-alive” header: it is the client telling the web server, “Hey, I support keeping this connection open if you do”. So it stays open for further requests.
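If you want to follow along in your own capture, that conversation filter looks something like this (the stream number is just whatever your capture assigned to the conversation):

tcp.stream eq 0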

Had I thought I was done looking at everything my browser was doing, I would have missed some other details. Because this conversation keeps repeating. Several times. Exactly the same.

So I clear the Wireshark filter to restore all the conversations and see them as they happen in time. The very next stream is the request to get logged in via Kerberos. I’ll try to keep this long post short and summarize — the browser says give me “this”, and the server says “401, please log in with ‘Negotiate’, and here’s your BigIP cookie”, and the browser says “hey, I have a ticket, here you go”, and the server says “OK, here’s your SSO cookie and here’s your BigIP cookie”.

And then I see that again. And again. Several times, over multiple streams, just like the original stream. And each time, the Cams authentication service creates a new session for me. And again. And again. For as long as it keeps doing it.

And that is when it hits me. The Safari X-Purpose Preview function does not utilize cookies. It seemingly does not accept, keep, track, or submit any cookies. It never sends the BigIP or SSO cookie.

It’s just going to fill up my logs with useless new sessions and deplete my available licenses.

I’ve been scripting a lot of fancy things into our F5 BigIP LTM-1500 lately and this seems like another perfect way to solve this little problem. A simple iRule applied to the virtual server instance can intercept the HTTP Request before it even goes to the web server.

This iRule goes up near the top of the other iRule items (before any ProxyPass iRules if you use those) and acts on HTTP Requests only. When it detects that Safari header it responds with a 200 and a short message and never ever goes to the web server, Cams, or passes Go.
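The full script is in the download below, but the core of the idea fits in a few lines. Here is a minimal sketch (the response text here is a placeholder, not my actual haiku):

when HTTP_REQUEST {
   # Safari preview/Top Sites requests announce themselves with this header
   if { [HTTP::header "X-Purpose"] eq "preview" } {
      # Answer directly from the BigIP; the request never reaches Apache or Cams
      HTTP::respond 200 content "No preview for you." "Content-Type" "text/plain"
   }
}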

Haiku Preview Message

Yes… I did wax a little more poetic than absolutely necessary, but that white box in the picture above is my short and quick way to stop Safari Preview from depleting my development Cams licenses.
iRule script download

There’s a fun little geeky comic online that you may have heard of: XKCD.

A while back the author had a comic that resonated with me about password security. I’m not buff enough in my math skills to keep up with the equation, but I could follow the principle. The idea is that four (or so) random words are more secure than an extremely complex password with numbers and special characters embedded and letters replaced. The challenge with that somewhat standard practice of l33tspeak is that the result has to be written down. But… since we are people that love stories, four random words in real English will be more memorable, because we can make up a story to remember them.

Here’s the famous comic.
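For the curious, the principle boils down to simple arithmetic. These are the comic’s own numbers, which assume picking from a list of 2048 common words:

4 words x log2(2048) = 4 x 11 = 44 bits of entropy
one l33tspeak'd dictionary word like "Tr0ub4dor&3" = roughly 28 bits

Those sixteen extra bits mean an attacker needs about 65,000 times as many guesses.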

Recently, while I was working on one of my other Perl scripts, I was on an online forum and saw a post about how to make a random sentence. That tickled my fancy, and I came up with a quick and dirty little CGI script to generate a random sentence suitable for passwords. Unfortunately a lot of places still have requirements to include special characters and numbers, but this little script will meet those requirements: the spaces satisfy the special-character rules (most of the time) and a number between 1 and 99 is included.

The results? They are often amusing and poetic. Sometimes they are risqué. It just depends on what is in your system’s local dictionary. Download: Perl Password Poetry Producer

#!/tools/perl/current/bin/perl
#

print "Content-type: text/html\n\nPassword Poetry Generator\n";

# read the system word list
open (INWORDS,"< /usr/dict/words") or die "cannot open word list: $!";
@w=<INWORDS>;
close INWORDS;
chomp@w;

my $poem;
my $randiddly=int(rand(99));

# odd number: word, number, word; even number: two words, then the number
if ($randiddly%2==1){ $poem= join" ",(map{$w[rand@w]}1),$randiddly,(map{$w[rand@w]}1); }
else { $poem= join" ",(map{$w[rand@w]}1..2),$randiddly; }

# make sure at least one word is capitalized, for mixed-case rules
if ("$poem" !~/[A-Z]/){ $poem= join" ",$poem,( map{ ucfirst ($w[rand@w])}1);}
else { $poem= join" ",$poem, (map{$w[rand@w]}1) ; }

print "$poem\n";
#some html code has been stripped for wordpress

Some sample passwords:




Expirations happen.

But when those SSL certificates expire before being replaced, well, that’s bad. That’s egg on your face. This little Perl script is to put the egg back in the burrito.

All you have to do is make a directory tree where you save your public certificates (you don’t need the private keys). Name them with a .cert extension if you use my code exactly, or tweak the extension in the code to match, and set up this little Perl script as a weekly cronjob to send you an email warning before they go bust!

You may need to add a few modules to your Perl repository. The modules I am using are Date::Calc, Crypt::OpenSSL::X509, Term::ANSIColor, and MIME::Lite. The Crypt OpenSSL module was a major pain in the butt to compile on Solaris; I should do a blog post about that.

Oh, and the MIME::Lite module seems to require root or trusted-user privilege to run. At least on my Solaris boxes. It works great on Mac OS X, but I’m probably a trusted user on that system; I will be testing Linux before long. So, tweak the locations of the script in my examples below to meet your needs.

Set up the directory:
mkdir /home/billSpreston/mycerts

Copy the certs from your various servers, naming them with the .cert extension:

ls mycerts
server1.cert server2.cert server3.cert

Touch a file for the Perl script and make it executable

touch ~/certwatch.pl
chmod +x ~/certwatch.pl

Now edit the file with your favorite editor (vim, or Smultron rocks!) and add the code from the certwatch.pl PDF (code with HTML tags is very hard to add to a wordpress.com blog).
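In case that PDF ever goes missing, here is a minimal sketch of the idea; it is an illustration, not the full script, and the directory, threshold, and mail addresses are placeholders you will want to change.

#!/usr/bin/perl
# Sketch: read each .cert, compute days until notAfter, mail a warning list.
use strict;
use warnings;
use Crypt::OpenSSL::X509;
use MIME::Lite;
use Date::Calc qw(Decode_Month Delta_Days Today);

my $certdir  = "$ENV{HOME}/mycerts";   # where the .cert files live
my $warndays = 30;                     # warn when fewer days remain
my $mailto   = 'you@example.com';      # placeholder recipient
my @warnings;

opendir(my $dh, $certdir) or die "cannot open $certdir: $!";
for my $file (sort grep { /\.cert$/ } readdir $dh) {
    my $x509    = Crypt::OpenSSL::X509->new_from_file("$certdir/$file");
    my $expires = $x509->notAfter();   # e.g. "May 17 16:04:36 2013 GMT"
    my ($mon, $day, undef, $year) = split ' ', $expires;
    my $left = Delta_Days(Today(), $year, Decode_Month($mon), $day);
    push @warnings, sprintf("%-30s %4d days left (%s)", $file, $left, $expires)
        if $left < $warndays;
}
closedir $dh;

if (@warnings) {
    MIME::Lite->new(
        From    => 'certwatch@example.com',   # placeholder sender
        To      => $mailto,
        Subject => 'SSL certificates nearing expiration',
        Data    => join("\n", @warnings) . "\n",
    )->send;   # may need trusted-user privilege, as noted above
}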

Be sure to run it a few times to make sure it works the way you want. Debug or verbose mode is useful in this phase, as is playing with the expiration time. You could also create a certificate with openssl that expires next week to test against, or find an old expired cert. And when you are satisfied, create a cronjob to run it weekly on your schedule and get pretty HTML reports in your mailbox. Don’t forget to turn off debug or verbose mode, unless you just like noise.
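For example, a Monday-morning entry in the crontab might look like this (adjust the path and schedule to taste):

# weekly SSL certificate expiration report
0 8 * * 1 /home/billSpreston/certwatch.pl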

There’s a secret to being a good sysadmin: you have to be just a little lazy. Just lazy enough that you can see a better way of doing boring, repetitive, tedious tasks and write a script to do them, letting you get back to more important work. This usually involves making a tool do something for you. And for a Unix admin, that means writing a quick little script.

A good Unix sysadmin isn’t content with just one medium for his scripts. He should be using shell and Perl so that he has both round and square pegs for all the different shaped holes that need to be plugged by a good script.

I was recently trying to import data about our backups into my TeamQuest reporting tool so that I could graph the usage and reliably plot trends. The backup administrator found a great command for pulling stats out of the NetBackup database. The NetBackup command is found in the install directory’s bin/admincmd subdirectory. It takes a variety of options, so be sure to read the man page. I found two basic commands I needed to give it to get the data I needed: one for gathering live data, and one for accessing historical data that I wanted to import for a really clear picture of things.

Going back to the statement about sysadmins having a touch of laziness– I ask myself, why manually pull data when you can automate data collection?

After experimenting with TeamQuest and weekly and daily stats, I finally determined that I really need to gather data hourly in order for some of the automated graph methods to be able to do their job. If the truth were known (and it’s about to be), I’d really prefer just to grab the data once a week so that I can look at an entire backup spectrum of full backups and all the incrementals. But it is a minor oversight by TeamQuest that the new-ish ITSAR tools (IT Service Analyzer and Reporter) cannot take a macroscopic view, like a single point of data for a week graphed over a six-month period. A minor oversight; I forgive them, and I’m sure it will be corrected sooner rather than later.

So here is my “live” data command to get an hourly summary out of NetBackup that is instantly imported into TeamQuest. This runs at the top of every hour as a summary of the previous hour.
/usr/openv/netbackup/bin/admincmd/bpimagelist -U -hoursago 1
Output:
01/12/2012 18:35 02/02/2012 41904 3763745 N Differential Int_unix
01/12/2012 18:35 02/02/2012 42070 4150810 N Differential Int_unix
...snip...

Of course it can’t be imported into my TeamQuest database straight like that! The command prints out a bunch of data based on the number of different jobs that ran, while TeamQuest really needs it summed up cleanly. So I wrote a Perl script that runs the NetBackup command and sums it up, formatting it nicely for TeamQuest as total kilobytes and the number of files backed up. The header fields TeamQuest requires are a time field (in quotes), an interval in seconds, and the server name. I’ve specified in my table definitions that I’m also providing another field for the week of the year, so that I can combine data for an entire week, and then the total number of files and kilobytes backed up.
An interesting note about the week-of-the-year field: I have a bit in my Perl code that determines which week of the year an hour should be counted as. Most date modules will default to the week beginning on Sunday per the Gregorian standard, but for my backup standards the week really begins Friday at 6pm, when the full backups kick off. Every backup after that should be an incremental or part of that backup set extended from Friday night.

A sample run from my script
# ./stats.pl -t -hourly
"1/25/2012 18:00:00" 3600 backupservername "3/2012"
185118 17720160

Sweet! If you see a message “no entity was found”, don’t worry about it. It’s just a message from the NetBackup database command (printed to STDERR) saying there wasn’t a job that particular hour. Zeroes will be imported for that hour’s data.

So now my backup server runs an hourly job that imports this data into the TeamQuest test database. We are looking good going forward. But that’s only half the battle! I still need to get historical data into TQ so that I can make a proper analysis.

I expand my Perl script so that I can pass in historical start and stop times at the command line.
# ./stats.pl -t -hourly -a "01/25/2012 17:00:00"
"1/25/2012 18:00:00" 3600 blade193 "3/2012"
394697 192310787


This is going great! I can run this a bunch of times, once for each hour of historical data that I need, and append the output to a single text file. When it is done I import that single file into the TQ database and then make some pretty graphs.

So… let’s see. I’d like to go back about six months, so that’s about 180 days give or take, by 24 hours; ohhh, that’s running my command 4,320 times. Yeah… about that. I can hear Al say, “I don’t think so, Tim”.

But I really don’t want to extend my Perl script any more, because it is running hourly already and going so smoothly. If I keep hacking at it with my lowly coding skills I may break it or corrupt the data I am collecting now. This is pretty much going to be a one-off, straightforward, linear loop to run the command 4,320 times. Six-off at best, if I am willing to make a run per month with a few minor changes in between runs. This sounds like a shell script. Sure, I could do it in Perl. But for super simple loops that aren’t parsing data I prefer a shell script. It’s a square peg and this is a square hole.

Here’s my double loop shell script that runs my Perl script once per hour for each day of a month–
#!/bin/ksh

let dd=1
let lastday=31
let mm=07
let yy=2011

while [ $dd -le $lastday ]
do
let hh=0
echo " Running stats for day $dd" 1>&2
while [ $hh -lt 24 ]
do
echo " Running stats for hour $hh" 1>&2
./stats.pl -t -hourly -a "$mm/$dd/$yy $hh:00:00"
hh=`expr $hh + 1`
done
dd=`expr $dd + 1`
echo "Incremented day to $dd" 1>&2
done

Pretty simple, really. Oh, and I am sending the status lines from the shell script to STDERR so that STDOUT can be directed safely and cleanly into a file ready to import into TeamQuest, yet the sysadmin can easily observe how the script is progressing.

# ./makegoodhourly >import.august
Running stats for day 1
Running stats for hour 0
no entity was found
Running stats for hour 1
Running stats for hour 2
Running stats for hour 3
no entity was found
Running stats for hour 4
no entity was found
Running stats for hour 5
no entity was found
Running stats for hour 6
no entity was found
Running stats for hour 7

Make a few tweaks to the shell script to change the month number and the total number of days in the month, and run it again. Easy. I ran it once per month for September through January, imported my data, and I was done.

Here’s the Perl script. It will default to daily stats if neither hourly nor weekly is specified. Why? Just because that was a middle step before I realized I needed to go hourly, and I didn’t want to completely remove weekly or daily statistics for future possibilities.

I’m sure there are some better ways to accomplish the things I do in my scripts– I’d like to hear them in the comments below. I’m always eager to improve my skills.


#!/tools/perl/5_8_7/bin/perl

# 1.13.12 - ver 0 - K.Creason -
#
# To get weekly stats out of the NetBackup database

#
# First we define some things that are tunable
# The statcmd is the netbackup command that generates the output summary of
# all backup jobs based fields passed to it.
# We are going to use 168 hours ago for seven days to get a full weeks summary

my $statcmd="/usr/openv/netbackup/bin/admincmd/bpimagelist -U ";

# No more tunables, so these are some defaults that we will define for later

my ($DEBUG,$VERBOSE,$filesummary,$datasummary,$files,$data,@data,$tqout,
$date,$dd,$mm,$yy,$weekly,$begindate,$weekno,$datespec,$hourly,$hh);

use Date::Calc qw(:all);

# process the command line arguments
if ("$ARGV[0]" eq "-h") {die "\n\nUsage: $0 [-d for debug] [-v for verbose stats] [-t for Teamquest format] [-hourly or -w for weekly summary] [-a MM/DD/YYYY for alternate start date, if hourly should include HH:MM:ss within quotes]\n\n";}
if ("$ARGV[0]" eq "-d")
{ shift @ARGV; $DEBUG++; print STDERR "Debug on.\n"; }
if ("$ARGV[0]" eq "-v")
{ shift @ARGV; $VERBOSE++; print "Verbose on.\n";}
if ("$ARGV[0]" eq "-t") { $tqout=1; shift @ARGV; if ($DEBUG>0){print STDERR "TeamQuest report on.\n";}}
if ("$ARGV[0]" eq "-hourly") { $hourly=1; shift @ARGV; if ($DEBUG>0){print STDERR "Hourly report on.\n";}}
if ("$ARGV[0]" eq "-w") {$datespec++; $weekly=1; shift @ARGV; if ($DEBUG>0){print STDERR "Weekly report on.\n";}}
if ("$ARGV[0]" eq "-a")
{
shift @ARGV;
$datespec++;
$date=$ARGV[0]; if ($DEBUG>0){print STDERR "Alternate date is \"$date\".\n";}
shift @ARGV;
}

if ("$date" eq "")
{
( $yy, $mm, $dd ) = Today(); $date="$mm/$dd/$yy";
if ($DEBUG>0){print STDERR "The end date is TODAY, $date.\n";}
}

if ($weekly lt 1)
{
$begindate=$date; if ($DEBUG>0){print STDERR "Begin date is end date, $begindate.\n";}
# need to add hourly check and if turned on calculate an end date of plus one hour
if (($hourly gt 0)&&("$date" =~/\:/))
{
if ($DEBUG>0){ print STDERR "Calculating an end date of plus one hour from $begindate.\n";}
my ($cal,$time,$hh,$min,$sec);
($cal,$time)= split (/ /,$date);
($yy,$mm,$dd) = Decode_Date_US($cal);
($hh,$min,$sec) = split(/:/,$time);
if ($DEBUG>0){ print STDERR "Splitting end date to $yy, $mm, $dd, $hh, $min, $sec.\n";}

# Before we add an hour, check to make sure the start hour is two digits
if ( (length $hh) lt 2)
{ $hh="0$hh"; $begindate="$mm/$dd/$yy $hh:00:00"; }
($yy,$mm,$dd,$hh,$min,$sec) = Add_Delta_DHMS($yy,$mm,$dd,$hh,$min,$sec,0,+1,0,0);
if ( (length $hh) lt 2){ $hh="0$hh";}
$date="$mm/$dd/$yy $hh:00:00";
if ($DEBUG>0){ print STDERR "Calculated the end date of plus one hour to $date.\n";}
}
}
else
{
# weekly, so have to calculate a begin date
($mm, $dd, $yy) = split (/\//,$date);
if ($DEBUG>0){ print STDERR "Date ($date) is split year $yy, day $dd, month $mm.\n"; }
( $yy, $mm, $dd ) = Add_Delta_Days($yy,$mm,$dd , -7 ); $begindate="$mm/$dd/$yy";
if ($DEBUG>0){ print STDERR "Begin Date is calculated to $begindate.\n"; }
}

# Now we need to calculate which week of the year the backup stats belong to
# paying careful attention to use the weeknumber for Friday. So if the day of week
# is monday-thurs we take week the weeknumber of the previous Friday
# which is tricky if it happens to split a new year... Oy vey.
($mm, $dd, $yy) = split (/\//,$begindate);
my $dow = Day_of_Week($yy,$mm,$dd); if ($DEBUG>0){print STDERR "Day of Week is $dow.\n";}
if ($dow gt 4)
{ ($weekno,$yy)=Week_of_Year($yy,$mm,$dd);if($DEBUG>0){print STDERR "Week of year calculated for a Fri/Sat/Sun to be $weekno/$yy.\n";}}
else
{
# This is the more complicated route. First calculate what last Friday was and then the weekno of that day.
# Think we can just substract seven for last week
my ($lyy,$lmm,$ldd);
($lyy,$lmm,$ldd)= Add_Delta_Days($yy,$mm,$dd,-7); if ($DEBUG>0){print STDERR "Date of a week ago is $lmm/$ldd/$lyy.\n"; }
{ ($weekno,$yy)=Week_of_Year($lyy,$lmm,$ldd);if($DEBUG>0){print STDERR "Week of year calculated for M-Th to be $weekno/$yy.\n";}}
}

# sample data
# 01/12/2012 18:35 02/02/2012 41904 3763745 N Differential Int_unix
# 01/12/2012 18:35 02/02/2012 42070 4150810 N Differential Int_unix

if ($datespec gt 0)
{
$statcmd="$statcmd -d $begindate -e $date";
(@data) = map {(split)[0,3,4]} grep /^[0-9]/, `$statcmd`;
if ($DEBUG>0){print STDERR "Date specified command executed \"$statcmd\".\n";}
}
elsif ($hourly gt 0)
{
$statcmd="$statcmd -hoursago 1";
(@data) = map {(split)[0,3,4]} grep /^[0-9]/, `$statcmd`;
if ($DEBUG>0){print STDERR "Hourly command executed \"$statcmd\".\n";}
}
else
{
$statcmd="$statcmd -hoursago 24";
(@data) = map {(split)[0,3,4]} grep /^[0-9]/, `$statcmd`;
if($DEBUG>0){print STDERR "Daily/24 hour command executed \"$statcmd\".\n";}
}

my $a=0;
foreach (@data)
{
if ($a==0){$begindate=$_;$a++; if ($DEBUG>0){print STDERR "\tDate: $begindate. ";}}
elsif ($a==1){$files=$files+$_;$a++; if ($DEBUG>0){print STDERR " files now $files.";}}
elsif ($a==2){$data=$data+$_;$a=0; if ($DEBUG>0){print STDERR " data now $data.\n";}}
}

if ($tqout lt 1){ print "Files backed up: $files\nData backed up $data\n";}
else {
# Check for ENV Localhost
if ("$ENV{LOCALHOST}" eq ""){ chomp($ENV{LOCALHOST}=`hostname`);}

# if we are doing a weekly report for TQ it's a different time, at least for early testing
# and format, with the interval
if ($weekly gt 0)
{$date="\"$date 12:00:00\" $ENV{LOCALHOST} \"$weekno/$yy\"";}
elsif($hourly gt 0)
{
if ("$date" =~ /\:/ )
{
#then we have a time already, use it
$date="\"$date\" 3600 $ENV{LOCALHOST} \"$weekno/$yy\"";
}
else { ($hh)=Now(); $date="\"$date $hh:00:00\" 3600 $ENV{LOCALHOST} \"$weekno/$yy\""; } # Now() returns (hour,min,sec); only the hour is needed
}
else {$date="\"$date 12:00:00\" 86400 $ENV{LOCALHOST} \"$weekno/$yy\"";}
if ($DEBUG>0){print STDERR "DEBUG: $date\n$files $data\n\n"; }
print "$date\n$files $data\n\n";
}

I run it via two cronjobs on the backup server. One gives us a weekly summary via email, and the other is the hourly TeamQuest data import.

# test teamquest weekly stats gathering on Friday mornings
30 10 * * 5 /usr/openv/netbackup/bin/admincmd/stats.pl -w |mailx -s "NetBackup weekly summary" staff
#
0 * * * * /opt/teamquest/manager/bin/tqtblprb -d testuser -n NetBackupHourly >/dev/null 2>&1

And what does my data look like?
Backup data six months

Wow… there’s been a bunch more to back up lately.

My beautiful wife got me the Kindle Fire for Christmas. I’m having a lot of fun with it and really enjoying it. So much so that I haven’t read a book on my Kindle Keyboard in days.
The Fire is great for PDFs with a lot of detail, like the Pontiac repair manual for my Fiero, the installation guide for the stereo, etc. The Fire can zoom in and scroll around very gracefully. It’s perfect for that. And probably comic books, if you are into that sort of thing. I found a few digital versions of Tintin that I enjoyed as a child and like them on the Fire just as well. I imagine I will continue to find new uses in the days to come.

But I did find a weakness. There are no Google apps!
No Google Voice, no Google Plus, very little Google anything on the Fire or Amazon Android Marketplace.
Understandable I suppose. Google can’t very well send their apps to a competing app store…

The browser-based versions of the apps work pretty well in the Amazon Silk browser, but I want my push google-foo. I need them… And they can work, if you are willing to follow some steps.

I found several articles that got me close, but nothing quite on how to do exactly what I wanted. I really didn’t want to break the Amazon media, cloud, or market places. So, easy does it, and the result is that these all appear to still be working for me. Some might think it’s a sacrifice to give up the Google Android Marketplace, but I felt it would be more of a sacrifice to lose the Amazon connectivity.

I think I was able to achieve an equitable balance between both worlds: the Amazon apps and marketplace, but with my Google apps running. I suspect that I may be able to download APKs from the Google market and install them manually. As a bonus, I think I may have also found a nicer interface than the Carousel. There is more to be discovered!

Here are the steps:

1. Install “File Expert” from the Amazon Marketplace. It’s free and very handy.

This will allow you to explore the files on the Fire and ultimately install downloaded APK files.

2. In your Fire preferences go to Device, and then turn on “Allow Installation of Applications from unknown sources”.

This is necessary or the Fire will not install the apps from outside the Amazon market.

Now we will root the Fire — the apps won’t run right if the Fire is not rooted. I tried, and Voice ran but nothing else did.

So we will (but not yet) be following the steps at RedmondPie for rooting the 6.2.1 update.

First, your PC needs to be prepared to assist in the rooting process. I picked up these steps from another article that almost completely does not apply to this version of the Fire.

3. Install the latest Java Development Kit from Oracle on your PC.

4. Download and install the latest Android Development Kit on your PC. The download part is easy. However the install part does require checking to make sure you install the USB options. I recommend using this article but only steps 6-8 to get it installed and the USB driver setup correctly.

5. I had to update the device driver for the Kindle in Windows. Plug in the Kindle if it isn’t already, and bring up Windows Computer Management (right-click on “My Computer” and select Manage), expand Devices, and locate the Kindle. Mine had a yellow exclamation mark because the driver was generic. Right-click and select Update driver, choose the option to specify a driver (not online), and browse to c:\Program Files\Android\Android-sdk.

6. Now we will get Root! This article at RedmondPie works perfectly for rooting the 6.2.1 update. Well, almost. They glossed over some steps that beginners need.

Read the article and download the two files specified (the BurritoRoot APK and the Rootzwiki zip). Unzip the Rootzwiki file so that you can specify the path to its files from the command line. They can go into your downloads folder or into a shorter path like c:\temp.

Copy the BurritoRoot APK file to your Fire. When the Kindle is connected it should have a drive letter. I copy all of my downloaded APKs to the “downloads” folder on the Fire, so do this now.

Disconnect the Fire, then install and execute the app on your Kindle using File Expert. However, ignore the instructions the app pops up on your Fire. Follow the directions at RedmondPie closely, as they expand on what the BurritoRoot app will tell you when you execute it. The “push” and “install” commands will need the path to “su” and “Superuser.apk”.
It will look like this:

C:\Program Files\Android\android-sdk\platform-tools>adb install \Users\kevin\Downloads\Android\superuser-2\Superuser.apk
1143 KB/s (785801 bytes in 0.671s)
        pkg: /data/local/tmp/Superuser.apk
Success

7. Now that your Fire has rebooted you should be running as root. Check that your Amazon apps still work before proceeding. The most difficult part is over. Take a breath, relax, make a mocha. 😀

8. I found the following article about getting the Google Framework app pack installed, and while it mostly did not apply, since their goal was to use the Google Android Marketplace, there are some important steps that I used, and a download. My understanding is that following their steps completely will break the Amazon market and apps, so only use the first step of the article and the link to gapps.

So, download that RAR file and uncompress it with 7-zip. Inside the resulting folders are APKs for the Google apps. There are two for the Google Framework (Framework and gsf2) that you will need, plus whatever else you want. I installed Mail (gm2), Voice, Talk, Plus, Maps, and Street– but not the Vendor marketplace. I did install the gau Launcher APK, and I think I like it better than the Carousel.

Did you try it? Did this help? Did you have different results? Post your results or comments here.

Posted: January 3, 2012 in android, tech

Boot and Nuke for SPARC

Posted: November 29, 2011 in solaris, tech

I’m preparing to excess old hardware at my day job– it’s a very satisfying turn of events. It means a job well done: we’ve replaced old hardware with newer, faster, shinier stuff and we can say buh-bye to the old slow crap!

Some people will just pull the hard drives and run them through the degausser, which turns them into useless lumps of metal and poisonous stuff that we don’t want going to the landfill. I prefer to run a program on the drives to completely remove their identity to DOD standards. After all, this is the government, and the old hardware will go to auction, where you can buy a pallet of hardware for $20. I’d like the buyer to receive the hardware with working drives, not poisonous metal, and save the degausser for hard drives that die (but still have data on the platters).

There’s a free product called Darik’s Boot and Nuke that works fantastically on Wintel-type machines. I can send a Windows or junior admin over with a CD, DVD, or USB drive to boot and nuke specific machines. But there was nothing for Sun (aka Oracle) SPARC last time I looked. I admit, I last googled for a SPARC boot and nuke about six years ago– I haven’t looked this time because I have a script for that (I should trademark that phrase!).

I save this script to my jumpstart server and configure an install profile to run it as a pre-install script. The script uses the built-in Solaris “format” command’s ability to run a series of commands, and its “purge” function, to completely erase each hard disk to DOD specs.

The steps are —

Add the MAC address of the SPARC system to your jumpstart server’s /etc/ethers file with the host name “wipeme”.

Add an unused IP address to the jumpstart server’s /etc/hosts file with the host name “wipeme”.
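Those two entries might look something like this (the MAC and IP here are made up; use the SPARC box’s real MAC and a free address on your jumpstart subnet):

# /etc/ethers
0:14:4f:aa:bb:cc   wipeme

# /etc/hosts
172.10.11.50       wipeme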

Create a directory tree on the jumpstart server that is /jumpstart/install/wipe. In the wipe directory you need a very generic “profile” file and a “sysidcfg” file, as required by jumpstart, or it won’t build the rules checksum.

sysidcfg:

auto_reg=disable
name_service=None
nfs4_domain=dynamic
system_locale=en_US
timezone=US/Central
network_interface=primary {netmask=255.255.255.0 protocol_ipv6=no default_route=172.10.11.1}
terminal=vt100
security_policy=NONE
root_password=rubberduckY
timeserver=localhost

profile:

install_type initial_install
system_type standalone
cluster SUNWCuser

Create the script in the wipe directory, or a subdirectory such as pre-install.

The wipedrives.sh script is very simple, and that is just perfect! It finds all device entries for hard disks and, by simple elimination, excludes the ROM drives. The only thing I am not sure about is the next generation of SPARC systems with SAS drives and their wonky c-numbering. But, hey, I’ve got five years to cross that bridge.


#!/bin/sh
#
# write the format command sequence once, then run it against every disk
echo "analyze
purge
quit
backup
quit
quit
" >>/tmp/fcmd

CMD="format -f /tmp/fcmd"

for i in `ls /dev/rdsk |cut -f1 -d"s" |sort |uniq`
do
echo "Executing command: \"$CMD -d $i\" \n"
$CMD -d $i
done

Edit your /jumpstart/rules file and add an entry like so: "hostname wipeme install/wipe/pre-script/wipedrives.sh  install/wipe/profile - "

Now run your "check" routine to build the rules.ok checksum and you are off to the races with "boot net - install" just like normal. The only difference is that it will purge the drives and not install anything, leaving a tabula rasa for the new owner.

Unix systems have some great command-line tools: find, grep, cut, split, tr, sed, awk — all amazing tools.
But sometimes I still can’t quickly see what I need to see in a fast-scrolling window. The text is all the same font and color, and the background never changes. When scrolling through with ‘less’ and using the search option, the word can be bolded or otherwise marked– but ‘less’ is not always useful when you are looking at the output of ps, tailing a logfile, top, prstat, snoop, tshark…
Well, I finally found something. “Cobbled together something” is a little closer to the truth.

Some time back I found a suggestion on using one of my favorites to change color: perl. The hack is pretty easy but involves remembering a very complicated one-liner to type when needed, or pulling it out of history, or out of little note files in my home directory. It uses the color mechanisms within the terminal, so your terminal and shell naturally have to support color.

It would go something like this (where 31 is red and 43 is yellow [I think]):

tail system.log | perl -pe 's/Throttling/\e[1;31;43m$&\e[0m/ig'

Perl one-liner typed in command pipeline

It definitely was a start. But it wasn’t easy. I used it for years this way: look at the file or output and realize my eyes were lost, cat my note file in my home directory, then run the command again piping it through the perl one-liner to highlight the word I wanted (like “Throttling” in my example).

One day I had enough. “Kevin, there has to be an easier way,” I said to myself. I messed about with ‘alias’ commands to set it up. That was fine for a static word to highlight, but it wasn’t possible to stick in a variable to highlight different words when I needed to.

So I went back to shell basics and rediscovered “functions”. By building several shell functions using my original Perl one-liner with color changes and different variable names, I can now highlight multiple different words all at the same time:

Functions highlighting multiple words at the same time

Pretty awesome!

So here’s how it is set up. First, I found some colors that look good in my terminal. I use the “Novel” color scheme in my Mac terminal and found these three color combinations useful, but you could easily have more and change them to match your heart’s desires:

  • Red on Yellow : 31;43m
  • Lt Blue on Dark: 32;44m
  • Lt Blue on purple: 32;45m

Next up is combining them with individual variables and putting them in my .profile so that they are active when I log in to a system. I edit my .profile and at the bottom I add these three lines:

  • redy () { command perl -pe "s/$redy/\e[1;31;43m$&\e[0m/ig" ; }
  • blue () { command perl -pe "s/$blue/\e[1;32;44m$&\e[0m/ig" ; }
  • purp () { command perl -pe "s/$purp/\e[1;32;45m$&\e[0m/ig" ; }

The first word is the name of the function. It can be called just like a command or an alias, but it runs the command inside the braces, which is the original Perl one-liner modified with a variable. The variable is not a Perl scalar but a shell variable that will be replaced with its contents before Perl executes.

Once you’ve edited your .profile you need to log out and back in, or source the .profile into your environment. You can type ‘set’ to see if it is in your environment before attempting to use it.

When the function is built and in your environment you are ready to add it to your pipeline. The pipeline must start off by defining the variable and, within the same session, executing your command before handing it off to the pipeline. The way to do this is with the “&&” joining construct. It tells the shell “set this, and if successful, do this”, and the pipeline follows, so the whole enchilada is fed to the next command. It’s not complicated, just messy to describe. So let me show you:

  • blue=5004 && redy=5007 && purp=5008 && snoop -Vd ce0 port 5007 or port 5008 or port 5004|redy |purp|blue
    • set blue to be the word 5004
    • set redy to be the word 5007
    • set purp to be the word 5008
    • execute snoop command
    • pipe feed output to function redy
    • pipe feed output to function purp
    • pipe feed output to function blue
  • Sip coffee and watch your magic!

    Example Execution of commands

    Example Execution of commands

It is still a fair bit of typing and still requires some biological memory, but it is easier.

I hope this helps someone!