Monday, December 31, 2012

All work and no play...

So far, this blog has been related to what I do work-wise.  Well, it's time to shift, especially for the new year.  Time to start posting about my hobbies and other activities, and how I use Linux tools to pursue them.

My (current) hobbies:
  • Scuba diving
  • Photography
  • Woodworking
Sadly, I have pretty much neglected all of these due to time constraints (work, house remodel/repair, life, etc).  But this year, I have at least been able to do some decent photography with two new cameras (Canon Rebel T1i, GoPro Hero 2), and even combine this with scuba diving to produce some stunning raw video.



With the Canon, I shoot only raw images, which gives me more control over editing.  If I need to convert them for posting, I can do it faster and more accurately on the computer anyway.  For the GoPro, I am still getting the hang of it, but I think I may stick with 1080p for most video.

So, how does this relate to Linux?

I am currently focused on a large collection of digital photos from our recent Caribbean cruise.  Every afternoon, I would gather SD cards from everyone in our group (5 people, 7 cameras) and download them to my netbook.  Some came from cell phones, some from point & shoot cameras, and the rest from my two cameras above.  In total, I have ~1400 photos and ~12 hours of video to go through.

I use Digikam to organize all of my photos into albums.  For this trip, I made a central album with sub-albums for each individual's photos.  When I upload them, I can mass-edit the metadata with locations, dates, who did the shooting, etc.  This comes in really handy later on when I want to assemble them into albums by date or location (for example, all photos from everyone taken in St Croix).
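Digikam does all of this through its GUI, but the mass-edit step boils down to something like this pure-Python sketch (file names, field names, and dates are made up for illustration):

```python
from datetime import date

# Hypothetical batch-tagging sketch: apply shared metadata to each
# photographer's sub-album, mimicking Digikam's mass-edit step.
def tag_album(photos, photographer, location, shot_date):
    return [
        {
            "file": name,
            "photographer": photographer,
            "location": location,
            "date": shot_date.isoformat(),
        }
        for name in photos
    ]

# Central album with one sub-album per person (illustrative names).
cruise = {
    "mom": tag_album(["IMG_0012.JPG", "IMG_0013.JPG"],
                     "Mom", "St Croix", date(2012, 12, 10)),
    "me":  tag_album(["CRW_0201.CR2"],
                     "Me", "St Croix", date(2012, 12, 10)),
}

# Later: gather everyone's St Croix shots across all sub-albums.
st_croix = [p for album in cruise.values()
            for p in album if p["location"] == "St Croix"]
```

The payoff is the last step: once every photo carries location and photographer tags, a cross-album query is a one-liner.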


Before editing
Unfortunately, not everyone in our group is an avid photographer (including me).  Face it, we all make mistakes when going for excellent photos.  For example, my mother, who used to be in a photography club (not sure if she still goes), recently switched to digital, using a Canon Powershot (I know this because Digikam recorded the camera information in the metadata - already a plus).  Unfortunately, she left the date stamp turned on (in camera club, they call that a noob fail).  So I have to edit.

Date stamp removed
For editing, I am using a powerful program called the GIMP.  I haven't used it before, but I am learning fast.  Through the power of Google, I have found tips on how to remove this annoying "feature".  Important tip for digital photography: turn off the date stamp.  You can always add it later in a less intrusive location and a better format (Digikam can do this in batch mode).

Another issue we all had was taking photos over water.  One rule of good photography is that you want to keep the horizon level (otherwise the ocean flows out the side of your picture, right?).  After removing the timestamp from this photo, I was able to fine-tune the rotation by -10 degrees, then trim the edges by ~20 pixels.  This was very quick and easy (and as an added bonus, it re-blended the area where the timestamp was, making it completely disappear).
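The correction angle can be derived rather than eyeballed: pick two points along the tilted horizon and compute the angle between them.  A stdlib-only sketch (the coordinates are invented, and the sign convention assumes a clockwise-positive rotate tool, which is my assumption about how most editors interpret the angle):

```python
import math

def horizon_tilt(p1, p2):
    """Degrees to rotate so the line p1 -> p2 becomes level.

    Image coordinates: x grows right, y grows downward, so a horizon
    that sags toward the right has a positive dy.
    """
    dx = p2[0] - p1[0]
    dy = p2[1] - p1[1]
    return -math.degrees(math.atan2(dy, dx))

# Two points clicked along a horizon that drops ~106 px over 600 px.
angle = horizon_tilt((100, 300), (700, 406))
print(round(angle, 1))  # roughly the -10 degrees used above
```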



What about the case of a cell phone camera that somehow had its color balance out of adjustment?  Here is a nice photo taken by another member of our group on her Android cell phone.  Note the over-saturation of blue.

While there is probably a better way to fix this, I went with a vintage black & white photo conversion.  Here's the result:
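A B&W conversion hides a color cast because every pixel collapses to a single gray value.  Most tools (the GIMP included, as far as I know) offer the standard ITU-R BT.601 luma weights for this; a minimal sketch:

```python
def to_gray(r, g, b):
    """ITU-R BT.601 luma: the common weighted-average gray conversion."""
    return round(0.299 * r + 0.587 * g + 0.114 * b)

# An over-blue pixel (note the inflated blue channel) still lands on
# one sensible gray level, which is why the cast disappears.
print(to_gray(80, 120, 230))
```

Note that blue gets the smallest weight, so even a strong blue cast contributes the least to the final gray value.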

Ok, back to editing.  I still have a LOT of photos to sift/sort/edit/mutilate.  I should also point out that for Windows users, these tools are available for you to download and install.  Just follow the links above.  Photos can be viewed here.

My goal is to pare this pool of photos down to the best ~600-800, and then I will work on a video slideshow DVD for the family members who were there.  I'll blog about the tools I use when I get to that stage.  I'm already looking at combining the best tools available to produce one stream.  Digikam has an excellent slideshow feature, as do LibreOffice and other open source tools.


Wednesday, November 7, 2012

No disassemble Johnny 5

With little fanfare, I decided to dismantle my 5-pandaboard test rack.  The problem was not with the systems, but with my lack of time.  It had become too time-consuming to maintain the infrastructure surrounding these and other ARM platforms.  The only real reason I kept them online was to continue to improve my test automation skills, largely to prove that I do know how to develop automated testing, a bit of a sore point with my previous job.  I also used them to troubleshoot issues from the Ubuntu-arm community, but lately I feel that my work there has been disregarded and unwanted.

The good news is that I have taken this knowledge and applied it to projects at my current job, to (in my opinion) an overwhelmingly high level of praise.  I now have automated builds for one of the projects, and am working to expand that to full build/test/publish capabilities this week.

Still not sure about my long-term outlook.  I have a lot of mixed feelings about Intel (bad history), and there are times when I feel that just being there drains my soul.  But, like all the jobs I have worked, I am giving it my all.  It's kind of hard to work through the anxiety issues that keep coming up, along with my continued lack of self-confidence.  But it is a job, and the people I work with seem to like me.



Thanks, Al.  Of course, the one skill that continues to flourish is my innate ability to self-psychoanalyze.  Yay.



Thursday, October 25, 2012

Time flies when you don't know what you are doing.

Has it really been over 4 months since my last post?  Oops.

Well, the project I started in my previous post has tapered off, as we are waiting for pre-production development hardware (the last batch was very alpha, mainly for enabling code development and pathway debugging).  In total, I wrote 10 scripts plus libraries to automate testing various facets of this hardware.  The data generated helped us fine-tune the development of the new platforms, which we should see before Christmas.

While I wait, I am helping out with other projects and learning about new technologies, like system-level NUMA support (which allows a system to assign memory to a specific CPU, or multiple CPUs to share dedicated memory with each other).  Very different for someone with a more mobile testing background.

With the onset of heavy rains, my outdoor high-priority projects are finished to the point that they are no longer critical, so I can limp through another winter while I work on inside projects.  One of these is the maintenance and improvement of Win32 Disk Imager, a project my son did for me when I started at Canonical.

The current focus has been on closing all open bugs, most of which have been fairly simple to resolve.  I'm debating whether to release another point release (0.7) with these minor fixes or hold off until December, when I plan on releasing a 1.0 with added features and improvements (like an installer - wouldn't that be cool).  I am also in the process of moving the project to SourceForge, as this will give me better control of the project (Git, wiki, forums, web pages, statistics, etc).  The release process alone was a major headache that took hours for each release.  I am absolutely amazed at the total downloads this project has (over 4 million across all versions - 1700 since the SourceForge site was created two weeks ago).

Adding the new features users want to see will require learning yet another language (C++), plus it requires learning the Windows programming interfaces.

Sigh.

Friday, June 15, 2012

Doing what I do best

Finding the bugs that piss people off.  Yes, that is what I do.

After mucking around in TCL for several weeks (and seeing how horrid it is - at least the built-in version for these FPGA development systems), I have a minimal TCL script that initializes the platform and then takes user input to change the FPGA settings on the fly, or read registers on the FPGA for status and data connectivity.

The real automation is in Python.  In only my third Python script ever, I have it controlling the TCL system fairly extensively.  So far, it has helped find several bugs in the FPGA development environment (like sudden and unexplained crashes), bugs in the FPGA register behavior (why does reading the setting in one register erase all other data?), and possibly even bugs in our pre-release development system (not to worry, this is why it is pre-release).  I still need to figure out how to recover when the TCL app crashes (currently it takes the Python script down with it).  Easy enough to restart where it left off, but annoying.
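One way to keep the Python side alive would be a restart wrapper around the child process.  A minimal sketch (the command being wrapped is hypothetical; the real TCL console and its invocation are not shown here):

```python
import subprocess
import sys

def run_with_restart(cmd, max_restarts=3):
    """Relaunch a flaky child process when it dies.

    Returns True on a clean exit, False if the child keeps crashing.
    In the real setup, cmd would launch the TCL console and state
    would be reloaded so the sweep resumes where it left off.
    """
    restarts = 0
    while restarts <= max_restarts:
        proc = subprocess.run(cmd)
        if proc.returncode == 0:
            return True          # clean exit: we are done
        restarts += 1            # crashed: note it and relaunch
        print(f"child died (rc={proc.returncode}), restart {restarts}",
              file=sys.stderr)
    return False                 # gave up after too many crashes
```

The annoying part the post mentions (restarting "where it left off") would live in whatever state file the wrapper reloads before each relaunch; that part is omitted here.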

The cool thing about this scripting (at least from the team's point of view) is how much it has sped up their work.  Before, changing one setting (knob) in the FPGA could take as many as 20 manual operations.  Spread that across 4 different knobs (each with 4-16 settings) and 24 data lanes, and that is a lot of manual tweaking.  My script loops through all the permutations in about an hour, reading a control register over 1000 times per setting to ensure low error rates, then listing the best combinations.  Kind of like tuning a surround-sound stereo for the best audio experience by tweaking the volume, bass, treble, and distortion on each individual speaker.  This is still only scratching the surface of what I will be developing, but it is a start.
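The permutation sweep itself is a natural fit for itertools.product.  The knob names and ranges below are invented for illustration (the real settings are proprietary), and read_register stands in for the actual TCL call:

```python
import itertools

# Hypothetical knob ranges: 4 knobs with 4-16 settings each,
# matching the shape of the sweep described above.
knobs = {
    "pre_emphasis": range(4),
    "vod_swing":    range(8),
    "dc_gain":      range(4),
    "eq_control":   range(16),
}

def sweep(read_register, samples=1000):
    """Try every knob combination, score each by total error count,
    and return the settings sorted best-first."""
    results = []
    for combo in itertools.product(*knobs.values()):
        setting = dict(zip(knobs, combo))
        # Read the control register repeatedly to smooth out noise.
        errors = sum(read_register(setting) for _ in range(samples))
        results.append((errors, setting))
    results.sort(key=lambda r: r[0])   # fewest errors first
    return results

# 4 * 8 * 4 * 16 = 2048 combinations per lane
print(sum(1 for _ in itertools.product(*knobs.values())))
```

Multiply those 2048 combinations by 24 lanes and the 20-odd manual operations per change, and an hour of scripted sweeping is a very good trade.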


As for the bugs pissing people off, no one is mad at me for finding them.  It is, after all, what I was hired to do.

And I am very good at it.

Saturday, May 19, 2012

There and back again.

Life appears to be full of little twists and turns.  Four years ago, I finished my last contract at Intel, then spent the next few months finishing my degree.  The week I graduated, I accepted a job with Canonical, initially doing QA testing on netbooks (for the same team I worked with during my last Intel contract), then switching to ARM testing when our team spearheaded that effort for Ubuntu.  That began the best job I have ever had.  Then it ended, rather abruptly, largely due to market shifts and the stabilization of the ARM port (our team was disbanded and moved into other platform teams; I just didn't find a spot to land in).

Less than two weeks after being let go from Canonical, where do I find myself?  Back at Intel.  And (as seems to be a running theme here), I found myself in way over my head.  Like treading water with lead boots on.  To say it looked dire is...well...putting it mildly.

My new job involves writing automation scripts in TCL to program and monitor FPGAs.  On Windows 7.  I will spare you readers the gory details (largely because I am still wrapping my remaining brain cells around them myself).  For the first 7 days, I seriously wondered why I was picked for this particular position (and why I accepted).  It wasn't like I didn't have other offers (an average of 10 a week for Linux jobs).  This was way out of my comfort zone.

To start, I knew next to nothing about TCL (I'd heard of it, nothing more).  So, on Monday I bought a book (Barnes & Noble rocks - shameless free plug).  By today (Friday), I have figured out enough of the idiosyncrasies of the language to get my first script working (read FPGA state, reprogram, check for errors).  The next step will be to make this script easier to interact with so it can be called from a master program on the fly.

I could have been much further along, but the first 1.5 weeks were largely spent getting basic necessities (accounts, lab access, a cube - I'm still waiting for a standard-issue laptop).  The funny part is that most of the information in the system about me was based on my last contract, so manager approvals were misrouted.  Oops.




I still feel like I am in over my head, but at least I now have a rock to stand on.  I am really starting to feel like I can do this job, and do it well.

But I still hate Windows.  Just saying.

Friday, April 20, 2012

What does three up and three down mean?


Well to some, it is the end of an inning.

In my case, it is time to move on.

In my job, I:
  • was the only QA tester on ARM desktop,
  • broke ground on QA testing for ARM server,
  • acted as both tech support and community liaison with multiple communities of ARM Linux developers,
  • automated ARM network installation and SRU testing,
  • and handled many other day-to-day activities.

I would also root-cause as many bugs as I possibly could, often replicating them on x86 and amd64.  I even fixed a few bugs along the way.

But, due to shifts in work and internal reorganizations, I am no longer needed.  Yes, today is my last day here (well, maybe not on this blog).  I only recently opened up my resume to job hunters, and the calls are coming in rapidly.  I hope to land somewhere soon, possibly as soon as next week.  It seems Linux QA people are still in high demand.

But what about all my equipment?  What about the dedicated tower of Pandas?  For now, they are busy searching for ET.  I might have future work for them next month.  They seem to perform quite well, even keeping pace with the SS Itanic I have sitting here.  So what if they aren't as powerful as the multi-core 64-bit laptop warmers that most people have.  I'll bet they draw less power than the Speak & Spell that ET uses to make long distance calls without a carrier network plan.  And they support Bluetooth.

So, I leave you readers with this:  "You have entered a dark place.  You are likely to be eaten by a Grue!"

Friday, March 2, 2012

It's alive!

Since my last post, I have been overly busy with test automation, fixing bugs, learning Python (I can debug a lot of code issues, but still need to start my own code projects), etc.  It has been sheer and utter chaos.

Part of this has been lab expansion.  I am now up to 8 Pandas online, plus additional hardware (some of which I can't comment on, but it is COOL!).  In preparation for ARM servers, I am in the process of gathering data on power consumption while running tests.  This involves a really nice Fluke digital multimeter with data acquisition.  To simplify testing, I am using a Panda as the basis, as the hardware is now well known, and I can easily put the meter between the power supply and the Panda.

Wires, lots of wires.

This is a desk test with an old Beagle that I don't currently use for testing.  I figured if I was going to fry a test board, it should be one that isn't in use.

So, with the wiring working, the next step was to run a semi-meaningful load test.  Since this is ultimately for server testing, I wrote (with the help of others) a simple LAMP stack test that installs apache2, php, and mysql, then loads & verifies mysql with a large amount of dummy data for php to query.  Once that is set up, the test runs apache bench for a few iterations before really clobbering it with siege.  It took some fine-tuning to scale the test down to run in under an hour (the first try ran all weekend before I manually killed it - oops).  Below is the graph of the data from the power meter, starting from idle.
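The test sequence boils down to a handful of shelled-out commands.  A hedged Python sketch of the driver (the URL, the mysql source file, and the ab/siege arguments are illustrative stand-ins, not what I actually ran):

```python
import subprocess

# Each step of the LAMP load test as an argv list; the package name
# and tool order match the test, the arguments are placeholders.
STEPS = [
    ["apt-get", "install", "-y", "lamp-server^"],        # apache2+php+mysql
    ["mysql", "-e", "SOURCE /tmp/dummy_data.sql"],       # load dummy rows
    ["ab", "-n", "5000", "-c", "50", "http://localhost/query.php"],
    ["siege", "-c", "100", "-t", "30M", "http://localhost/query.php"],
]

def run_steps(steps, runner=subprocess.run):
    """Run each step in order, stopping at the first failure."""
    for cmd in steps:
        result = runner(cmd)
        if result.returncode != 0:
            return False
    return True

# Dry run with a stub runner (the real commands need root and the
# tools installed), just to exercise the sequencing logic:
class Ok:
    returncode = 0
print(run_steps(STEPS, runner=lambda cmd: Ok()))
```

Scaling the run time down to under an hour then becomes a matter of trimming the ab/siege arguments rather than editing the whole procedure.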


The first bunch of noise (0-600s) is apt-get install lamp-server^ installing the packages, followed by a slight increase as the database is populated and verified.  The spike from 750-1100 is apache bench.  The rest is siege.  The system is a Panda running headless with a 16G SSD on USB SATA, powered by the Panda.  Everything was run locally.

Note:  This is not intended to be used for comparison, only a system I have available.  No other systems have been tested, and I'm not even sure the tests I ran can be considered valid given the setup.  But for those interested in how much power their cell phones will consume when acting as a lamp server, here you go.  Enjoy.

Not bad for a first run.  And fairly easy with a 2-wire power source.  Now for the fun part.  To do meaningful power benchmarking on any platform, you really need to be between the power supply and the platform.  Testing at the wall is nearly meaningless.  The Kill A Watt I have on the wall jack behind my rack cabinet shows I am only using ~5 amps, and that is with 4 low-power systems running full time.  Another one on a Core2Duo open test system (just a motherboard, video card, and drives - no chassis) shows ~0.45 amps between the power supply and the wall when under test, dropping insignificantly when idle.

So, to properly test, I need to sit between the power supply and the board.  The trick is to not overload the 10A max load on the meter.  AC is no problem, but my 450W ATX power supply says it will handle 22 amps on 3.3V, 15 amps on 5V, and 30 amps combined on 12V.  Time to break out a cheap power meter (in case of smoke).
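A quick sanity check of those rail ratings against the meter's 10A limit shows the problem: every rail is *rated* above what the meter can carry, so I have to be confident the actual draw under load stays well below 10A before wiring it in line.

```python
# Rated rail currents from the 450W ATX supply's label vs. the
# multimeter's maximum current rating.
METER_MAX_A = 10.0
rails = {"3.3V": 22.0, "5V": 15.0, "12V": 30.0}   # amps

for rail, amps in rails.items():
    ok = "fits" if amps <= METER_MAX_A else "EXCEEDS 10A meter limit"
    print(f"{rail}: rated {amps:.0f} A -> {ok}")
```

Hence the cheap sacrificial meter: if a board faults and pulls anywhere near a rail's rated current, the 10A meter is the first thing to smoke.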

This is going to be...{fun, expensive, deadly, ____}.

We will see.