Sunday, December 6, 2009

Another Year of BCS Failures

Last night I sat on my sofa and watched with fascination--but not surprise--as two of the so-called top teams in the nation were embarrassed in big games. Of course, I refer to Florida's 32-13 debacle and Texas's 13-12 escape (which was also a debacle). Yesterday's events underscore the real problem with the BCS/polling system used in college football at the FBS level. Consider this scenario:

A team has a stellar season, wins their conference and goes on to have a very solid win against a formidable opponent in the bowl game. The following season, most poll voters are very "high" on this team and give them a high ranking in the preseason polls. After a few weeks of play and good performance, this translates into an equal or similar slot in the BCS rankings.

This team goes on to run the table in a conference that turns out not to be as good as expected--top-heavy, indeed, with this team at or near the top. They didn't prove as much on the field as they should have because the competition was decidedly sub-par, but because they never lost, the BCS--with the weight it places on the "L" column--kept them from falling below their inflated preseason position.

Then, after a season of trouncing any and all opposition, the team reaches the championship game: after so many months of media hype and a zero in the "L" column, the team shows up convinced of their superiority and possibly even slightly complacent. Maybe mix in a key injury during the game. The result is an ugly, ugly night. The team is exposed as a group of "frauds". Not capable of living up to lofty expectations. Charlatans. All of this because the team was propelled to a high position largely because of opinion (beginning with those preseason polls) rather than a résumé built on the field against quality opposition. This team, due to a lack of real challenges on the field during the season, did not progress as much as was needed to meet the challenge that came in the form of the opponent in the championship game. And boy, were they ever punished for it. Not only on the field, but afterward in the form of a critical media and gloating fans of the opposing team.

Who is this team, you ask? It's 2006 Ohio State in the BCS National Championship game. It's also 2009 Florida in the SEC Championship game; 2009 Texas in the Big 12 Championship game. Take your pick.

This kind of nonsense is exactly why the BCS should go. By the very nature of its flawed design, this system produces teams like the one described above. Voters decide which two teams are "best" at the beginning of the season and--barring upsets, which don't often happen when conferences lack parity--the BCS supports what is essentially an educated guess. Last night's games showed that those guesses were wrong. Florida, Texas, and other teams were essentially crowned among the nation's best based largely on last season's performance. Take away the quality of competition that was expected (which is the case this year in both the Big 12 and the SEC) and what do you get? A team that is ill-prepared and under-developed late in the season. A recipe for disappointment.

The BCS must go, but it probably never will because the powers that be are making far too much money on the way things currently work, which is a discussion for another day.

Wednesday, November 18, 2009

Browser Boot Camp

Greetings, fellow MindLeaders!

I was recently fortunate enough to lead a training session at the office. Given that my employer's product is a web app, I saw an opportunity to teach co-workers what is going on with the World Wide Web and web browsers today (and tomorrow, for that matter). The one-hour training session was called Browser Boot Camp and was a success, thanks in large part to the enthusiasm of the attendees.

The presentation can be downloaded in the following formats:

ODF (Open Office) | PDF

(You may need to right-click and choose Save Target As).

Browser Boot Camp by Rick Ucker is licensed under a Creative Commons Attribution-No Derivative Works 3.0 United States License.


Monday, November 9, 2009

Bitpim & Bluetooth Under Ubuntu Karmic

Last week, I gleefully installed Ubuntu 9.10, the Karmic Koala, on a new laptop. One of the first things on my to-do list was getting Bitpim set up so I could transfer ringtones & contacts between my laptop and my cell phone (a Verizon Wireless LG Dare).

To accomplish this, I looked up an old tutorial I had followed a while back for setting everything up on my old desktop machine.

I have found that a couple of things have changed since that guide was written, so I thought it would be worthwhile to post an updated version and share my findings.

Getting Started

Everything in this tutorial is possible under Karmic without installing any packages that aren't installed by default (except for Bitpim, of course).

This works for my LG Dare (Firmware version 05) and should work for many/most/all other Verizon LG phones that support the proper Bluetooth profiles -- possibly many more beyond that.

Just to keep things relatively short, we will assume that you have already managed to pair your phone with your computer.

Unless otherwise specified, assume any commands I ask you to issue will be done in Terminal.

Step 1: Find your phone's MAC address

The easiest way to do this is to browse the device via Nautilus: from the Bluetooth icon in the Gnome Panel, choose your device and then choose Browse Files. This will open a new Nautilus window and your phone's MAC address will be displayed in the address bar in the format obex://[mac:address:here]/.
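If you want the bare address for the later steps, the obex:// wrapper is easy to strip. A quick sketch (the MAC below is made up for illustration):

```python
import re

# Address-bar text as Nautilus displays it (example MAC, not a real device)
url = 'obex://[00:1F:6B:12:34:56]/'

# Pull out the colon-separated address between the brackets
match = re.match(r'obex://\[([0-9A-Fa-f]{2}(?::[0-9A-Fa-f]{2}){5})\]/', url)
mac = match.group(1)
print(mac)  # 00:1F:6B:12:34:56
```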

Step 2: Find the channel used by the Bluetooth Serial Port service

Issue the command: sdptool browse mac:address:here.

My Dare uses Channel 5 for Serial Port as shown here:

Service Name: Serial Port
Service RecHandle: 0x10006
Service Class ID List:
  "Serial Port" (0x1101)
Protocol Descriptor List:
  "L2CAP" (0x0100)
  "RFCOMM" (0x0003)
    Channel: 5

Note: Some phones may not show anything labeled "Serial Port". In the case of my old phone (an LG VX9900), for example, I used the BT DIAG service instead and it worked. If you don't have a Serial Port section, you may have to try another service instead (but no guarantees it works).
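If you'd rather not eyeball the output, the channel number can be pulled out programmatically. A small sketch that parses sdptool-style output, using the Dare's output above as sample data:

```python
import re

# Sample output from "sdptool browse", as captured above
sdp_output = """\
Service Name: Serial Port
Service RecHandle: 0x10006
Service Class ID List:
  "Serial Port" (0x1101)
Protocol Descriptor List:
  "L2CAP" (0x0100)
  "RFCOMM" (0x0003)
    Channel: 5
"""

# Grab the first "Channel: N" line that follows the RFCOMM entry
match = re.search(r'"RFCOMM".*?Channel:\s*(\d+)', sdp_output, re.DOTALL)
channel = int(match.group(1))
print(channel)  # 5
```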


Step 3: Create an RFCOMM binding by editing the appropriate config file

Open the config file by running: gksudo gedit /etc/bluetooth/rfcomm.conf

Uncomment the rfcomm0 section (remove the # from each line) and change it to look like this:

rfcomm0 {
    # Automatically bind the device at startup
    bind yes;

    # Bluetooth address of the device
    device mac:address:here;
    # RFCOMM channel for the connection
    channel 5;

    # Description of the connection
    comment "Needed by BitPim";
}


Step 4: Make sure your bluetooth bindings are added at startup

Due to a bug in 9.10, the rfcomm bindings that we specified above (in rfcomm.conf) will probably be ignored at startup. Reply #5 in the LP bug report provides a workaround for this issue. The instructions there are written for users who have upgraded to 9.10; on my fresh 9.10 installation, however, I needed only to follow step 3, which I will repeat here.

Open rc.local (gksudo gedit /etc/rc.local) and add the line:

rfcomm bind all

above the last line (which should say exit 0).
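For reference, the tail of my /etc/rc.local after the edit (I used rfcomm bind all, the form rfcomm(1) accepts for binding every device listed in rfcomm.conf; everything else in the file stays put):

```shell
# ...any existing contents of /etc/rc.local stay above this line...

rfcomm bind all

exit 0
```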

Step 5: Restart the Bluetooth service


Issue the commands:

sudo /etc/init.d/bluetooth stop
sudo /etc/init.d/bluetooth start

Note: running sudo /etc/init.d/bluetooth restart did not work for me: this caused the BT service to stop, but it wouldn't start again on its own. I had to issue "start" to get it to start again.

Step 6: Start Bitpim and check available com ports


Start Bitpim and go into Settings, then click the Browse... button to the right of "Com Port". You should see "Bluetooth (/dev/rfcomm0)" listed under Available Ports. If you do, you are all set! You can now start reading/writing your phone data using Bitpim. If not, you may have done something wrong. Double-check your work above and try again before continuing.

Step 7: Have fun hacking away at your phone!

If you have any questions or run into problems, please be as specific as possible when asking for help.

Thanks to:

Ars Technica
Tony

Wednesday, October 21, 2009

Getting Started with Eclipse

For about the last month I have been learning Java. Due to outside requirements, I have been using the BlueJ IDE for development. However, today I decided to give Eclipse a try.

I installed version 3.2.2 from the Ubuntu repos and loaded a project I had just completed in BlueJ. It was then that I was greeted with a few errors regarding my code, the first of which was:

    Scanner cannot be resolved to a type

The line of code in question was:

    private Scanner userInput = new Scanner(System.in);

Strange, I thought, since I had made no changes to the code and had just compiled and run this project in BlueJ.

I first checked the installed version of Java:

    $ java -version
    java version "1.6.0_0"
    OpenJDK Runtime Environment (IcedTea6 1.4.1) (6b14-1.4.1-0ubuntu11)
    OpenJDK Client VM (build 14.0-b08, mixed mode, sharing)


    $ javac -version
    javac 1.6.0_0

OK, everything looked good there. I checked the compiler settings in Eclipse and changed the compiler compliance level from 5.0 to 6.0. No joy.

After some more digging I concluded that the issue was probably that Eclipse was not using the proper JRE. Sure enough, Eclipse was using version 1.5 at /usr/lib/jvm/java-1.5.0-gcj-4.3-1.5.0.0. I added another JRE entry pointing to /usr/lib/jvm/java-6-openjdk (which is what my BlueJ installation was already using) and voila! No more errors.

I guess I should look into cleaning up the multiple JREs I have installed on this machine.

But first, back to exploring Eclipse!

Monday, October 12, 2009

User Database Manager

As I wrote about back in June, I have been getting back into programming this year. I started off in Python and it went well. Approximately two months in, I was able to write a simple database manager. It actually started as a response to LaRoza's Beginner's Programming Challenge #7 and turned into a bit of a monster as I added more and more functionality once I had satisfied the initial requirements. The program is written in Python and uses an SQLite database. The requirements are pysqlite 2.x on the client machine in addition to an installation of Python.

I figured that since the program is robust and feature-complete (albeit limited in scope), it would be worthwhile to officially release it under the GNU GPL. This will happen once I figure out what hosting service and/or version control system to use. Hopefully, someone out there in learner-land may find it useful.


Thursday, October 8, 2009

Flickr

After months of neglect and with a little bit of inspiration from the Cellular Obscura blog, I decided today to revive my Flickr account. My photostream has been added to the sidebar. Hopefully I'll be able to fill it with some noteworthy photos.

Strange Display Issues

Recently I have been having a very strange issue with my desktop box. After I log in (sometimes immediately, sometimes after just a few minutes), menus stop appearing. This includes the Gnome-panel menus as well as the menus in any program I may be running or the right-click context menu in Nautilus. When I click a menu, its color changes to indicate that it has been clicked like it normally would, but I don't see the menu.

If I am running any program such as Nautilus or Totem when the problem occurs, I can keep using it normally (as long as I don't need to access a menu or right-click anything), but the only way I have found to resolve the issue is by restarting X (after having installed dontzap and re-enabling the ctrl+alt+backspace shortcut).

Also-- I use Gnome-Do's Docky interface. I noticed that if I click an icon in Docky while this problem is going on, the icon grows like normal then nothing happens: the program does not launch and the icon doesn't shrink back to its normal size either.

So, this seems to be some weird problem with my desktop not refreshing or something. I do not recall making any changes recently, other than installing a few updates when prompted (sorry, I don't recall which ones, but this came up about 2-3 weeks ago).



Over the weekend I installed a host of updates (including some kernel updates) and the problem seemed to go away. Unfortunately, that proved not to be the case: I'm still having the problem, although I have some more information about what seems to be going on.

It all seems to be related to an episode I had a few months ago after I inadvertently allowed my $HOME partition to fill up. I wrote about it in all its painful glory here.

Since that time everything has seemed fine, except for some weird issues I've been having in Banshee. I noticed, after I addressed the issues caused by the full partition, that whenever I launched Banshee, it would never remember my view settings (it did before the problems I had). For example, one view pane would always be reduced to about 5% of the viewable screen instead of the 60/40 split I always use. Every time I launch Banshee, the first thing I do is resize the view panes the way I like them, but it never remembers the preference (again, it used to).

The display issue I wrote about in my first post seems to have started coming up after Banshee locked up when I paused a movie I was watching. I killed the program with the Force Quit applet. I think that this is when the display issues started.

As I mentioned above, the problems seemed to stop when I installed some updates (including a Kernel update). Tonight I re-installed the latest kernel through Synaptic and the issue seemed to stop again.

I'm going to play around with Banshee and see if I can cause this to happen again. What I'm wondering is whether what I'm describing about Banshee and the other issues rings any bells for anyone. Am I touching on something here? If I am able to cause this by triggering a Banshee lockup again, I guess I could try backing up my banshee.db, then completely remove/reinstall the program and see if it performs normally again.

Thursday, September 24, 2009

Following Up on that Big Game

Well, everybody knows what happened in the Horseshoe on September 12. It was, for the most part, what I expected, aside from the outstanding performance of the OSU D (except for that last drive, of course). I won't bother with a full breakdown of the game, but suffice it to say, smartfootball.com's Chris Brown said it far better than I could.

RIP Tresselball. Let's just hope that The Senator is aware.

Friday, September 11, 2009

Showdown in the 'Shoe

Twenty-four hours from now, the Ohio State University football Buckeyes will be battling the USC Trojans on their home turf. The men in the Scarlet and Gray will be fighting not only to avenge last year's demoralizing defeat, but also to redeem their deflated national reputation (not to mention that of the Big Ten). On paper, the Bucks are much bigger underdogs in this year's game than in last year's, and they will have their hands full on both sides of the ball.

The good news for the Bucks, on a high level, is twofold: the Trojans run a pro-style offense, which OSU is much better at defending than say, a triple option or spread-option offense. Also, Ohio State played this same offense (with a few different players as compared to this year) last year. The same can't be said for USC: OSU's offense last year was also largely a pro-style offense, crafted for the now-departed Todd Boeckman. This year things will be much, much different with Terrelle Pryor at the helm. This will be a new offense with new looks run by a different style QB. One could even say that Pryor is Boeckman's antithesis.

Keys to the Game

Ohio State and USC match up fairly well when it comes to their skill players. True, USC has more depth and their starters are largely more experienced, but OSU's pool of starting talent--raw talent--is fairly similar to USC's. The difference, more so than in any other area, will be how well the Bucks fight the battle in the trenches. That's right, the keys to this game are all about line play. It is no secret that OSU's line play--that of the O line in particular--has been subpar, even terrible at times. Last year's game against the Men of Troy, which I blogged about beforehand here, was a prime example. Todd Boeckman, a decent quarterback in most regards, lost his job after the USC game due mostly to the fact that the O line couldn't protect him--and let's face it: when the protection wore down, the guy was slower than molasses in January on his feet. If the offensive line had performed on par with, say, USC's, things would have been much different in that game and, indeed, the remainder of the season. Had that been the case, right now I'd be speculating on how well Terrelle Pryor would handle "his" offense in a big game for the first time.

USC D line vs OSU O line
USC's D line knows all too well about OSU's underachieving O line and will do their best to capitalize on it. True, Pryor does a good job of evading defenders when things break down, but he can't be relied upon to lead the offense to a win unless broken plays are more the exception than the rule--and that will most likely not be the case. Look for Pryor to spend a lot of time scrambling. Thankfully, he's good at this. Against Navy last week, it was apparent that he has much more of a pass-first (rather than run-first) mentality than last year. Rather than running for whatever yardage he can get, he will try to evade defenders long enough to complete the pass. In other words, Pryor wants to be Troy Smith rather than Vince Young. And that is a very good thing.

OSU D line against USC O line
The matchup that will be more favorable to the Buckeyes will be their D line against USC's O line. USC still has the upper hand here, but things will not be as lopsided as USC's D line against OSU's O line. OSU's D line has the best chance of keeping the momentum from swinging too far in USC's favor. USC generally uses a pass-first mentality: set up the pass early to establish the running game, then use the run to take pressure off the QB and the passing game. It will be absolutely necessary for OSU's D to prevent this from happening and keep USC's offense from getting into a rhythm. The best way to do this, in my opinion, will be to pressure Barkley: bring the DE's in and ring his bell or at least force him to make decisions he doesn't want to make. The results of this will be either incompletions or, hopefully, turnovers. Don't let him get comfortable in the pocket. Shut down the passing game and force USC to the run. The problem here is that while OSU's D line is good, they will be facing the same O line that dominated them one year ago.

OSU Offense
OSU under Jim Tressel has historically taken the opposite offensive mindset of USC under Carroll: establish the run to set up the pass. Run-run-pass. Tresselball and all that. I will not be surprised if OSU steps away from this approach tomorrow. That would take away the predictability that USC will be prepared for and would also not rely on OSU's ground game, which is currently less than stellar (primarily due to the play of the O line). Herron and Saine are good running backs, but they aren't bruisers who can run between the tackles and bowl defenders over like Beanie Wells was. These guys need good blocking and a bit of space in order to shine, and the offensive unit has not been reliable in providing this.

This may very well be Duron Carter's coming-out party. I have been excited about the kid since I saw him in practice and then against Navy. He is a true freshman starting in his 2nd game, but the kid has got talent. He is already showing flashes--he has good hands and, more evidently, he's got moves. I believe he's the real deal. He won't see much playing time behind Small, Posey, and Sanzenbacher, but look for something special when he is on the field.

USC Defense

I don't need to tell you that USC's D is the real deal. Although they have replaced three future NFL stars at LB, they will likely not miss a beat. USC is a team that reloads like nobody's business on both sides of the ball. Last year, I rightly predicted that Rey Maualuga would be the standout defender. This year, the man to watch will be SS Taylor Mays. Mays is famous for being both very fast and a hard hitter. He has also let it be known that he will be looking for Pryor. Mays is at his most dangerous when he is in the open field: reading the quarterback's eyes or closing on a ball carrier. Perhaps his only "weakness" is his ability to cover. Running receivers straight at him may be the only way to keep him at bay.

Prediction Time
This will be a huge battle for momentum. Both teams will come out swinging hard. The Bucks know that USC is a team that cannot be allowed to settle into a rhythm. Once that happens, they are almost unstoppable--unless they can be outscored, which isn't likely. Ohio State will look to set up the passing game using short, quick passes. Get the ball out of Pryor's hands quickly so that the O line won't be so heavily relied upon early in the game. Build up confidence that way. The pistol formation will be instrumental in this. Once the USC D is forced to back off a bit and focus more on covering the Buckeye receivers downfield, the ground game will open up. Pryor will no doubt do his share of the running, but designed running plays will probably not come right away--at least not until the offense has picked up a couple of first downs, if not later.

USC will also take the field and start by hitting Ohio State in the mouth. They will not hesitate to do some aggressive things to get that all-important momentum. The element that will probably not be present until the offense gets into a rhythm will be the deep passing game. Barkley, for as talented as he is, hasn't proven that he can throw the deep ball yet and Carroll probably won't ask him to do so until he has gained a bit of confidence. Once that time comes, OSU's newly-rearranged secondary will be tested.

Bottom line: he who wins in the trenches likely wins the game. I say likely because Pryor, as I said, can at least make something from nothing when protection breaks down. If the Bucks can at least keep a 40-60 balance at the line of scrimmage against USC, they will be OK.

Ohio State may put up a quick score or two early. But unfortunately, I think USC is just too talented, experienced, and confident coming into this game. Look for them to gain that all-important momentum in the 2nd quarter. Once that happens, the air will be taken out of the Horseshoe and the 12th man will evaporate. Tressel will make few, if any, changes at the half, and the 2nd half will be all USC--excepting a too-little-too-late OSU rally that either runs out of time or surrenders the momentum right back. Carroll is not the kind of guy who will call off the dogs when his team has the game in hand. His teams play every game like they have something to prove.

USC 38, OSU 24.

I hope that in approximately 26 hours' time, I will be eating my words. Go Bucks!

Sunday, July 19, 2009

Penumbra

This weekend I picked up a copy of the Penumbra Collection. The three-part game is on sale this weekend and, since the developers were good enough to produce a native Linux version, I was eager to support them by making the purchase.

I paid for and downloaded the game, ran the installer, and launched it. This is when I ran into an issue. As soon as I launched the game, my monitor (actually the TV in the living room) went dark and gave me a message about the signal being unsupported. I had audio and heard the game intro playing, but I couldn't escape the window. My only option was to reset the computer. Great, I thought, what's wrong now? Driver issue? Do I need to tweak xorg.conf? I set about digging for an answer.

The first thing I did was install and enable dontzap so that I could at least restart X without resetting the whole computer, which did help as I ended up launching the game a few times as I tried to troubleshoot.

As an alternative to restarting X the good old-fashioned way, I also tried the following: when the game started and I was left with no display, I dropped into a virtual terminal (ctrl+alt+F1), ran the top command, found the PID for the game, and killed it (sudo kill -9 [PID]). However, for whatever reason, I was still unable to get back to my graphical session after doing this and had to restart X anyway by running sudo /etc/init.d/gdm restart. So, a simple ctrl+alt+backspace was still the way to go.

Anyway, on to the information hunt. A few Google searches, as well as a look through the devs' Linux support forum, turned up a few related or similar problems, but nobody seemed to be running into the exact problem that I was. The ones that were similar were all running Intel or ATI graphics cards (I have NVIDIA).

Normally, my first trick in troubleshooting this kind of thing is to run the program from the command line and see what errors it throws. However since I was running into a situation where I had to restart X or take the entire system down, anything displayed in Terminal would obviously be lost. So, I decided to dump any Terminal output into a log file by running:

/home/rick/PenumbraCollection/Overture/penumbra > ~/Desktop/penumbra.log

The log file contained only this:

Penumbra: Overture exited unexpectedly, please check
/home/rick/.frictionalgames/Penumbra/Overture/hpl.log
for any error messages
Also try running
ulimit -c unlimited
And re-running Penumbra and try and recreate the error
then submit the generated core file or stack trace

This output was generated only when I killed the process. I wondered to myself how there could not be any errors thrown when I launched the game. Then it hit me: there were no errors thrown because the game was running just fine! The problem was with my monitor-- the signal was unsupported, but it was receiving something. I looked in ~/PenumbraCollection/Overture/config/default_settings.cfg and found what I expected in the form of the following line:

Screen Width="800" Height="600" FullScreen="true" Vsync="false"

Cripes, I thought. It was simple all along! The 1080p TV I'm using doesn't support 800x600! Insta-facepalm. I edited the line, replacing 800 and 600 with 1920 and 1080, and voila! I had picture when I launched the game again. The problem was far simpler than I had imagined. Now I know what I'll be doing for the afternoon...
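The fix is trivially scriptable, too. A sketch of the substitution (the line format is copied from the file as shown above; in practice you'd read and write the real default_settings.cfg rather than a string):

```python
import re

# The offending line, as found in default_settings.cfg
line = 'Screen Width="800" Height="600" FullScreen="true" Vsync="false"'

# Swap in the TV's native resolution
patched = re.sub(r'Width="\d+" Height="\d+"',
                 'Width="1920" Height="1080"', line)
print(patched)
```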

Sunday, July 12, 2009

Upgrading to VirtualBox 3.0

Today, as sometimes happens, I ran into a situation where I needed to run a Windows-only app. Having recently moved over to a new laptop, I didn't have an existing virtual machine to use. What I did have was an old installation of VirtualBox 2.1.

Having recently read about the release of version 3.0 of the venerable VM host, I figured now was as good a time as any to upgrade. After modifying sources.list and importing the apt-secure key, I initiated the download:

sudo apt-get install virtualbox-3.0

I checked back a few minutes later and, much to my chagrin, I was greeted with an error message about the kernel module failing to compile due to the kernel headers not being present.

This, I realized, was the first snag I had run into due to upgrading to the upstream 2.6.30 kernel to fix issues I had been having with poor 2D acceleration in Jaunty (reference).

Not prepared to give up, I headed over to the Ubuntu kernel repository where, thankfully, the 2.6.30 kernel headers were available. I grabbed and installed the appropriate deb, and ran the familiar:

sudo /etc/init.d/vboxdrv setup

And was greeted with another error:

* Stopping VirtualBox kernel module
* done.
* Recompiling VirtualBox kernel module
* Look at /var/log/vbox-install.log to find out what went wrong


The log gave me this:

Error! Your kernel source for kernel 2.6.30-020630-generic cannot be found at
/lib/modules/2.6.30-020630-generic/build or /lib/modules/2.6.30-020630-generic/source.


Silly mistake on my part this time. I had downloaded the kernel headers, but not the kernel source. I grabbed and installed the kernel source deb, and this time the kernel module compiled without a hitch.

I am installing a WinXP guest machine now.

Saturday, July 11, 2009

Fallout

Recently I blogged about my misadventures with a full partition that contained my $HOME folder. I noticed not long after that episode that I still had some strange behavior, such as various program preferences not being saved and issues copying some files during a data backup.

After doing a bit of sleuthing, I concluded that in addition to the issues I wrote about at the time, my $HOME permissions had been altered. I did a bit of digging and from a few different forum threads, I plucked out a few commands that helped me to restore things back to their normal order. What I had to run was the following:

sudo umount ~/.gvfs --> unmount the GNOME Virtual File System mount point so that I can...

rm -r ~/.gvfs -->
...delete it, to allow...

sudo chown -R rick /home/rick -->
...everything to properly be chown-ed by me.

And finally:

chmod 755 ~ -->
Set proper permissions for ~ (Read/Write/Execute for me, Read/Execute for everyone else).
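To double-check the result, the mode bits can be inspected from Python's standard library. A sketch using a scratch directory (rather than your real $HOME, so it's safe to run anywhere) to show exactly what 755 grants:

```python
import os
import stat
import tempfile

# Scratch directory standing in for $HOME
scratch = tempfile.mkdtemp()
os.chmod(scratch, 0o755)

mode = stat.S_IMODE(os.stat(scratch).st_mode)
print(oct(mode))  # 0o755

# Owner gets read/write/execute; group and others cannot write
owner_all = (mode & stat.S_IRWXU) == stat.S_IRWXU
print(owner_all)                  # True
print(bool(mode & stat.S_IWOTH))  # False
```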


Yes, some of these things can be done through Nautilus, but that method is not recommended: Nautilus does not always handle permissions as gracefully as the trusty command line.

So there you have it, a two-part post on how to remedy a broken system. Again, the upshot is not to let this happen to begin with!

Sunday, June 14, 2009

Diving into SQLite Using Python

...Or, The Trouble With Tuples.
(Sorry, the pun had to be made).

For the past 3-ish months, I've been teaching myself Python. I started off with Wesley Chun's Live Lessons video tutorials, and later moved on to his book Core Python, 2nd ed. It has been a satisfying ride so far. Not without tribulations of course, but things are coming along nicely.

Once I had learned enough of the basics, I jumped into writing some programs. Simple things at first of course such as number guessing games and the like-- at first taken from textbook exercises, but later also incorporating various other amateur programming challenges I found on the web.

My most recent project is a database manager application. Originally it was a response to a programming challenge posted on the Ubuntu Forums, but it slowly developed into a larger and more powerful app as I decided to add more and more features not called for in the assignment.

Development of the app moved along at a steady pace until I implemented record deletion. I could successfully search the DB using a parameter entered by the user and edit the record, but when I tried to delete it I was met with the error:

ValueError: parameters are of unsupported type

This was something I had not previously encountered. After exhausting my available resources, I decided to ask for help on the Ubu-forums. I started a thread (full details about the program and the solution can be found therein) and got my answer in short order. Essentially it was this:

I had already successfully implemented DB record editing with the statement

cursor.execute('UPDATE main SET FName=?, LName=?, age=? WHERE id=?', (dataFName, dataLName, dataAge, record))

This part of things worked without a hitch. But when, in the same function, I ran

cursor.execute('DELETE FROM main WHERE id=?', (record))

I would get the "unsupported type" error.

The problem, I learned, was with the variable "(record)" that I was passing to the SQLite statement. The data passed needs to be a tuple and I was not providing one. I was providing a bare string!

I was puzzled by the fact that the edit statement worked and the delete statement did not. Then it dawned on me: I was inadvertently creating a tuple in the edit statement-- purely a by-product of passing multiple variables across. It just happened to create the tuple I needed without my realizing it. With that in mind, all I had to do was make a very small change to the delete statement so that it created a tuple. I added a comma after record so that the statement read:

cursor.execute('DELETE FROM main WHERE id=?', (record,))

And that was all it took! Lesson learned. Things are moving along nicely with this hurdle out of the way and I hope to soon be finished with the app.
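The distinction is easy to see at the interpreter. A quick illustration (using a variable named record, as in the statements above):

```python
record = "42"

# Parentheses alone are just grouping -- this is still a string.
not_a_tuple = (record)
print(type(not_a_tuple))    # <class 'str'>

# The trailing comma is what actually creates a one-element tuple.
a_tuple = (record,)
print(type(a_tuple))        # <class 'tuple'>
```

This is why sqlite3 rejected `(record)`: with no comma, Python never builds a tuple at all.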

Sunday, May 31, 2009

A Full HDD = Trouble: A Cautionary Tale

This weekend I learned an unexpected lesson in system admin: what happens when the partition containing my /home/ folder runs out of free space.

While doing a re-installation of my system after hosing my GRUB, I backed up a few GB of documents to my home folder from a location that was to be wiped out. That extra disk consumption, combined with a large file download I initiated after bringing my system back online, was enough to fill up the partition where my /home/ folder is located (for ease of re-installation in the event of a problem, I mount a second partition at /home/).

The first thing I noticed was that the file I was downloading suddenly registered a nondescript "error" at about 45% completion. After trying to re-start it a couple of times with no joy, I decided to restart the system. Bad idea. Upon logging back in, I noticed a lot of screwy behavior: Docky, which I use in place of a bottom panel, would not start and threw a lot of errors when I tried to launch it from the command line; my Compiz settings all reverted to defaults; Firefox launched upon login despite me not having configured my session that way; and other seemingly unrelated problems.

After a few logouts/restarts did not alleviate any of the problems, the wheels started turning. I quickly realized what had gone wrong and confirmed in Nautilus that I had in fact run out of space on my /home/ partition. No problem, I thought, I'll free up some space by emptying the Trash. Well, guess what: when the disk is full, it is not possible to do this (I guess GNOME needs some disk space to perform the operation). When I tried to empty the trash I got a progress bar and the message "preparing", but the operation halted without anything actually being removed.

Ultimately, I was able to get out of this pinch by moving a few GB of data from /home/ to / via the command line. Once that space was freed up, I was able to empty the trash, re-arrange a few things, and everything went smoothly from there.

Everything appeared ok until I tried to load up Banshee and play some music. I loaded a playlist and clicked play, and playback halted after the first five items failed to play. I launched from the command line and tried again, at which time Banshee threw the error:

GStreamer resource error: NotFound

A bit of googling suggested that Banshee's database had become corrupted. That fit: I was importing some new media at the time I ran out of disk space, so the failure must have happened just as data was being written to banshee.db.

So, as suggested on a few sites, I backed up and deleted the file ~/.config/banshee-1/banshee.db. Sure enough, I was able to re-import my music and successfully play it again.
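The back-up-then-delete step can be sketched in a few lines of Python. The helper name is mine, and the path is simply where Banshee 1.x keeps its database on my system:

```python
import os
import shutil

def backup_and_remove(db_path):
    """Copy db_path to db_path + '.bak', then delete the original.
    Banshee rebuilds its database on the next library import."""
    if not os.path.exists(db_path):
        return None                      # nothing to do
    backup_path = db_path + ".bak"
    shutil.copy2(db_path, backup_path)   # keep a copy, just in case
    os.remove(db_path)
    return backup_path

backup_and_remove(os.path.expanduser("~/.config/banshee-1/banshee.db"))
```

Keeping the .bak copy matters: as I found out next, the old database may still hold information (like playlists) worth salvaging.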

Unfortunately, playlists did not go so smoothly. Before deleting the original DB I exported each playlist to an M3U file so that I could import them again after re-creating the music DB. I told Banshee to import the playlists and watched as it loaded each one-- with zero songs in each. This was a painful discovery: with about 10,000 songs in my library spread across 15 or so playlists, there was no way in hell I was going to go through re-creating them all. I was determined to find a way to fix this.

Luckily, I found one. When examining the M3U files in gedit, I noticed that for some reason Banshee had not saved the file locations properly. Rather than an absolute path of /home/rick/Music/artist/album/song, the playlist entries were /home/rick/artist/album/song. I have no idea why the /Music folder was completely left out of the file path, but I verified that this is what happened to each and every one of my playlists. A few quick "find and replace" operations later, my playlists were back in order again, and the episode had finally come to an end.
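The repair is simple enough to script. Here is a sketch of that find-and-replace (the /home/rick prefix matches my library; adjust it for your own):

```python
def fix_playlist(text, prefix="/home/rick/", missing="Music"):
    """Re-insert the missing directory component into each M3U entry.

    Banshee wrote /home/rick/artist/album/song instead of
    /home/rick/Music/artist/album/song, so splice 'Music' back in.
    """
    fixed_prefix = prefix + missing + "/"
    out = []
    for line in text.splitlines():
        # Leave comment/header lines (#EXTM3U, #EXTINF) and
        # already-correct paths untouched.
        if line.startswith(prefix) and not line.startswith(fixed_prefix):
            line = fixed_prefix + line[len(prefix):]
        out.append(line)
    return "\n".join(out)
```

Run each exported M3U file through this before re-importing, and gedit never has to enter into it.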

My takeaway from this? Pay attention to disk usage and don't run out of space again!
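That new habit can be reduced to a tiny Python check (shutil.disk_usage needs Python 3.3+; the 2 GB threshold is an arbitrary choice of mine-- pick whatever margin suits your partition):

```python
import shutil

def free_space_gb(path):
    """Free space, in GB, on the filesystem containing path."""
    usage = shutil.disk_usage(path)   # named tuple: total, used, free (bytes)
    return usage.free / 1024 ** 3

# Check the partition you care about -- for me, the one mounted at /home/.
if free_space_gb("/") < 2.0:
    print("Warning: low disk space!")
```

Dropped into a cron job, a check like this would have flagged the problem long before the Trash refused to empty.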

Now I just need to find out if the Banshee team are aware of the problem I ran into when exporting my playlists...

On Importing Wordpress to Blogger

One thing that held me back from moving to Blogger was the lack of support for importing my old posts from my Wordpress blog. It's no secret that Wordpress makes migrating from Blogger easy, but Blogger doesn't make it easy to migrate in the other direction. A bit of googling turned up a few scripts and sites that could supposedly format my Wordpress XML file for importing, but none worked except for wxr2blogger, which finally made the task possible.

I was not able to accomplish the import using the online converter for some reason (Blogger would choke on the file, despite it being only about 53 KB), but thankfully the command-line version did the trick. I ended up with six separate XML files, each containing a "chunk" of old posts. These imported successfully. For some reason, a couple of my posts did not show up in my dashboard, but it was easy to re-create them by snagging the HTML of each post out of the Wordpress dashboard and re-posting it here. So, for anyone who finds this post while searching for a way to migrate from Wordpress to Blogger: I recommend wxr2blogger.

Sunday, May 17, 2009

Update Manager Errors - Resolved!

For months now I've been getting a pesky error from Update Manager when checking for updates:

W:Failed to fetch http://archive.canonical.com/ubuntu/dists/intrepid/Release Unable to find expected entry main/binary-i386/Packages in Meta-index file (malformed Release file?)

This is probably something that most Linux users see in some form at one time or another, and while I had googled it a few times, I had never managed to resolve it. That was OK because, while a bit annoying, it was not a show-stopper: I could simply dismiss the error and continue on my way.

That was, until today, when I finally got around to upgrading to Ubuntu Jaunty. What had been a mere nuisance became a genuine problem: it caused the dist-upgrade to fail.

Determined to find a way around it, I again began by searching the Ubu-forums for just the tail end of the error (beginning with "Unable to find..."). This time I found a thread where someone had an issue similar to mine and had managed to track down the cause. The problem in my case was two incomplete lines in my sources.list, namely:

deb http://archive.canonical.com/ubuntu intrepid main
deb-src http://archive.canonical.com/ubuntu intrepid main

As one person on the forums put it, "They should all end intrepid something". Sure enough, commenting out these two lines resolved the issue! How I ended up with two incomplete lines, I have no idea: as a rule I try to avoid editing sources.list by hand to avoid issues such as this one, save when adding third-party repos (which these two were not).
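The fix amounts to commenting out the two offending entries. Doing it by hand in an editor is fine, but it can also be scripted; here is a hedged sketch (the helper is mine, not a standard tool):

```python
def comment_out(sources_text, bad_prefixes):
    """Return sources.list content with matching entries disabled."""
    out = []
    for line in sources_text.splitlines():
        if line.strip().startswith(tuple(bad_prefixes)):
            line = "# " + line          # disable, don't delete
        out.append(line)
    return "\n".join(out)

fixed = comment_out(
    "deb http://archive.canonical.com/ubuntu intrepid main\n"
    "deb http://us.archive.ubuntu.com/ubuntu intrepid main",
    ["deb http://archive.canonical.com/ubuntu intrepid",
     "deb-src http://archive.canonical.com/ubuntu intrepid"])
```

Commenting out rather than deleting keeps a record of what was there, in case the entries turn out to be needed after all.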

At any rate, the problem is solved at last and I am downloading the updates for Jaunty as I write this.

Thursday, April 23, 2009

Howto: sync music to your Verizon Wireless cell phone using Amarok!


NOTE:
You may notice that this same tutorial exists on the Ubuntu Forums. That's because I wrote it! This is a re-posting of that tutorial. So not to worry, no one is being ripped off. Scout's honor.

I have found that there is a dearth of information on ways to get music onto cell phones under Linux, so I'm hoping this tutorial will help many of you out there. I have tested this method and verified that it works with my LG Dare from Verizon Wireless as well as my girlfriend's LG Chocolate II (also from VZW). This method will probably work with Verizon's other BREW handsets (just about everything they sell that isn't a smartphone) because the music player software is very similar across the board but again, I've only tested using the two phones mentioned above. VZW phones are the focus of this tutorial.

As is to be expected, VZW provides no support for anything other than Windows, leaving users of other OSes out in the cold (and even their Windows solution is god-awful). My phone never even recognized MP3 files that I manually copied over to the proper folder on the memory card. Even if that had worked, who wants to copy songs over one by one? I created a playlist and wanted to send the playlist to the phone without a lot of copying and pasting.

This method may work with other phones as well-- so if you have luck with your phone, please post your results in the comments.

For the purpose of this tutorial, we will assume that you already have Amarok and have scanned your collection into the library. We will be using Amarok 1.4.10, the last 1.x release, since as of this writing Amarok 2 does not offer mobile device support. I have verified everything listed here under Ubuntu Intrepid, but other variants should work as well.

What you will need:
-External Memory Card (sorry, no way to do this with the phone's internal memory AFAIK)
-Card Reader
-AmaroK 1.4.10 (currently available in the repositories)

What you will NOT need:
-USB Cable/driver
-VZW's Rhapsody crapware
-DRM-protected music. Sorry, but DRM-protected files are outside the scope of this tutorial.


Ok, down to business!


Part A: Make sure your memory card is set up properly

1. Formatting
The phone expects a specific set of folders on the memory card in order to handle media properly (my_music, my_pix, my_flix, etc). If you have just bought a memory card, you may need to format it in order to ensure that these folders exist. The easiest way to do this is to insert the card into your phone and then just turn the phone off and back on again. The phone will probably create the folders for you at this point.

2. Mount the card
Insert your memory card into your card reader and mount it, then check the mount point if you don't know it already by navigating to it in Nautilus and checking the address shown (usually something similar to /media/disk/ ). Open the card and make sure the folders mentioned above exist. If not, create them manually.
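If the folders are missing, creating them is trivial, by hand in Nautilus or with a couple of lines of Python (folder names as observed on my LG Dare-- other models may differ; the /media/disk mount point is just the example from step 2):

```python
import os

# Folders the phone's media player expects at the card's root.
MEDIA_FOLDERS = ["my_music", "my_pix", "my_flix"]

def ensure_card_folders(mount_point):
    for name in MEDIA_FOLDERS:
        os.makedirs(os.path.join(mount_point, name), exist_ok=True)

if os.path.isdir("/media/disk"):    # only if the card is mounted here
    ensure_card_folders("/media/disk")
```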

Part B: Configure your memory card as a media device in Amarok

1. Choose Settings - Configure Amarok - Media Devices

2. Choose Add Device...
You will be presented with 3 options:
-"Select the plugin to use with this device:" -> Choose Generic Audio Player
-"Enter a name for this device (required):" -> Name the device whatever you like
-"Enter the mount point of the device, if applicable": -> Enter the mount point from Part A, step 2
Click OK

3. Click the Devices tab, then click the gears icon (to the right of the Transfer button) to configure additional settings for the device:
-Ignore the pre- & post-disconnect commands and transcoding options
-Leave all checkboxes at defaults (all unchecked)
-Song location: This part is important. You must tell Amarok to place the files in the my_music folder in the phone. Configure the filenames however you please but make sure that the files will NOT be nested in any folders underneath my_music. For example if you leave it at the defaults, Amarok will create a hierarchy of my_music/artistname/albumname/filename.mp3. If you sync using this hierarchy, the phone won't see your music when you are finished.

The path asked for here is relative to the root directory of the memory card (e.g. /media/disk/ in this example) so you don't need to specify the path to the card. The format I use is simply: my_music/%artist-%title.%filetype.
Click OK
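To make the flat-layout rule concrete, here is what that format string produces, sketched as a hypothetical helper (not part of Amarok, just an illustration):

```python
def target_filename(artist, title, filetype="mp3"):
    """Mirror the Amarok format string my_music/%artist-%title.%filetype.
    Slashes in tags would create subfolders the phone can't see,
    so replace them with dashes."""
    clean = lambda s: s.replace("/", "-")
    return "my_music/%s-%s.%s" % (clean(artist), clean(title), filetype)
```

Every track lands directly in my_music, one level deep, which is exactly what the phone's player requires.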

Part C: Sync your tunes

1. Under the Collection or Playlists tab, choose the songs or playlists you want to sync, right-click and choose transfer to device or sync to device.

2. In the Devices tab, choose the audio player you set up in Part B from the dropdown at the top of the Devices section, then choose Connect.

3. Click the Transfer button and watch your music sync!

4. Once the transfer has completed, click the Disconnect button, unmount the disk in Nautilus, remove it from the reader and insert it into your phone.

5. Start your phone's music player. There should be a one-time initialization where it reads through the track info and loads up your songs. Once you see that, you know the phone has successfully recognized the music on your memory card.

6. Enjoy your music!


That's really all there is to it. Once you have done the initial setup, just repeat Part C to sync/resync additional music as you please. If you find alternative methods or find any issues with this tutorial, please post a reply. I will update this post as necessary.

Troubleshooting:
If your phone does not see your music, check that your files all sit directly inside my_music, with no subfolders (see Part B, step 3).
If you have issues, please post as much info as you can, including:
-The symptom you ran into (such as "my phone doesn't see my music")
-Whether you received any errors from Amarok (and if so, what the errors are)
-Wireless carrier and phone brand/model
-Phone software version: where this can be found depends on the phone. For example on my LG Dare, I find this under Main Menu->Settings&Tools->Phone Info->SW/HW Version. My firmware version is VX970V05.

I can't guarantee I'll be able to help, but having worked for the big "V" for 3 years (not anymore, thank goodness) I have a fair amount of phone-savvy and I'll help if I am able.

Notes:

1. The tracks I transferred were all mp3 files with data stored in ID3 tags. I have not experimented with other formats.
2. Album Art embedded in the ID3 tag will be read by the phone's music player automatically! This seems to be the only way to get artwork onto the phone in a format that the phone will recognize.
3. The phone didn't recognize any of my genre tags, although it did pick up on Artist/Album/Song Title.
4. Make sure not to use nesting of files under my_music - they must ALL be in my_music and NO other subfolders.
5. As I mentioned, I used this method to sync a playlist I created from my library on my desktop to my phone, but the playlist itself did not end up being visible in the phone's music player. This wasn't a big deal to me, so I left it at that.

Further Reading: Howardforums.com is an excellent source of information for cell phone enthusiasts. You will be able to find a lot of related info such as how your music player works, other known tricks, etc. Couldn't hurt to read up on your phone!

Wednesday, March 25, 2009

Throwing in the towel

It has been five weeks since I purchased a Hauppauge WinTV-HVR 1800 for my desktop box at home. The goal I had in mind when I purchased it was to take my fledgling networked home theater to the next level: in addition to streaming music and movies from my desktop to the downstairs TV (via XBMC) and all other computers in the house, I wanted to be able to watch/record TV on the desktop (which is in my bedroom) and yes, stream my recorded programs to the rest of the network too.

However, things did not go so well once I installed the card. Unfortunately, the state of support for such hardware under Linux is not as good as it could be. As it stands now, few cards support OSes other than Windows in any official capacity -- very few indeed. My Hauppauge card was no exception. Prior to making the purchase I did a good deal of research and found that, according to many reliable sources, it would work thanks to some 3rd-party drivers. Even linuxtv.org lists the card as functional.

Unfortunately, I never was able to get the card working properly with analog signals: no matter what I did, I could not get audio. This is a deal-breaker as I have no cable box in the bedroom and don't care to pay an extra $10 to get one, thankyouverymuch. I also need analog because another plan I have is to hook up a VCR to the card and capture old home movies from VHS. I am not the only one who had issues, either. I even went so far as to compile a new mainline kernel that hadn't yet made it into Ubuntu as it includes Steven Toth's V4L-DVB drivers. Alas, no joy.

So tonight, after several weeks and countless hours spent futzing with it, I have thrown in the towel. Tomorrow the card will be shipped back to newegg and I'll begin hunting for a card with which I may have more luck.

Monday, February 9, 2009

A (somewhat) Sad State of Affairs for Linux iPod Users

I have a 30GB iPod Video (5th generation) that I use with Ubuntu Intrepid. My music collection is larger than the capacity of my iPod, so I can't simply sync everything-- what I do is sync specific playlists. This, coupled with the fact that I view album art support as a must, creates a specific need that not every program can satisfy.

I've tried out most of the popular players and IMO, the only one that really does everything properly is AmaroK 1.4. Here's why the others don't live up:

Songbird: Can effortlessly sync my playlists and is (AFAIK) the only program that can "restore" your iPod to factory defaults like iTunes can, BUT has no support for artwork (yet). This is a deal-breaker for me. Another strike against Songbird in my book is that it ignores GNOME's window manager and uses its own.

GTKPod: I have to admit I hate this one. The things you have to do to get stuff onto the iPod are just silly, and the interface is not intuitive. Also, there seemed to be no easy (possible?) way to sync just the playlists I want-- I can transfer a whole playlist, but when I update it there is no quick way to sync the changes to the iPod. It does support artwork, however.

Banshee: Supports artwork. Does a pretty good job overall but again, no real "sync" feature. Like with GTKPod, I'm stuck having to drag and drop things I want and can't just sync changes to my playlists. This one is a bummer because Banshee is my preferred app for library management & playback on my desktop.

Rhythmbox: Basically the same as Banshee with respect to syncing. I didn't bother to check whether it will put the artwork on my iPod.

...which brings us to...

AmaroK: I'm not a big fan of KDE apps in GNOME due to the differences in look and feel (especially the ugly KDE 3.5 apps), but AmaroK 1.4 just does everything I want, the way I expect. It supports selective syncing (e.g. my playlists, even when I update them) and puts the artwork on the iPod. AmaroK 2 doesn't have device support yet, so I'm sticking with 1.4 for this sole reason.

If not for AmaroK 1.4, I would have to rely on a Windows app under WINE or VirtualBox, which would just add another layer of complexity to the whole thing. The biggest annoyance would be having to edit my playlists to be Windows-friendly (changing every slash to a backslash in a text editor), then save a "Windows" copy for the syncing player to use. This is on top of my current process, which is:

1. Edit playlist(s) using Banshee, my preferred app
2. Export the playlist
3. Import the playlist to AmaroK
4. Sync

So even as it stands now, I have to do a couple more steps than I'd like (if only Banshee synced the way I like, I wouldn't have to bother with exporting/importing the playlists like I do).

"Why not use AmaroK for your music playback?"

Not a bad idea, and I actually used to do this-- but now that I have a burgeoning video library in addition to my music (for use with XBMC downstairs or playback on the computer in my room), I like that I can use one app for both audio and video.

So, seeing as 2009 will certainly be the year of Linux on the Desktop, I hope this will improve with time. In the case of Banshee, which is under active development and becoming more and more popular, I'm sure it will. I eagerly await future releases of what has become my player of choice lately.

Wednesday, January 28, 2009

Reading Material

Due to the business my company is in, publishers often send promotional copies of books in hopes that we'll republish their content. Some of what we receive are technical books. Recently, as sometimes happens, our marketing department was good enough to give away books that are no longer needed -- whether they were used or not. I was lucky enough to score several technical reference books. One of them was Mark Sobell's A Practical Guide to Ubuntu Linux, second edition.

After just a short time, I'm already 120 pages in and have learned a lot about what makes the OS tick and what forces drive GNU and Linux in general. It has been an invaluable resource and is already helping me to better administer my desktop PC, where I have been using Ubuntu Linux for nearly two years (since just after the Feisty release). If you have the means, I highly recommend this book. It is well worth the while!