sql vs. sql

Once again, UPHPU has had a minor stir about which database is faster / better / stronger, PostgreSQL or MySQL. All fanboyism aside, who really cares?

You want to know the way to *really* speed up your database? Normalization is probably going to be the largest factor. After that, use views, stored procedures, indexes, transactions and well-written queries, and your database is going to fly amazingly fast.
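
To make the “indexes” part concrete, here’s a toy sqlite3 session (assuming sqlite3 is installed; the table and column names are made up) showing the query planner go from a full-table scan to an index search once a single index exists:

```shell
# Create a throwaway database and a table with no indexes.
db=$(mktemp)
sqlite3 "$db" "CREATE TABLE users (id INTEGER, name TEXT);"

# Without an index, the planner has to SCAN the whole table.
sqlite3 "$db" "EXPLAIN QUERY PLAN SELECT * FROM users WHERE name = 'bob';"

# Add one index on the column we filter by ...
sqlite3 "$db" "CREATE INDEX idx_users_name ON users (name);"

# ... and the same query becomes a SEARCH using that index.
sqlite3 "$db" "EXPLAIN QUERY PLAN SELECT * FROM users WHERE name = 'bob';"
rm -f "$db"
```

The same principle applies in any engine; the planner output just looks different.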

At work we have a large server we call “Zeus” because it is incredibly large. I won’t even go into specs, because you wouldn’t believe me even if I told you. When I first started working here, the database running on it was incredibly slow. At first I blamed it all on the database software we are using (you can search my blog if you really wanna know which one it is. Hint: it’s neither of the two mentioned above), but once we cleaned up the databases and tables, removing columns that were complete cruft, and then did everything I mentioned above, this puppy flew. In fact, our “dev” database, which is running on nothing more than an Athlon XP 1800+, runs just as fast as our beast-monster does.

That’s how you get a fast database — doing things the right way. Who would have thought?

I have to apologize for the elitist feel of this post, but my point is this … the only magic bullet in improving performance is going to be quality code and design. Just replacing your database with something else isn’t going to make the speed fairy sprinkle your application with love.

a small comics trip

This entry is going to be a short little ditty since I’m at home being sick, and I can’t really think straight. I did want to at least get a small note up about this, though.

I read Scott McCloud’s online webcomic “Hearts and Minds” today. I’ve been meaning to finish reading that for a while now. My timeline might be whacked, but I swear I remember when he first put it online, though that doesn’t seem to jibe just right, since I would have been in Argentina at the time. But, whatever.

It was great stuff, though. I used to have the Zot books, volumes one and two, once upon a time. I probably sold them off in one of those phases of life where I was either completely broke or just cleaning out everything. I wish I had a copy, now.

That got me thinking about how much I used to be into comics, though. Man, was I ever. My favorites are the old EC Comics reprints of the sci-fi and horror stories. If you read too many of them though, the lettering starts to really bother you.

I grew up on comic books though. The first graphic novels I started reading were the Sandman series (gosh, that was a long time ago), but those were a little too graphic for me — I have a really low threshold for gore and bloody violence. I haven’t really picked up any since. I should find some.

I’ve got a box full of some old Gold Key comics I started collecting about a year ago. The Walt Disney ones are great stories. Nothing they’re putting out today compares, although that’s not saying much. Even for their time, though, the stories were better than anything else they had going in any other medium (I’m talking about movies).

I also discovered some Twilight Zone comics which are great, pretty much in the same vein as the EC comic books, but these are actually a tad more extreme in some cases. Still, they are a real treat. Great stories, cool colors.

I gotta get back into getting some comics again. The great thing about it is it gets me into *reading* which I never do, anyway. There’s just so much good stuff out there, too. Craziness.


For the first time in my life, I saw an episode of Nova on PBS last night. It was great. Nova is one of those things that I always knew I would enjoy if I ever sat down and figured out when it was on and then watched it. Hey, it only took me 18 years to get to that one.

The show was pretty awesome, though. The title of the episode was “The Elegant Universe” or something like that, and it was about how Einstein wanted very badly to believe that, deep down, the universe can be explained in an orderly fashion. The show covered all kinds of stuff like gravity, electromagnetism, string theory, physicists from the early 1900s, and quantum mechanics. Very cool stuff.

For the record, I’m a real PBS junkie, too. I’ve got basic cable at home (only $12/mo, woots) and I get about thirty channels or so. Thankfully, Utah has about four public television stations (UEN, KBYU, KUED, and I know I’m forgetting something else … BYUTV maybe?) and there’s always something worth watching on one of them. It’s great.

Here’s some random stuff I picked up:

  • Einstein was a bit of a star in his younger days
  • String theory holds that there is a single explanation that covers everything (I think)
  • Gravity is very weak compared to electromagnetism, and on the atomic level is almost completely non-existent
  • Quantum mechanics deals with measuring the probability of certain outcomes, since nothing is ever certain. Einstein refused to believe that, because he couldn’t accept that, at the smallest level, things were disorderly.

do you ever get the feeling …

… that no matter how much stuff you know, there is still a ton of stuff you don’t know?

Case in point, I’ve been working on writing ebuilds tonight, and it can be some incredibly difficult stuff.

First I started working on an ebuild for the wis-go7007 kernel drivers. Talk about biting off more than you can chew. I have to use the kernel eclasses, and I was looking at nvidia-kernel as a reference, but I couldn’t figure out why it was using eerror to get the output of linux_chkconfig_module.

The next one that I worked on (dev-libs/libebml) was changing something in the Makefile for Mac OS, and I had no flipping clue what it was changing or why. I’ll have to ping flameeyes on that one when he gets back.

Another one (media-video/oxine) has a lot of stuff to check for to make sure the deps were built correctly, and I couldn’t think of a good way to exit out if it was missing some without setting a bash variable.
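
For what it’s worth, the usual ebuild answer to “bail out without setting a bash variable” is portage’s die, which aborts the build on the spot. Here’s a minimal sketch of the pattern, with a stand-in die definition so it runs outside portage (the function name and the path being checked are invented for illustration):

```shell
# Stand-in for portage's die; in a real ebuild, die is provided for
# you and kills the build immediately.
die() { echo "ERROR: $*" >&2; return 1; }

# Check a dependency and abort right there -- no flag variable,
# no separate "did anything fail?" check at the end.
check_dep() {
    [ -e "$1" ] || die "missing dependency: $1"
}

check_dep /bin/sh && echo "dep ok"
```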

On a good note, the one for mkvtoolnix came together great. That, and I fixed my ALSA settings so my speakers don’t clip anymore.

I’m still bummed, though. I really need to learn more bash, it seems. I’ve been putting off learning other languages for a very long time now, since I’ve been so proficient in PHP, and now it’s really starting to bite me in the butt for not branching out.


I’m gonna go get on my laptop, sit on my couch, crack open my awesome C++ book, and get learning. I figure if I could learn both extremes (C++ pretty hard, PHP pretty easy) then the stuff in the middle should make a lot more sense, right?

random passwords

I needed to write a random password generator today, and instead of bothering to search for some code that I could copy and paste, I just whipped up my own real quick.

Then I got thinking about how all those Perl monks brag about how they can cram all this code into as few lines as possible, so I crammed mine into just one line. :)

Here it is:

for ($x = 0, $password = '', $password_len = rand(8, 24), $alphabet_range = range('A', 'z'); $x < $password_len; $x++) { $password .= $alphabet_range[rand(0, count($alphabet_range) - 1)]; }

The great thing is it spits out cool passwords like this:

  • luhRDKyMDX\oey
  • rbKGTt`dLHWkHQZ
  • LJyeO_UwTpQzIni
  • ltonXGkho[e\JWbBl^uyEL
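
Just for kicks, here’s a rough shell take on the same idea (assuming bash on Linux, since it leans on $RANDOM and /dev/urandom):

```shell
# 'A-z' in tr is the same ASCII run that PHP's range('A', 'z') gives
# you: uppercase, a few punctuation characters, then lowercase.
len=$(( RANDOM % 17 + 8 ))                       # random length, 8 to 24
pw=$(tr -dc 'A-z' < /dev/urandom | head -c "$len")
echo "$pw"
```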

Fun stuff. :D

more cowbell: planet larry

Introducing Planet Larry, an aggregation of blogs of Gentoo users. :)

A few of us in User Relations came up with the idea to have a planet feed that pulled subscriptions of blogs from our users that have anything to do with Gentoo. It’s not an official project, but that’s okay, it’s still just as cool. What we’d like to do is get anyone who writes about computer-related stuff and also uses Gentoo onto the feed. It’ll be a good way to see what you guys are up to, and at the same time hopefully bring the community together a little bit more.

So, if you use Gentoo and you blog, e-mail me your name and your blog’s website at beandog at gentoo dot org, and we’ll get you added. It’s that simple. :)


Man, I almost forgot! Star Trek II: The Wrath of Khan is playing on the big screen tomorrow night, at midnight, at the Tower Theater in Salt Lake.

I love watching movies on the big screen … especially when they are classics rolled again. I’ve seen such great stuff as “Watcher in the Woods,” “Superman,” “White Christmas,” and “Abbott and Costello Meet Frankenstein” all on the big screen. You just can’t pass up an experience like that. I know there’s a lot more, I just can’t remember them right now.

So, if anyone else goes, just look for me … I’ll be wearing my Starfleet Academy t-shirt. :D

onto live-action next, baby!

My cartoons are almost done. I took a look at the queue last night, and there were 99 episodes left … 97 of them on one machine. I still haven’t coded it so that you can (easily) reassign the episodes in the queue from one machine to another. All it would really take, though, is moving the files from one machine to another and then a funky sub-select query to update the queue. I’d go into details, but it would be extremely boring. Needless to say, I just need to get off my duff and finish polishing up the frontend admin.

Anyway, since the cartoons will finish encoding today, now I’m left with a completely different beast — live-action shows. The same generic mencoder settings I’m using for cartoons work fine on these as well, but they don’t look quite as nice. It’s probably going to be back to the drawing board for a few days as I resume testing and research.

A while back, I thought I had the perfect solution. Since all I really needed to do, so that transcode could process them, was fix the variable framerate, I would just encode them, copying the audio and video, and force the output fps. That actually worked great, right up until you go to encode the result a second time.
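
For reference, that first copy/copy pass looks roughly like this with mencoder (the filenames are invented; 24000/1001 is the NTSC film rate):

```shell
# Copy both streams untouched, but force a constant output framerate.
mencoder episode.vob -ovc copy -oac copy -ofps 24000/1001 -o episode-cfr.avi
```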

To be honest, I’ve only tested that idea on two series so far, and they were both TV shows from Universal Studios (A-Team and Murder, She Wrote, in case you were curious). I don’t know if I’ve mentioned this before, but Universal makes pretty much the worst discs of the entire lot. The Incredible Hulk is the first series I’ve gotten from them where I could actually play all the discs. And I’m not just talking about my computer being unable to play them; they throw the DVD drives’ firmware for a loop as well.

Depending on which DVD drive I’m using, some of them might be able to actually pull the entire file off. My Pioneer and Lite-On drives work great, while the Sony one would lock up my computer completely, and I’d have to hard-boot it. In fact, I’ve been pleasantly surprised by the results with my Pioneer drives. They throw a lot of error messages to the system log, but if a disc really freaks out, all I have to do is forcibly open the tray, and the drive actually resets itself! Normally when I have to rip a disc out myself, the drive won’t be able to read anything until I reboot the computer to kind of clear the drive’s head. The Pioneer drives, however, still work. After I take the disc out, the open/close button still works, and I’m able to keep on ripping. I’m impressed.

So, my sampling of the worst discs in the bunch may not be the best testing approach. When I tried the A-Team one, it worked fine, but you could tell where it was switching framerates. Well, I need to clarify a little bit: the first copy/copy pass where I just used -ofps, *that* video looked great. It was the transcode from that copy that looked like crap. Part of the problem might have been that I had to reindex the video because the AVI stream was broken.

One thing I could do is a near-lossless encode on the audio or video, so it at least processes the media file, albeit lightly, and the index gets written without having to rebuild it. The audio would be pretty simple … just use ‘pcm’ as the output codec, pretty much copying it as a wav file. The video would be a little harder. I suppose I could export it as an MPEG-2 stream again, and see what that does. Hmm, I hadn’t thought of that. Another idea is to give up on the AC3 stream and encode the audio to MP3 or Vorbis or something. Doing any kind of encoding will keep the index intact. I’m just trying to find a solution that will leave it as near the original as possible.
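
The PCM idea, sketched as a single mencoder invocation (filenames invented): video is copied untouched, audio is re-encoded to PCM (essentially a WAV copy), so mencoder writes a fresh, valid index along the way.

```shell
# Near-lossless "touch" of the file: copy video, rewrite audio as PCM.
mencoder episode.vob -ovc copy -oac pcm -o episode-reindexed.avi
```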

I think that broken AVI index is what’s causing the awful artifacts, though. Well, either that or the fact that it’s dropping (or duplicating, depending on what your target fps is) frames on the first encode. What happens, though, is you take that re-indexed video with the forced standard framerate, and *then* you run it through transcode. Again, I haven’t done much testing at all, just tried a two-pass encode on two different files. On Murder, She Wrote, the second file I tested, it actually looked really good for about the first twenty minutes. Then the entire thing was offset horizontally by about 60%, and there was a nasty green overlay. I wish I still had that original transcode; I could have posted a snapshot.

Whoops … one thing I just noticed is that my little reindex script was setting the output framerate to 29.97 (30000/1001) instead of 23.976 (24000/1001). That definitely wouldn’t work well. I wonder if that’s how I was doing it on those first tries. I don’t think I was, but that might explain quite a few things. Time for more testing!

Well, this post ended up much longer than I had intended, but it definitely got the wheels turning in my head a little bit. I’ve got a few ideas now that I can’t wait to try out. There may just be a perfect solution yet. :)