Category: "Linux"

E-mail! E-mail! E-mail!

December 23rd, 2010

Yesterday I worked on the email server that I have. A prerequisite to understanding email and email server management is understanding that the amount of time it takes to do anything with an email server is "LARGE". So I was thinking to myself that I should work on the spam filter so that it could detect Ham and Spam according to the user account. In my head the change wouldn't take too long, which, according to the second sentence, was an error in thinking. So I proceeded, and I had to read up on how to get SpamAssassin to start detecting according to the user. For my first attempt I read a little article on the old Gentoo wiki archive website and attempted to implement it. My first failure, because the Bayes algorithm didn't load. So I had to keep reading documentation, and after about an hour more of poking around in it I learned that I actually needed a file to set up the database tables. I looked for the particular file on my computer with no luck. I then searched the Internet and found the file in the Apache SpamAssassin repositories. I then ran the SQL script to create the tables. Now I started up SpamAssassin and the result was good, except that SpamAssassin didn't know what user account each email belonged to. SpamAssassin was crediting all of the emails to be learned to the "mail" account, which is not how I wanted that to work.
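For reference, the per-user Bayes-in-SQL setup boils down to a few lines in SpamAssassin's local.cf. This is a sketch, not a copy of my actual configuration: the database name and credentials below are made-up examples, and the table-creation script ships with the SpamAssassin sources (for MySQL it's the bayes_mysql.sql file):

```
# local.cf -- store Bayes data in SQL, keyed per user
# (sa_bayes, sa_user, and the password are example values)
bayes_store_module  Mail::SpamAssassin::BayesStore::SQL
bayes_sql_dsn       DBI:mysql:sa_bayes:localhost
bayes_sql_username  sa_user
bayes_sql_password  sa_password
```

With that in place, sa-learn and spamd keep a separate token database for each user instead of lumping everything under one account.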

The most frustrating part of the whole set of email changes was creating the regexp for getting the user account. To get the email account information I needed to do a search and find just the username, which was embedded in a bunch of directory stuff. I got the regexp to work and find just what I wanted on the command line, and I thought to myself, "victory!" I couldn't have been more wrong, because as soon as I attempted to put the regexp into the script: "failure!" I then spent two hours poking around trying to get the regexp to work inside the script. I finally succeeded without resorting to a lot of bashisms. With that I was able to get the changes into the two scripts that run each night to learn Spam that is in the Spam folder and Ham that is in the Ham folder. While making those changes I noticed that the scripts hadn't been learning new Ham and Spam for the past three or four years. The scripts were written back when the server was set up for Qmail, and after a couple of years I decided to move to Courier as the email server. Courier changed the folder the email was located in, and I had missed that particular change when I made the other changes to the scripts. Now the Ham and Spam scripts correctly read the Ham and Spam folders, which is nice. I then had to modify the maildroprc file to make incoming email learn correctly according to the user account. Maildrop, I feel, is not documented in the clearest fashion. I spent two hours trying to figure out what the predefined variables were and what values they were assigned. Once I figured that out I essentially implemented the same filter that I used in the scripts. Now Spam and Ham were being learned correctly and getting inserted into the database.
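As a sketch of the kind of extraction I mean (the /home/vmail/&lt;domain&gt;/&lt;user&gt;/ maildir layout here is a made-up example, not my actual paths):

```shell
# Pull the account name out of a maildir path with sed.
# The directory layout below is an assumed example layout.
path="/home/vmail/example.com/jdoe/.Spam/cur/1294500000.M1P2.host"
user=$(printf '%s\n' "$path" | sed -E 's|^/home/vmail/[^/]+/([^/]+)/.*$|\1|')
echo "$user"    # -> jdoe
```

The same pattern works in a nightly sa-learn script or, with maildrop's syntax, in maildroprc; getting the quoting right inside the script is exactly where my two hours went.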

I was still left with the web email clients not working correctly. I had Horde installed, but I figured I would install a second client called Squirrelmail. The logic is that if one breaks, the other will still be working. I was able to get Squirrelmail installed and set up fairly easily. I then needed to modify my website for the changes to the email options. Making the website changes took me a little while because I didn't completely remember how to do the tables. I coded the website myself, but I had forgotten how the XML was set up. I also wanted stylesheets so that I could do fancy things with my tables. After I got that working I turned off the PHP warnings, and that fixed both Horde and Squirrelmail. Now there is a choice of clients and the email filtering is working correctly.
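For reference, turning the warnings off amounts to a couple of lines in php.ini; this is a sketch of the usual settings, not a copy of my exact file:

```
; php.ini -- keep notices and warnings from being printed into pages
error_reporting = E_ALL & ~E_NOTICE & ~E_WARNING
display_errors = Off
```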

With the Ham and Spam scripts working I won't be getting the busty babes in my email box. I won't be missing them at all, shucks. I really don't like Spam anyway because it adds so much noise to the email box. No more online diplomas or bank transfers either.

Computers

July 24th, 2010

Every computer has a problem with it. Nothing you buy is well tested in all the possible areas, and there is always something broken. Some things can be easily covered up by software and other things can't be covered up so easily. The problems that I'm referring to are the ones that can't be fixed by software.

I myself have five computers ranging in age from four to ten years old. I just remembered that I was also given a computer that is about thirteen years old and runs an AMD K6 233MHz processor. So I feel that I should give a list of the hardware problems on each of the computers that just don't get fixed.

  • Computer 1 - AMD K6 233MHz processor
    1. The computer has problems with bus mastering.
  • Computer 2 - AMD Athlon 650MHz processor
    1. The computer chipset has a problem with signal integrity when the AGP bus is run faster than 1x speed with particular loads on it. This makes the computer lock up when loading X.
  • Computer 3 - AMD Athlon 800MHz processor
    1. Same as Computer 2. This computer has a motherboard with hardware monitoring, is more stable, and is used as a server.
  • Computer 4 - AMD Athlon 64 X2 3800+ processor with ECS mainboard
    1. This computer has a network card that flakes out with interrupts. The problem is very sensitive to which video card is in the computer. I installed an add-on card to work around this bug.
    2. The DVD burner also flakes out after a while and goes into a power saving mode.
  • Computer 5 - AMD Athlon 64 X2 4200+ processor with ASUS A8N-SLI Deluxe mainboard
    1. The onboard sound sucks really badly. Whoever laid out the board didn't properly ground the analog audio portion, so there is coupling of I/O signals onto the rear output signals. It is really annoying to listen to, especially with my Altec Lansing speakers. I bought a Sound Blaster X-Fi Titanium to resolve my sound issue.
    2. The second onboard gigabit ethernet port doesn't work with 4G of RAM installed. Either the interrupts aren't mapped correctly by the BIOS or the ethernet hardware wasn't engineered correctly. Apparently this particular Marvell chip is broken with 4G of RAM regardless of motherboard vendor.

So there you have it. Every computer I have has something buggy that requires some sort of workaround. Sometimes the workaround requires more hardware and money, and other times it is just a nuisance and can be dealt with by running something slower.

The Hard Drive Saga

July 24th, 2010

So I processed the fireworks video the week after I watched them. However, a big problem occurred. I had the videos ready to be put onto the website, and I had the video converted into a lossless h.264 file, which took a day to compress. Then when I went to plug the drive into my main computer I first tried to plug it in correctly on the back of the case. My ASUS A8N-SLI Deluxe motherboard came with an external connector for plugging in SATA drives. The panel doesn't meet the eSATA spec, but I use it for the convenience. Well, my first attempt to plug in the molex connector was not successful, so I rotated it around and made another attempt, and suddenly the computer turned off and there was a spark. The computer wouldn't turn on with the drive plugged in. My second attempt, it turns out, had plugged the drive in backwards. I got it plugged in correctly and the computer still wouldn't turn on. The power supply on the computer smelled like something had gotten hot, too. I thought that maybe the computer was dead, but I unplugged the 300G hard drive and the computer turned on. I was glad that the power supply had short circuit protection; it saved me a lot of money. So far no smoke had been released. I plugged the 300G hard drive in again and immediately the computer turned off again. Still no smoke had been released.

The next step was to unplug the drive and try it in the other computer. The other computer was on, and SATA is hot pluggable. I plugged in the hard drive, and then smoke started spewing and the second computer's power supply spun its fans up to full speed. I unplugged it as soon as I could. The whole process took only a couple of seconds to unfold. So now smoke had been released from the hard drive.

Next I created an RMA with Seagate to have the drive replaced. The RMA process is easy, but packing the drive was not. Seagate has very strict packaging requirements for their hard drives, and meeting them cost me nearly as much as buying a new drive would have. Shipping the drive cost about $35, which includes both the packaging cost and the shipping cost.

The data that was on the drive was far more important than the hardware, which is how it typically is now. Big hard drives store hundreds and sometimes even thousands of dollars of purchased music, videos, and other data. Some of the data didn't cost money, but it cost a lot of time because it is original work.

The drive arrived at Seagate, and it took them about four days to acknowledge receipt of the package in their system and then ship a new hard drive. They shipped the drive on a Friday, which, because UPS doesn't move packages on Saturday or Sunday, meant that the package would just sit in the UPS warehouse near McAllen, Texas over the weekend. The drive made it to my place the following Thursday. I opened it up and it was a different drive model: a Seagate 7200.10 instead of the 7200.9 that I sent in. That meant the drive I received used the new PMR technology, so I decided to make it my main system drive. The drive was also an upgrade to 320G with a larger cache. So I prepared the drive by formatting it and using the GUID partition table. I no longer use the MBR method.

I spent a couple of hours reading about how to properly partition the drive with GPT and then commenced the partitioning. I then formatted the drive. Then I rebooted the computer, booted into a livecd, and began the process of copying the OS. The copying appeared to be going well and no problems were noticed. Well, the copying finished and I went to install grub on the new hard drive with no luck. At that point I knew something was fishy because I couldn't recompile grub either. Well, Kathryn was coming over to watch a movie that night, so I stopped what I was doing for a couple of hours and booted the computer on the new drive by using the grub on the old drive to load the system on the new drive. The boot went well; however, I still couldn't load grub, and I would get the error "No such file or directory" while trying to run it. I couldn't compile the program either. All of the 64-bit binaries were working, and that was the most important part for now.

I finished watching the movie, "The First Wives Club", and then recommenced work on figuring out the problem. I took some time to google the problem and decided that there was something wrong with the toolchain. The toolchain includes binutils, gcc, and glibc. I reinstalled binutils just fine, but gcc wouldn't compile. I then tried glibc and the problem was fixed. I don't know exactly where the problem was, but I did try recopying the 32-bit libs from the old drive more than once and the problem persisted. Reinstalling glibc fixed the problem, which to me means that the old drive may have been having some problems with data integrity.

So I decided to repartition my 1TB hard drive and format it with a new EXT4 partition. I stayed up late copying the data over to my new drive and then over to the old 200G hard drive. The copying went smoothly, with nothing to cause me concern. I was tired and went to bed while it ran. I had also done a full format on the 200G hard drive using Seagate's SeaTools to make sure the drive was trustworthy and that, if there were surface issues, the bad sectors would be remapped before I copied the files onto it.

I woke up the next morning, deleted the old partition on the 1TB hard drive, and then proceeded to repartition the drive. The gdisk tool by default aligns partitions on the 1MB boundary to account for the new drives that report 512 byte sectors but really have 4096 byte sectors. Well, I deleted the partition, and when I went to format the drive the tool mkfs.ext4 said that the partition was still mounted. I unmounted the old partition and then proceeded with the format. After the format was done I copied all the files back onto the drive and everything looked good. I decided to reboot the computer just to make sure the 1TB drive, which is my home drive, would automatically mount. The mounting is done by the UUID, so if I mess that up the drive won't mount at boot. Well, the drive didn't mount as expected, and I thought to myself, what could have gone wrong? The first thing on my mind was that I didn't get the UUID correct. I checked, and the UUID didn't show when I ran the tool blkid. The partition wouldn't mount when I tried mounting it manually by the device name directly, so now I knew that something was up. I then tried sending some options to mount to try to force it, with no luck. I started to panic slightly because I had already deleted the files off of my other drives, since I thought everything had gone smoothly. Well, I got onto the Gentoo IRC channel and asked for some emergency help. Someone suggested that I just redo the partition map, and I tried that. I knew where the old partition started, so I went and did that and made sure that the partition started at sector 63, which is standard for the old MBR partition scheme. I then attempted to mount again and I had success! Now I'm not out a few hundred dollars and thousands of hours of personal time in original works.
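For anyone curious, mounting by UUID just means the /etc/fstab entry references the filesystem's UUID, the same value blkid prints. The UUID below is a made-up placeholder, not my real one:

```
# /etc/fstab -- mount the 1TB home partition by filesystem UUID
UUID=0a1b2c3d-4e5f-6071-8293-a4b5c6d7e8f9   /home   ext4   defaults   0 2
```

Since the UUID lives in the filesystem superblock, a partition table that no longer points at the filesystem's true start means no UUID shows up at all, which is exactly what I saw.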

I then redid the whole process of copying the files off the drive to my two other drives. I then redid the partitioning, and this time I didn't have the problem of the home drive being mounted when it wasn't supposed to be. I formatted the drive and had success there! I copied the files back and had success again, and now I have the partitions aligned on the 1MB boundary with new EXT4 file systems. It wasn't very funny, though, thinking that I had lost all of my data. Now I've been thinking that eventually I might get a backup hard drive and store it in a vault. I was very relieved after the final successes, and then I reprocessed the fireworks videos last night.

P.S. Make sure to unmount the partition before repartitioning the drive. It'll save a lot of stress and anxiety.

The Video Flow in Linux

January 8th, 2010

So I was given a camera for Christmas that can record video. This brought me to re-learning, or better learning, the video flow in Linux using the many utilities. If you are used to working with audio, you are still in a peaceful land. Doing video means going to the wild west, which hasn't been tamed yet.

The tools of the trade are libquicktime, ffmpeg, mplayer, cinelerra, mjpegtools, and xine.

I use a combination of libquicktime, ffmpeg, and mjpegtools to get the video ready for import into Cinelerra. Getting the video into a format that can be read by Cinelerra is rather tricky.

Cinelerra reads quicktime files the best and is able to render back to quicktime mov files the best as well. The recommended video encoding is mpeg4 or h.264. However, what most cameras record, even in those video codecs, won't load into Cinelerra. I have found that for flawless import I convert the video into raw YUV, which produces a rather large file, by running the following command.

 

ffmpeg -i $file -f yuv4mpegpipe - | y4mtoqt -k -o $output_file

 

The command loads the file into ffmpeg and pipes it out in a format called yuv4mpeg, which can be used by the mjpegtools. The command y4mtoqt then writes the data back out in the pixel format that was specified with ffmpeg. The -k option is used for when the color is reversed.

The next thing we need is to decode the audio and re-encode it into another file that will be imported into Cinelerra. I use ffmpeg to do that, and the audio codec I use is pcm_s16be. The command looks like this:

 

ffmpeg -i $inputfile -vn -acodec pcm_s16be -ar 48000 $outputfile

 

The flag -acodec specifies the audio codec and the -ar flag specifies the audio sample rate, 48000 Hz here.

The next thing is to load the files into Cinelerra and do the editing that you'd like. Cinelerra exceeds the scope of this entry but may be covered in a later one. Suffice it to say, to export I either use the yuv4mpegpipe to encode just the video, multiplexing the audio back in later, or I use the quicktime export and select one of the yuv options along with twos complement audio.

Once the project has been exported and there is one file with the audio and video multiplexed together then I use ffmpeg once again to encode to a smaller file. I use the following command:

ffmpeg -i $inputfile -vcodec libx264 -vpre slowfirstpass -s 512x288 -vb 400k -pass 1 -acodec libfaac $outputfile.mp4

This specifies libx264 as the vcodec to encode into the h.264 compressed video format. -vpre is a video preset for ffmpeg to use. -s sets the size of the output video, and if ffmpeg needs to resize, that will be done too. -pass 1 specifies the pass number, since I want to do two pass encoding to get the file well compressed at a fairly constant bitrate. -vb specifies the video bitrate that I want to use. I also specified libfaac for the audio codec so that I can get aac audio into the stream.

The next thing to do is the second pass of the video encoding. The first pass can take several hours depending on the length of the video. I run the same command with the exception of -pass, which I change to -pass 2, and -vpre, which also gets changed to -vpre max.

 

ffmpeg -i $inputfile -vcodec libx264 -vpre max -s 512x288 -vb 400k -pass 2 -acodec libfaac $outputfile.mp4

 

The file can be copied to the website, but the entire file will have to be downloaded before playback can start. To fix this there is a program called qt-faststart, which moves the file's index (the moov atom) to the front so playback can begin while the rest downloads. The command looks like this:

 

qt-faststart $inputfile $outputfile

 

Now I have a video file that can be copied to the website and is ready for viewing. The file is as small as I want and with the quality that I'd also like.

To summarize the commands that I run to make this all happen:

 

ffmpeg -i m4h00005.mp4 -f yuv4mpegpipe - | y4mtoqt -k -o m4h00005v.mov

ffmpeg -i m4h00005.mp4 -vn -acodec pcm_s16be -ar 48000 m4h00005a.mov

cinelerra

ffmpeg -i m4h00005.mov -vcodec libx264 -vpre slowfirstpass -s 512x288 -pass 1 -vb 400k -acodec libfaac m4h00005.mp4

ffmpeg -i m4h00005.mov -vcodec libx264 -vpre max -s 512x288 -pass 2 -vb 400k -acodec libfaac webvideo.mp4

qt-faststart webvideo.mp4 webvideo-fast.mp4

 

That is everything summarized. I hope that this helps someone get video editing working in Linux.

With ffmpeg the video can also be encoded into mpeg4 with mp3 audio and imported into Cinelerra. I prefer not to use mp3 since it is lossy, and that is bad when editing video and audio. So for mpeg4 video I use the following command to prepare files for import into Cinelerra:

ffmpeg -i $inputfile -vcodec mpeg4 -strict very -sameq -acodec pcm_s16be $outputfile.mov

That will get the video compressed and into a format that Cinelerra can read and export.