Friday 23 December 2011

Formatting Video for iPad

This, I have to say, has been one of the most gratifying things to work out in my entire experience of working stuff out. The iPad is an excellent device for consuming media, especially video. The problem is getting video in a format suitable for the iPad without forking over dosh to Apple.

The best I have been able to come up with to date is Handbrake, which has built-in presets for the iPad. What it doesn't do is batch convert a whole directory to iPad format.

So, joy of joys, I present the Ubuntu commands necessary for this (assuming you have installed the requisite packages, all of which are listed in my how to build a custom Live CD posts).

target_resolution="1024x576" &&
target_bitrate="2.5M" &&
minimum_bitrate="0k" &&
maximum_bitrate="3.5M" &&
audio_bitrate="256k" &&
audio_sample_rate="48000" &&
audio_channels="2" &&
input_extension="mkv" &&
number_of_threads="8" &&
for file in *.$input_extension; do \
ffmpeg -y -i "$file" \
-pass 1 -f mp4 \
-s $target_resolution -vcodec libx264 -threads $number_of_threads -b $target_bitrate \
-bt 100k -maxrate $maximum_bitrate -minrate $minimum_bitrate -bufsize 2M \
-flags2 +mixed_refs -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 5 -trellis 2 -refs 2 -coder 0 -me_method umh -me_range 90 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 \
-rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 30 -g 90 -an /dev/null && \
ffmpeg -y -i "$file" \
-pass 2 -f mp4 \
-acodec libfaac -ar $audio_sample_rate -ab $audio_bitrate -ac $audio_channels \
-s $target_resolution -vcodec libx264 -threads $number_of_threads -async 2205 -b $target_bitrate \
-bt 100k -maxrate $maximum_bitrate -minrate $minimum_bitrate -bufsize 2M \
-flags2 +mixed_refs -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 5 -trellis 2 -refs 2 -coder 0 -me_method umh -me_range 90 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 \
-rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 30 -g 90 \
"${file%.$input_extension}.mp4" \
 ; done

You fill in the details between the quotes at the beginning, then copy and paste the code into a command line in a directory containing video files with whatever extension you set as input_extension, and it will convert them all to .mp4 files suitable for playing on an iPad. Every file will come out at exactly the same resolution and bitrate, which may not be desirable, so check your source files in advance.

You can change the target bitrate to whatever you like, within reason, and the iPad will still play the files. You will want to check the aspect ratio of the incoming files carefully to make sure that these settings work. If they start in 16:9, this will spit them out in 16:9 at 1024x576; 4:3 files will need the target_resolution changed to "1024x768", for instance.
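A quick way to eyeball the resolutions before you start is to run each file past ffmpeg and pull out the video stream line; a sketch, assuming .mkv inputs as above ([ffmpeg -i] with no output file just prints the stream details and exits):

for file in *.mkv; do echo "== $file"; ffmpeg -i "$file" 2>&1 | grep 'Video:'; done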

Don't make the mistake I made of running several instances of this command at the same time from the same directory. Each shell keeps its own copy of the variables, but the two passes share a first-pass statistics log file in the working directory, so parallel runs will trample each other's stats.
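If you really must run batches in parallel, one workaround (a sketch; ffmpeg builds of this vintage have the option, but test it first) is to give each file its own stats name with [-passlogfile] on BOTH passes, along these lines for the first pass:

ffmpeg -y -i "$file" -pass 1 -passlogfile "${file%.$input_extension}" -f mp4 \
-s $target_resolution -vcodec libx264 -threads $number_of_threads -b $target_bitrate -an /dev/null

Alternatively, just run each batch from its own directory.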

As a bonus, here are the commands necessary to extract audio from a clip in an iPod/iPhone friendly format:

audio_bitrate="128k" &&
audio_sample_rate="48000" &&
audio_channels="2" &&
input_extension="flv" &&
for file in *.$input_extension; do \
ffmpeg -y -i "$file" -vn \
-acodec libfaac -ar $audio_sample_rate -ab $audio_bitrate -ac $audio_channels \
"${file%.$input_extension}.m4a" \
; done

Same rules apply.

Friday 11 November 2011

Fixing a fucked Grub2

I have recently encountered a fucked Grub2. This is the latest version of Grub that comes with, amongst other things, Ubuntu Natty. You can't fix it the way I described in the past. That's for Grub1 only.

Why did I end up with a fucked grub? I was stupid. It was entirely my fault. You see, I BELIEVED that this time, when I upgraded my Ubuntu installation from Natty to Oneiric, I wouldn't end up with an unresponsive pile of crap as the result. More fool me.

Because I am not a total buffoon, I made a drive image before attempting the upgrade. I thought I would give True Image Home 2011 another go at being an actual backup program rather than a pointless waste of time. Surprisingly it actually worked. Just about. It had a funny check box thing with some nonsensical question about hard drive ID or some such other thing. I didn't check it. In my experience checking boxes when you do not know what they do is a bad thing.

I now know what this box was asking. It was asking "do you want me not to fuck up your grub when I restore this image?". Oh dear.

So, I was left with a non-working grub. Happily, I managed to sort it all out by running these commands from a terminal window on a LiveUSB system.

First of all I mounted the recalcitrant drive into the LiveUSB environment. That sounds scary. In practice, it means I clicked on the "Places" menu and selected the anonymous drive that was the same size as my Ubuntu partition (rather than my Home partition) on the disk in question.

I then got my terminal window open, and found out where this partition had been mounted by running

mount | tail -1


This command prints the last line [tail -1] of the output of the command that tells you about all [mount]ed partitions. The partition we are interested in should be the last one, because we only just mounted it.

The output I got was along these lines:

/dev/[sdxy] on /media/[string of letters and numbers] type ext4 (rw,nosuid,nodev,uhelper=devkit)


That told me the name of the partition [/dev/[sdxy]] and where it was mounted [/media/[string of letters and numbers]]. That was all I needed to know. The next job was to double check that this was the right partition. To do that I just [l]i[s]ted the contents of the [boot] folder on that partition as follows:

ls /media/[string of letters and numbers]/boot


This properly displayed the contents of the boot folder, so I knew I was on the right track. The last step was to run the command to fix everything. This was:

sudo grub-install --boot-directory=/media/[string of letters and numbers]/boot /dev/[sdx]


Please note that I was installing the grub to /dev/[sdx] and NOT /dev/[sdxy], for whatever x and y I got.
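If you are unsure which is which, listing the partition tables spells it out; a sketch, with sda purely as an example:

sudo fdisk -l

# /dev/sda   is a whole disk (the sdx that grub-install wants)
# /dev/sda1  is a partition on it (the sdxy that it does not)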

And then, surprisingly, it worked fine.

Friday 4 November 2011

More Ubuntu Wifi Joy

My neighbours are flooding my house with super strength wifi signals. Or so it seems. I can't get a reliable signal in the upstairs room where the desktop machine is located. My router is already set on the "I hate my neighbours, make my signal as loud as possible" setting, and that is no longer doing the business.

The obvious solution is to change the channel on which the router is broadcasting. The router I use has an auto setting for this, which I presume means it scans around to find a clear channel and then uses that.
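You can also eyeball the neighbours' channels yourself and pick a quiet one by hand; a sketch, assuming the wireless-tools package is installed and that your interface is called wlan0:

sudo iwlist wlan0 scan | grep -iE 'channel|essid'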

This would be an excellent solution if solving IT problems was completely unlike playing fucking Jenga. Yes, this setting change has fixed the reception problem upstairs, but now the netbook refuses to detect the wifi at all. Fucking spiffing.

This is a driver and software problem. I know this because I can boot to a USB Key, and it works fine. I can dual boot to my LFS installation (see posts passim) and that works fine as well. It's just the installed Ubuntu that doesn't fucking work. All I did was change the wifi channel for fucks sake. No amount of cursing and swearing appeared to fix this.

I eventually decided to manually install the same network drivers as I have running on the LFS installation. This worked fine. Until the next time I rebooted, when they were overwritten by later drivers, which don't fucking work. I have tried to uninstall the newer drivers, but it's as hard to get rid of the fuckers as it is Michael Myers. Every time I reboot, the bastards pop into existence once more. Pain in the arse.

(Incidentally, the fucking useless drivers, which can't cope with whatever channel my router is now using even though they worked fine on the old one, are the Broadcom hybrid STA drivers 5.100.something.something. The drivers that actually fucking work are version 5.60.48.36.)

All of this explains why I now have a script on my desktop called "get_my_fucking_network_working.sh" which contains the following commands:

cd /tmp
# fetch the known-good driver source
cp '/home/[user]/Dropbox/Essential Drivers/Wireless Drivers/hybrid-portsrc-x86_32-v5.60.48.36.tar.gz' .
mkdir hybrid_wl
cd hybrid_wl
tar -xzvf ../hybrid-portsrc-x86_32-v5.60.48.36.tar.gz
# apply the two kernel patches
unzip '/home/[user]/Dropbox/Essential Drivers/Wireless Drivers/sta_5.60.48.36_2.6.33_kernel_patch.zip'
patch -p0 < patch
unzip '/home/[user]/Dropbox/Essential Drivers/Wireless Drivers/sta_5.60.48.36_2.6.34_multicast_kernel_patch.zip'
patch -p0 < patch_hybrid_multicast
# build the module
make clean
make
# unload anything that might claim the card (errors about modules that are not loaded are harmless)
sudo rmmod b43
sudo rmmod b44
sudo rmmod b43legacy
sudo rmmod wl
sudo rmmod ssb
sudo rmmod ndiswrapper
sudo rmmod lib80211_crypt_tkip
sudo rmmod lib80211
# load the freshly built driver (insmod wants the file, wl.ko, that make just produced)
sudo modprobe lib80211
sudo insmod wl.ko


I have to run this script every time I boot. Joy of all joys.
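(In theory, even the running-it-every-boot part can be automated; a sketch, assuming the script lives on the Desktop and that your Ubuntu still executes /etc/rc.local at boot. rc.local runs as root, so the sudo calls inside are harmless.)

# add this line to /etc/rc.local, before the final "exit 0":
/home/[user]/Desktop/get_my_fucking_network_working.sh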

Friday 14 October 2011

Evolution

"Unable to retrieve message: Lost connection to Evolution Exchange backend process." Absolute fucking bastard. I hate Evolution and the way it connects, or rather doesn't connect, to Exchange Server, with the passion of a thousand of whatever it is that makes gamma-ray bursts.

Friday 16 September 2011

Recently


Recently, and surprisingly, not much has been pissing me off IT-wise. The main current niggle is that neither OSX nor Windows 7 seems keen to boot from an eSATA caddy, whereas Ubuntu just laps it up.

I do have a bit of a grumble, and a warning for the less wary, though. Having purchased an SSD I became something of an evangelist for them. I convinced the boss at work to buy fourteen 64GB SSDs, and I proceeded to replace the whole office's HDDs with them.

Sadly, not a week after delivery we got an email from Kingston, the manufacturer, to advise that there was a teensy problem with the firmware, which had a tiny chance of bricking the drive. So we updated all the firmware as suggested.
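(If you want to check what firmware a drive is actually running from Linux, smartmontools will tell you; the /dev/sda here is just an example:)

sudo apt-get install smartmontools
sudo smartctl -i /dev/sda | grep -i firmware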

Yesterday I sent the third and fourth drives (out of 14) back to Kingston for warranty replacement. They are SV100S2/64G drives, if anyone is interested. When they brick, they really do brick. No detection by the BIOS, no nothing.

On the plus side, Kingston's RMA system is very good, and we got the replacements for the first drives through quickly.  It's just a shame we have to use it at all.

Friday 26 August 2011

Installing OSX

Having failed to install OSX from the completely genuine (not really) disk image that I had obtained, I opted to shell out a few quid for a completely genuine (yes, actually) install disk. I ebayed a box set with iWork '09 and iLife '11 which came with all sorts of goodies like word processors, video editors and so on. All for £69. Microsoft, are you listening?

I then had a whole new problem when I tried to boot the system. I got a kernel error. This just froze up the whole machine when I tried to boot. I took out the PCI-E Soundblaster card, and hey presto no more kernel errors. This did not bode well though, as I did want sound eventually. Anyway, ploughing on.

I then had exactly the same bloody problem as with the previous attempt. It would freeze when starting up the installer. What to do?

The solution was actually very simple, but hideously complicated at the same time. What I had to do was to change the boot parameters for the OSX Install disk. This is like adding [nomodeset] when booting a Live Ubuntu Image with dodgy video drivers installed.

The boot parameter that I needed to add was:

PCIRootUID=1

Once I did that the system was prepared to boot. However the wifi was not working. This is a pain, because part of the installation procedure allows you to set up your iTunes account, and it needs a network connection for this. It WOULD get an internet connection if I booted the installer in safe mode (by adding [-x] as a boot parameter). Which is just a joy.

The next problem is getting the system to boot without using the iBoot disk. To do so I needed to use the MultiBeast program to install whatever magical stuff it needed to install to get the system to boot from the hard disk. The two essential parts of MultiBeast are EasyBeast and the System Tools. Apart from that I just selected packages to get my wireless card working. Happily, once I rebooted, I got wifi working just fine. My video card was not detected though, and I put this down to getting an install disk (10.6.3) that pre-dated my video card (ATI 6850). So I then upgraded to OSX version 10.6.8. And hurrah! The video started working properly! And the wireless card was fucked.

Brilliant. I rebooted in safe mode [-x], and the kernel messages flying past tended to suggest the wireless card was detected and working, but that was all academic because the video was now fucked and showing a grey screen only.

Fantastic. It was at this point, after much searching on the internet, that I discovered that I needed to use the further boot parameters:

max_valid_dma_addr=1024

to get the wireless card to work with the latest version of OSX. Apparently this has something to do with how much memory your system has. All I needed to do was to pop those parameters into the /Extra/Library/Preferences/SystemConfiguration/com.apple.Boot.plist file:

<key>Kernel Flags</key>

<string>arch=i386 max_valid_dma_addr=1024</string>


Having stuck those settings in I now have a machine happily booting OSX 10.6.8, with fully working video up to 1080p resolutions, and with fully working wifi. Sound, not so much. The Soundblaster card I have is working after a fashion, but it is unusably crackly. I need to try a variety of the VoodooHDA drivers to find the best one. Alternatively, I need to try it without the card to see if it will output sound over the HDMI cable from the graphics card. At one point it had detected that as a possible audio out option, but I couldn't connect it up to a receiver at the time to find out if it was actually working.

Friday 29 July 2011

Installing Latex and PGF

I smugly advised in my last post that you should simply look at my post to identify how to install Latex with PGF Tikz support. Turns out my previous post was as deficient as a very deficient thing on a day of particular deficiency.

Ahem.

What you need to do is to grab two files off t'internet. These are the TexLive Latex distribution (this is a biggie) and the PGF add on.

The TexLive distribution is best downloaded as a single file if you are going to be pissing about trying to repeatedly install it to get the fucker working. It is 2GB though, so set aside some time for the download. You can find various download options, including torrents, at this page. My nearest CTAN mirror is in the UK, so my command to download the DVD image is:

cd ~
wget http://mirror.ox.ac.uk/sites/ctan.org/systems/texlive/Images/texlive2011.iso

You can then grab the PGF file by running these commands:

wget http://sourceforge.net/projects/pgf/files/pgf/version%202.10/pgf_2.10.tds.zip/download
mv download pgf_2.10.tds.zip

However, if it is regularly updated, you would probably just want to go to this page and download the latest version manually.

To install the software first you need to mount the DVD image. You can do that as follows:

mkdir ~/texlivedvd
sudo mount -o loop ~/texlive2011.iso ~/texlivedvd

You can then run the graphical installer from TexLive. It requires the perl-tk package, which you can install as follows if you do not already have it:

sudo apt-get install perl-tk

To then run the installer, you use this command:

sudo ~/texlivedvd/install-tl -gui perltk

You want to install as comprehensive a system as possible, but you can omit language packs you are not going to use to save some time and space.

The best idea I have found to avoid permissions bullshit is to install to your home directory, so you would choose an install path like [~/texlive].

Once installed, you want to tell your system where to find the nice new files, which you can do as follows:

cat >> ~/.bashrc << "EOF"
PATH=~/texlive/2011/bin/i386-linux:$PATH
export PATH
MANPATH=~/texlive/2011/texmf/doc/man:$MANPATH
INFOPATH=~/texlive/2011/texmf/doc/info:$INFOPATH
export MANPATH
export INFOPATH
EOF
source ~/.bashrc
What that command does is add those lines to your [.bashrc] file. They add the folders containing the TexLive install to your path so that other programs (like ktikz) can find them. The last command just re-reads the file so the changes take effect in your current terminal, without you having to log out and back in. I am not entirely sure, but I think this may only affect programs run from the command line, because I have been having problems with ktikz if I run it from the desktop icon.

Installing PGF is much easier. Just change into the correct TexLive directory, and unpack the archive:

cd ~/texlive/2011/texmf/
sudo unzip ~/pgf_2.10.tds.zip
sudo texhash

The [texhash] command just (I think) lets the rest of TexLive know that this extra stuff has been installed. That's it for PGF. To install ktikz you are going to need a few packages. Install them as follows:

sudo apt-get install build-essential cmake libqt4-dev qt4-dev-tools libpoppler-qt4-dev kdelibs5 kdelibs5-dev khelpcenter4

Basically what you are doing here is installing software that supports KDE applications. If you happen to be running KDE anyway you should have most of this stuff. Once installed, you can grab the ktikz source code, compile and install it using these commands:

cd ~
wget http://www.hackenberger.at/ktikz/ktikz_0.10.tar.gz
tar -xzvf ktikz_0.10.tar.gz
cd ktikz
mkdir build
cd build
cmake -DCMAKE_INSTALL_PREFIX=`kde4-config --prefix` ..
make -j2
sudo make install

That should be that. Remember that ktikz needs to load up the PGF packages you want to use. Common ones for me are loaded as follows:

\usetikzlibrary{calc,through, intersections,decorations.text, decorations.pathreplacing}

Friday 22 July 2011

Detailed Tikz Animations

I set out my workflow to create animations using the Tikz graphical language in a previous post. What follows is, as promised, the individual Ubuntu commands needed to actually implement that workflow:

First of all we need to create a decent image in ktikz. This code works well for this purpose:

\usetikzlibrary{calc,through, intersections,decorations.text}
\begin{tikzpicture} [scale=3]
\def\myangle{117};
\clip (-1.5,-1.5) rectangle (1.5,1.5);
\coordinate (A) at (0,0);
\coordinate (B) at (1,0);
\node [blue, name path=blue_circle,draw,circle through=(B)] at (A) {};
\draw [black, fill] (A) circle (1pt) node [below] {\tiny centre};
\draw [red, dashed] (A) -- (B);
\path [name path=radius, rotate=\myangle] (A) -- ++(1.5,0);
\draw [red, ->] ($(A)+(0.5,0)$) arc (0:\myangle:0.5cm);
\path [decorate,decoration={raise=-5pt, text along path, text={|\tiny|angle ||}, text align=center, text color=red, reverse path}](0.5,0) arc (0:\myangle:0.5cm);
\draw [name intersections={of=blue_circle and radius, by=C}] [orange, ->] (A) --  (C) node [pos=0.7, sloped, above] {\tiny radius};
\end{tikzpicture}

What does all that do anyway? Well, now is not the time for a full PGF/Tikz tutorial, but some illustration may be of use:

\usetikzlibrary{calc,through, intersections,decorations.text}
That line tells the Tikz software that it is going to need the extra routines mentioned in the squiggly brackets to draw the picture. The libraries mentioned are calc, to enable mathematical routines to be used on co-ordinates (like start here and move two times the number you first thought of to the left), through, to draw objects through other objects, intersections, to let us calculate the point where two lines cross, and decorations.text, which lets us draw text along squiggly lines.

\begin{tikzpicture} [scale=3]
This [begin]s the code to draw the [tikz] [picture]. The whole picture is at [scale] [3] (where 1 would be normal) so everything is nice and big and clear.

\def\myangle{117}
This is the key to our whole animation. We are setting a variable which we can then change using sed (aaaargh).

\clip (-1.5,-1.5) rectangle (1.5,1.5);
We tell the system we are only interested in the [rectangle] shaped region designated by the two corners [(-1.5,-1.5)] and [(1.5,1.5)], so it doesn't need to draw anything that falls outside that region.

\coordinate (A) at (0,0);
\coordinate (B) at (1,0);
These commands do the same thing for two different points. We are basically just giving a name to the points [(0,0)] and [(1,0)]. This means that in the future when we want to refer to the points we can just use the letters [A] and [B] instead of remembering the exact positions.

\node [blue, name path=blue_circle,draw,circle through=(B)] at (A) {};
That is the first really scary command. All it does is draw a circle. Why is it not in the form [draw] a [circle]? Well, we are using the [through] library here, which only works with nodes. So we have to stick a node down [at (A)] with [{}] no text attached to it. This is just a hypothetical point. However, we then [draw] around the node a [circle], coloured [blue], [name]d [blue_circle] which passes [through] the point called [B].

\draw [black, fill] (A) circle (1pt) node [below] {\tiny centre};
This command DOES just [draw] a [circle]. The circle is coloured [black] and is [fill]ed in, so it looks like a dot. It is centred at point [A], and has a tiny radius of [1pt]. We also attach a [node] to the circle (instead of the other way round like last time). This node is positioned [below] point A. The node has the text [centre] attached to it in a [tiny] font size.

\draw [red, dashed] (A) -- (B);
This is perhaps the least scary command we have seen so far. This [draw]s a [red] [dashed] line ([--]) between the points [A] and [B]. Simples. You can see, though, why we would want to define A and B earlier, since we are using them so often. It would be a pain to have to keep referring to the exact position all the time.

\path [name path=radius, rotate=\myangle] (A) -- ++(1.5,0);
This command looks quite different. It marks an invisible [path] on the diagram. Why invisible? Well, there is no [draw] command. What's the point in this? Wait and see. The path is [name]d [radius]. It is a line ([--]) from the point [A] to a point which is [1.5] units further [++] from the centre along the x axis and [0] units further from the centre along the y axis. In other words, you take the point A, which is (0,0), and you add [++] 1.5 to the x co-ordinate (the first 0) and 0 to the y co-ordinate (the second 0). All this does is mark an invisible line one and a half units long along the x axis. But that is not all we do! We then [rotate] the line by [\myangle] degrees. If you recall, this was the variable we set right at the beginning. The idea is that when the variable is changed by sed (aaargh) it will filter down to this instruction as well.

\draw [red, ->] ($(A)+(0.5,0)$) arc (0:\myangle:0.5cm);
OK, another scary one. This at least [draw]s something. It actually draws an [arc]. The arc is coloured [red] and has a [>] at the end of it. If it had a [>] at the beginning the command would have a [<] in it instead. The arc starts at the point (A) [+] [0.5] units along the x axis. This uses the calc library to work out the starting co-ordinate for the arc from A. To use the calc library you put the calculation itself in between the [$] symbols. Because the calculation returns a bare co-ordinate, you need to have the whole thing in brackets as well. The arc itself is from [0] degrees on a circle of radius [0.5cm] to [\myangle] degrees (there it is again). This will draw a varyingly large section of a circle that is still centred on the point (A). Why centred on point (A)? Well the zero degree point of the circle is drawn half a unit away from (A) and the radius is half a unit long. So automatically it centres on (A).
\path [decorate,decoration={raise=-5pt, text along path, text={|\tiny|angle ||}, text align=center, text color=red, reverse path}] ($(A)+(0.5,0)$) arc (0:\myangle:0.5cm);
That is fairly horrible. First of all, ignore the bit in square brackets. This command starts by marking another of these invisible paths. The path is actually exactly the same [($(A)+(0.5,0)$) arc (0:\myangle:0.5cm)] as we drew in red last time. Why a new command? Well the bit in square brackets at the beginning does not play well with actually drawing the arc.

So what's in the square brackets? These commands use the decorations.text library to draw [text] [along] the [path] of the arc. This means the text rounds itself to the arc, which is pleasing to look at. The text is [raise]d by [-5pt] from the path itself. It is [align]ed to the [centre] of the path, coloured [red] and [reverse]d (otherwise it would appear upside down). The text itself is [angle] and is in a [tiny] font size. It is a complete mystery to me why these text settings use a different format to the text settings for nodes, but heigh ho, that's life.

\draw [name intersections={of=blue_circle and radius, by=C}] [orange, ->] (A) -- (C) node [pos=0.7, sloped, above] {\tiny radius};
This is our last drawing command. What this does is [draw] a line [--] between points [A] and [C]. Hang on though, where is point C? Well, look at the bit in square brackets at the very beginning. That bit uses the intersections library to [name] the [intersection] [of] the [blue_circle] and the invisible line called [radius]. The name given is [C]. This becomes a co-ordinate we can use just like A and B, although we never need to know exactly what its position is. So we can merrily change the size of the angle, and this command will always know where the radius meets the circle.

The line we end up drawing is [orange], just to be different, and has a [>] bit on the end of it. It has a node [above] the line, which is [sloped] along the line. The slope works with lines, but not with arcs, which is why we use the decorations library for the arc. The node is [pos]itioned 70% [0.7] of the way to point C. The node contains the text [radius] written in [tiny] font size.

\end{tikzpicture}
This line [end]s the code to draw the [tikz] [picture].

OK, good. What does that draw if myangle is 117? It draws this:


Hopefully you can see that it is a nice blue circle, with a line drawn at 117 degrees from the horizontal dashed red line, with an arc half the size of the blue circle in red marking the size of the angle.

Right, how do we animate this? First we need to generate a file which can be read by the Ubuntu command line program [pdflatex]. This has to be a proper Latex file, not just a series of Tikz commands like above. So we need to top and tail our command to generate the proper latex file. We add these commands at the beginning:

\documentclass{article}
\usepackage{tikz}
\begin{document}

And this command at the end:

\end{document}

We also need to change the 117 angle to something that we can easily find and replace using sed (aaargh). A random collection of letters should mean that we do not get any false matches. So we end up with:

\documentclass{article}
\usepackage{tikz}
\begin{document}
\usetikzlibrary{calc,through, intersections,decorations.text}
\begin{tikzpicture} [scale=3]
\def\myangle{xyzzy};
\clip (-1.5,-1.5) rectangle (1.5,1.5);
\coordinate (A) at (0,0);
\coordinate (B) at (1,0);
\node [blue, name path=blue_circle,draw,circle through=(B)] at (A) {};
\draw [black, fill] (A) circle (1pt) node [below] {\tiny centre};
\draw [red, dashed] (A) -- (B);
\path [name path=radius, rotate=\myangle] (A) -- ++(1.5,0);
\draw [red, ->] ($(A)+(0.5,0)$) arc (0:\myangle:0.5cm);
\path [decorate,decoration={raise=-5pt, text along path, text={|\tiny|angle ||}, text align=center, text color=red, reverse path}](0.5,0) arc (0:\myangle:0.5cm);
\draw [name intersections={of=blue_circle and radius, by=C}] [orange, ->] (A) --  (C) node [pos=0.7, sloped, above] {\tiny radius};
\end{tikzpicture}
\end{document}

You copy that code, and paste it into [gedit] and save it to a file called, say, [master_frame.tex] in your [~] home directory. Or you could paste this command into a terminal window, which does the same thing:

cat > ~/master_frame.tex << "EOF"
\documentclass{article}
\usepackage{tikz}
\begin{document}
\usetikzlibrary{calc,through, intersections,decorations.text}
\begin{tikzpicture} [scale=3]
\def\myangle{xyzzy};
\clip (-1.5,-1.5) rectangle (1.5,1.5);
\coordinate (A) at (0,0);
\coordinate (B) at (1,0);
\node [blue, name path=blue_circle,draw,circle through=(B)] at (A) {};
\draw [black, fill] (A) circle (1pt) node [below] {\tiny centre};
\draw [red, dashed] (A) -- (B);
\path [name path=radius, rotate=\myangle] (A) -- ++(1.5,0);
\draw [red, ->] ($(A)+(0.5,0)$) arc (0:\myangle:0.5cm);
\path [decorate,decoration={raise=-5pt, text along path, text={|\tiny|angle ||}, text align=center, text color=red, reverse path}](0.5,0) arc (0:\myangle:0.5cm);
\draw [name intersections={of=blue_circle and radius, by=C}] [orange, ->] (A) --  (C) node [pos=0.7, sloped, above] {\tiny radius};
\end{tikzpicture}
\end{document}
EOF

When you run [pdflatex] it takes the master_frame.tex file and turns it into a pdf document. This is not what we want. We want a series of jpegs. We also do not want an A4 page, which is what [pdflatex] is going to give us. So what we do is convert a test sample, work out what crop we need to apply (to get rid of the unnecessary areas) and note that down somewhere. If we simply run pdflatex on our master frame it is going to fail, because it will try and draw an angle of size xyzzy, which makes no sense. So first let's create our test frame:

sed 's/{xyzzy}/{'117'}/' ~/master_frame.tex > ~/test.tex

And we now generate the pdf file from it:

pdflatex ~/test.tex

That should work. If it doesn't, something has gone wrong with your installation of latex and pgf. Go and look at my post on how to do that to make sure you have covered all the bases. To display the pdf you can open it in any pdf viewer - it is just a pdf. However, we need a jpeg, so let's convert it:

convert -density 300 test.pdf test.jpg

The [convert] command makes sense, but what about the [density 300] stuff? Well, the pdf we have made contains vector graphics, which can scale without losing detail to any resolution you like. A jpeg image is essentially raster graphics, which has a fixed resolution. To convert vector to raster you need to tell the vector image what resolution to scale to. Here we have chosen [300] dpi.

We now need to get our cropping parameters. To get these run the [display] command (from the imagemagick package I think):

display test.jpg

When the image opens, left click anywhere on it to bring up a menu. Select the Transform drop down menu and then click on crop. This lets you draw a box around the image. While you are doing this, look in the top left corner of the display window. You will see numbers changing as you draw your box. When you are satisfied that it fits the image and no more, note down those numbers AND symbols. For me they were [724x724+789+692]. Now close the display window. We are done with it. If it fights back, click on your terminal window and hit [CTRL+C] to shut it down. Now let's test our cropping parameters by re-doing a conversion to jpeg:

convert -density 300 -crop 724x724+789+692 test.pdf test.jpg

Note that the numbers and symbols we noted in the last step just get pasted in here, after the [crop] command. Nice and logical. If you now do the [display test.jpg] command again, you should find a nicely cropped picture of our diagram. If not, go back and check your cropping parameters are correct.

We are now going to generate a whole lot of [.tex] files - one for each frame of the animation we are going to make. It would probably be best to put these in their own directory:

mkdir ~/frames

You can now generate all your frames by running this command:

for angle in {1..360}; do  sed 's/{xyzzy}/{'$angle'}/' ~/master_frame.tex > ~/frames/$angle.tex; echo $angle; done

That looks pretty horrendous, doesn't it? Well, what is happening here is a loop. We define a variable at the beginning of the command called [angle]. We then run through the whole command setting the variable to every number from [1] to [360] (that is the [..] bit). The variable is then passed to the next section of the command which [do]es the [sed] (aaargh) command on the [~/master_frame.tex] file. The sed command spits out the file, but changes every instance of the text [{xyzzy}] to the same squiggly brackets but with the value assigned to angle in between the brackets. This sets our angle size for the frame. The result from the sed command is sent [>] to a [.tex] file in the [~/frames/] folder named after the size of angle in the file. The command then reports [echo] which frame it has completed, and [done] goes back to the beginning for a new number.

We now need to move into the frames folder for the next bit:

cd ~/frames

Now if you look at the contents of the frames directory, the file names will not all be three digits long. They count up from 1 to 99 to 360 without leading zeros, i.e. 1.tex rather than 001.tex, and our system will break because the files will not sort in numerical order. You can fix that by adding the leading zeros with these commands:

for file in ?.tex; do cat $file > 00$file; rm $file; echo $file; done
for file in ??.tex; do cat $file > 0$file; rm $file; echo $file; done
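(This renaming dance is needed because of an octal gotcha explained in the next paragraph. You can see the shell doing it in two lines of bash arithmetic, nothing else assumed:)

echo $((010))   # prints 8, because the leading zero means base 8
echo $((009))   # an error: 9 is not a valid octal digit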

Why could we not just generate angles from 001..360? Numbers with leading zeros get read as base 8, or octal, numbers, in which the digits 8 and 9 do not exist, so every ninth and tenth frame gets missed out. We can now do the mass generation of all the pdf files. This will take a while. The command is as follows:

for file in ???.tex; do pdflatex -interaction=batchmode $file; echo $file; done

This is another loop, that picks up every [.tex] file with three characters in its name, and sticks it into the [pdflatex] command. Next we need to use our cropping details to produce the jpeg files:

for file in *.pdf; do convert -density 300 -crop 724x724+789+692  $file ${file%.???}.jpg; echo $file; done

This just uses a similar loop to the last command. It applies to all the [.pdf] files in the folder - so make sure that you started with an empty one. The percentage symbol is clever: it deletes the extension and replaces it with [.jpg], so the output of the command is a whole lot of [.jpg] files with the same names as the [.pdf] files which generated them. If we are really clever, we can duplicate all our frames in reverse order to make the animation reverse when it gets to the end. This creates an unending loop, but it is not strictly necessary with our animation here, which is essentially circular.

marker=360; for file in *.jpg; do frame=$(($marker+360)); cp $file $frame.jpg; marker=$(($marker -1)); echo $marker; done

That is absolutely a nightmare. What it does is set a complicated loop in place counting back from 360 to 1. Why? Well, we want the 360 frames we have already to be played in reverse. So we want frame 361 to be the same as 360, and 362 to be 359, and so on, down to 720 being the same as 1. So we start at 360 and count back. The [do] command sets the [frame] variable to be the marker PLUS 360. So when the marker is 360 it is 720. However, the [file] variable is going through the directory in numerical order. So the next command [c]o[p]ies the [file] to the frame number. The loop then reduces the marker by one and goes back to the beginning. The sequence looks like this:

Marker   File   Maths       Frame
360      1      (360+360)   720
359      2      (359+360)   719
358      3      (358+360)   718
...
2        359    (2+360)     362
1        360    (1+360)     361

It is a bit of a wrestle, but it gets there.

Now we have all our jpeg frames, we can use [ffmpeg] to compress them into an mp4 video file like so:

ffmpeg -r 25 -b 2500k -i %03d.jpg -pass 1 -y circle.mp4 && ffmpeg -r 25 -b 2500k -i %03d.jpg -pass 2 -y circle.mp4

That's another horror, but it breaks down like this: It uses the [ffmpeg] command set at a f[r]ame rate of [25] per second, and a [b]itrate of [2500k] per second, and takes its [i]nput as every [.jpg] file in the current folder with a three digit name. It runs a first [1] [pass] through the file to see where best to use its bitrate and saves the output to [circle.mp4] automatically answering [y]es to any question about overwriting that file. It then immediately [&&] runs a second [2] [pass] with the same input and settings to produce the final output in the file called [circle.mp4].
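(If hitting an exact bitrate doesn't matter to you, a constant-quality single pass is simpler; a sketch, assuming your ffmpeg's libx264 supports the [-crf] quality option:)

ffmpeg -r 25 -i %03d.jpg -vcodec libx264 -crf 20 -y circle.mp4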

You can now view your creation by running:

mplayer circle.mp4 -loop 0

It should look like this:



Once you have done it once, it is useful to be able to tweak one entry in the master frame file and then run all the commands that follow automatically. You can do that with this command block. It first cleans out the ~/frames folder of everything, and then runs all the commands above, one after the other:

cd ~/frames && rm * && cd ~ && for angle in {1..360}; do  sed 's/{xyzzy}/{'$angle'}/' ~/master_frame.tex > ~/frames/$angle.tex; echo $angle; done && cd ~/frames && for file in ?.tex; do cat $file > 00$file; rm $file; echo $file; done && for file in ??.tex; do cat $file > 0$file; rm $file; echo $file; done && for file in ???.tex; do pdflatex -interaction=batchmode $file; echo $file; done && for file in *.pdf; do convert -density 300 -crop 724x724+789+692  $file ${file%.???}.jpg; echo $file; done && marker=360; for file in *.jpg; do frame=$(($marker+360)); cp $file $frame.jpg; marker=$(($marker -1)); echo $marker; done && ffmpeg -r 25 -b 2500k -i %03d.jpg -pass 1 -y circle.mp4 && ffmpeg -r 25 -b 2500k -i %03d.jpg -pass 2 -y circle.mp4 && mplayer circle.mp4 -loop 0

Friday 15 July 2011

Installing OSX on a Hackintosh

Because I don't have enough grey hairs I have decided to try to install OSX on my Intel PC Hardware. Couple of reasons for this. One, I have an iPhone and an iPad and I want to see what the fuss is about Apple Personal Computing without selling a kidney to buy their hardware. Two, it's a technical challenge innit.

I HAVE managed to install OSX on an AMD Frankenmachine, and it seemed to work OK, but it is very slow. The graphics card does not support any of the Steam Games I have for OSX (Portal et al). If I put a bigger graphics card in the Frankentosh the PSU cries enough and shuts down.


So I have decided to do it properly this time.  I started off with a completely genuine (not really) image of Snow Leopard.  I wanted to install it onto a USB Key so I could boot from that.  OSX didn't like the DVD Image.  It refused to write it to the USB Key.  It said it needed to be "scanned" and then refused to scan it.

I managed to find a useful OSX command which forces the machine to write the image without scanning it, and it is thus:


sudo asr -noverify -source "/[location of image]/Mac OS X Install DVD.dmg" -target "/Volumes/[mount point of USB Key]"

This did at least manage to write the image to the USB Key, and it does seem to boot when using the iboot system.  However it failed miserably at the bit which says "When you get to the installation screen, open Utilities/Disk Utility.".  It doesn't do anything, it just sits there spinning a circle for fecking hours.  Ah well.

This may be because I have failed to follow all the previous instructions about removing all non-essential hardware from the machine, resulting in it being festooned with TV Cards, USB Peripherals and so on. Or it could be because the completely genuine (not really) DVD Image I was using is not really completely genuine.

So I have shelled out for a boxed set of Snow Leopard, iLife '11 and iWork '09, and I will try this again with the genuine kit when it arrives.

Friday 8 July 2011

Animating TEX

Short post today, but I wanted to set out the workflow I use when generating animated tikz diagrams.

1) Start with your tikz diagram. Instead of fixed values, define variables to be used in their place:

\def\myangle{x}

2) Run the sed (aaargh) command on that master file using some kind of loop which changes the value {x} each time through and then spits out the results into numerical .tex files.

2.5) Rename all the .tex files to make sure they have the appropriate number of leading zeros.

3) Run PDF Latex on each individual .tex file to get a pdf.

4) Convert the PDF to jpg using an appropriate DPI and cropping factor - this will vary for each master diagram, so generate a test first and identify the correct cropping settings. The display [name].jpg command is very useful for this.

4.5) If desired run another loop to copy all the files in such a way as the animation reverses to create a continuous loop.

5) Take all the jpg files and pipe them into ffmpeg to compress to whatever you want; x264 is useful. An animated gif might be attractive, but ffmpeg doesn't like producing them.

Job done. More details and exact ubuntu commands to come.

Friday 1 July 2011

Firefox 5 from Source

These are the instructions for installing FF5 from source on the LFS system I built in earlier posts, assuming that you have not installed any previous version. As in the earlier posts, the source tarballs are assumed to live under /sources (move the wget downloads there, or adjust the tar paths to taste).

wget http://ftp.gnome.org/pub/gnome/sources/libIDL/0.8/libIDL-0.8.14.tar.bz2
tar -xjvf /sources/extras/libIDL-0.8.14.tar.bz2
cd libIDL-0.8.14
./configure --prefix=/usr &&
make
make install
cd ..
rm -rvf libIDL-0.8.14

wget http://www.python.org/ftp/python/2.6.4/Python-2.6.4.tar.bz2
wget http://www.linuxfromscratch.org/patches/blfs/svn/Python-2.6.4-bdb_4.8-1.patch
wget http://docs.python.org/ftp/python/doc/2.6/python-2.6-docs-html.tar.bz2
tar -xjvf /sources/extras/Python-2.6.4.tar.bz2
cd Python-2.6.4
sed -i "s/ndbm_libs = \[\]/ndbm_libs = ['gdbm', 'gdbm_compat']/" setup.py
patch -Np1 -i /sources/extras/Python-2.6.4-bdb_4.8-1.patch
./configure --prefix=/usr --enable-shared
make
make test
make install
chmod -v 755 /usr/lib/libpython2.6.so.1.0
install -v -m755 -d /usr/share/doc/Python-2.6.4/html
tar --strip-components=1 --no-same-owner --no-same-permissions -C /usr/share/doc/Python-2.6.4/html -xvf /sources/extras/python-2.6-docs-html.tar.bz2
cat >> /etc/profile << "EOF"
export PYTHONDOCS=/usr/share/doc/Python-2.6.4/html
EOF
cd ..
rm -rvf Python-2.6.4
wget http://www.tortall.net/projects/yasm/releases/yasm-1.0.1.tar.gz
cd /dev/shm
tar -xzvf /sources/extras/yasm-1.0.1.tar.gz
cd yasm-1.0.1
CC="gcc -fPIC" ./configure --prefix=/usr
time make $CORES_TO_USE
make install
wget http://cairographics.org/releases/cairo-1.10.2.tar.gz
cd /dev/shm
tar -xzvf /sources/desktop/cairo-1.10.2.tar.gz
cd cairo-1.10.2
./configure --prefix=/usr --enable-tee=yes
make $CORES_TO_USE
make install
cd ..
rm -rvf cairo-1.10.2
wget ftp://ftp.mozilla.org/pub/mozilla.org/firefox/releases/5.0/source/firefox-5.0.source.tar.bz2
tar -xjvf /sources/extras/firefox-5.0.source.tar.bz2
cd moz*
cat > .mozconfig << "EOF"
ac_add_options --enable-application=browser
. $topsrcdir/browser/config/mozconfig
mk_add_options MOZ_OBJDIR=@TOPSRCDIR@/../firefox-build
mk_add_options MOZ_MAKE_FLAGS="-j2"
ac_add_options --prefix=/opt/firefox5
ac_add_options --enable-optimize
ac_add_options --enable-system-cairo
ac_add_options --with-system-jpeg
#ac_add_options --with-system-png
ac_add_options --with-pthreads
ac_add_options --with-system-zlib
ac_add_options --disable-accessibility
ac_add_options --disable-crashreporter
ac_add_options --disable-dbus
ac_add_options --disable-gnomevfs
ac_add_options --disable-necko-wifi
ac_add_options --disable-installer
ac_add_options --disable-javaxpcom
ac_add_options --disable-tests
ac_add_options --disable-updater
ac_add_options --disable-libnotify
ac_add_options --enable-official-branding
ac_add_options --enable-safe-browsing
ac_add_options --enable-strip
EOF
time make -f client.mk build
make -f client.mk install
cd ..
rm -rvf moz*
cat >> /etc/ld.so.conf << "EOF"
# Extra Path so Firefox's libraries can be used by Flash10
/opt/firefox5/lib/firefox-5.0
# End of Extra Path.
EOF
wget http://curl.haxx.se/download/curl-7.20.0.tar.bz2
cd /dev/shm
tar -xjvf /sources/extras/curl-7.20.0.tar.bz2
cd curl-7.20.0
./configure --prefix=/usr
make $CORES_TO_USE
make install
find docs -name "Makefile*" -o -name "*.1" -o -name "*.3" | xargs rm
install -v -d -m755 /usr/share/doc/curl-7.20.0
cp -v -R docs/* /usr/share/doc/curl-7.20.0
wget http://fpdownload.macromedia.com/get/flashplayer/current/install_flash_player_10_linux.tar.gz
tar -xzvf install_flash_player_10_linux.tar.gz
mkdir -v /opt/firefox5/lib/firefox-5.0/plugins
cp -v libflashplayer.so /opt/firefox5/lib/firefox-5.0/plugins
cp -v ./usr/bin/flash-player-properties /usr/bin/

Add this to the openbox menu.xml file:

<item label="flash player properties">
<action name="Execute">
<execute>
/usr/bin/flash-player-properties
</execute>
</action>
</item>

Download Java from here. OR:

wget http://download.oracle.com/otn-pub/java/jdk/6u26-b03/jdk-6u26-linux-i586.bin
mv jdk-6u26-linux-i586.bin* jdk-6u26-linux-i586.bin
chmod +x /sources/extras/jdk-6u26-linux-i586.bin
cd /dev/shm
/sources/extras/jdk-6u26-linux-i586.bin
cd jdk1.6.0_26
install -v -m755 -d /opt/jdk-6u26
mv -v * /opt/jdk-6u26
chown -v -R root:root /opt/jdk-6u26
ln -v -sf xawt/libmawt.so /opt/jdk-6u26/jre/lib/i386/
cd ..
sed -i 's@XINERAMA@FAKEEXTN@g' /opt/jdk-6u26/jre/lib/i386/xawt/libmawt.so
ln -v -nsf jdk-6u26 /opt/jdk
ln -sv /opt/jdk/jre/lib/i386/libnpjp2.so /opt/firefox5/lib/firefox-5.0/plugins
cat >> /etc/profile.d/30-jdk.sh << "EOF"
# Begin /etc/profile.d/30-jdk.sh

# Set JAVA_HOME directory
JAVA_HOME=/opt/jdk

# Adjust PATH
pathappend ${JAVA_HOME}/bin PATH

# Auto Java CLASSPATH
# Copy jar files to, or create symlinks in this directory
AUTO_CLASSPATH_DIR=/usr/lib/classpath
pathprepend . CLASSPATH
for dir in `find ${AUTO_CLASSPATH_DIR} -type d 2>/dev/null`; do
pathappend $dir CLASSPATH
done

export JAVA_HOME CLASSPATH
unset AUTO_CLASSPATH_DIR
unset dir

# End /etc/profile.d/30-jdk.sh
EOF
source /etc/profile

If you have installed openoffice:

ln -sv /opt/openoffice-3.2.1/program/libnpsoplugin.so /opt/firefox5/lib/firefox-5.0/plugins

Friday 24 June 2011

Parameter Substitution

That is a bloody boring title for a blog post. AAAAAnyway, it means that when you are farting around with variables holding filenames in bash scripts, or single commands, you may want to edit the file name. For instance, remember that in the PDF compression script we stripped the file extension off the filename and added .pdf to create the output filename. Very useful.

This post was prompted by trying to compress a 1000+ page pdf document. The pain was that the pdfimages command only switches to 4 digit file names at 1000. So for 1-999 you get 001 etc instead of 0001. This has the potential to screw everything up when you come to compile your final pdf. You do not want page 1000 to be sorted before page 200, just because its name starts with a lower number.

So how do we insert the missing 0?

for file in imageroot-???.*; do mv $file  imageroot-0${file#imageroot-}; echo $file; done
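For reference, the main substitution forms, demonstrated on a hypothetical filename ([#] trims from the front, [%] from the end; doubling the symbol takes the longest match instead of the shortest):

path="dir/sub/imageroot-042.tar.gz"
echo ${path#*/}     # sub/imageroot-042.tar.gz
echo ${path##*/}    # imageroot-042.tar.gz
echo ${path%.*}     # dir/sub/imageroot-042.tar
echo ${path%%.*}    # dir/sub/imageroot-042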

Friday 17 June 2011

Gnuplot

If you are using the last LTS of Ubuntu, then your Gnuplot version will be out of date. You can't get fancy stuff like transparent surface plots. To download and install the latest version you need to run these commands:

cvs -d:pserver:anonymous@gnuplot.cvs.sourceforge.net:/cvsroot/gnuplot login
cvs -z3 -d:pserver:anonymous@gnuplot.cvs.sourceforge.net:/cvsroot/gnuplot co -P gnuplot
cd gnuplot
./prepare
./configure --with-readline=gnu
make
sudo make install

This will break your Ubuntu packaging system though - so be careful.
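(If you want to keep the packaging system vaguely in the loop, one hedge is to let checkinstall build a removable .deb rather than running make install directly; a sketch, assuming the checkinstall package plays nicely with this source tree:)

sudo apt-get install checkinstall
sudo checkinstall

By default checkinstall runs "make install" for you, wrapping the result in a .deb that dpkg can later remove.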

Friday 10 June 2011

Ktikz

If you are doing quite a bit of Latex work with the Tikz graphical libraries then you will probably like to use a WYSIWYG program so that you can tweak your wonderful creations. Such a program for Ubuntu is ktikz.

To install it you need to install the latest version of TexLive. If using Ubuntu Natty or later then I think you get the latest version by:

sudo apt-get install texlive

With Maverick or earlier, you need to install the latest version manually to get all the bells and whistles:

wget http://mirror.ctan.org/systems/texlive/tlnet/install-tl-unx.tar.gz
tar -xzvf install-tl-unx.tar.gz 
cd install-tl-20110526/
sudo ./install-tl

The install will then run through - takes about an hour to download the stuff. You can also download the DVD image via torrent, which would be a good way to have a backup for quick reinstall. Once installed, you need to make sure you update your path:

PATH=/usr/local/texlive/2010/bin/i386-linux:$PATH

Installing the ktikz software is a bit easier. You just run the following commands to grab the dependencies in case you do not have them. BTW this is for Lucid only. There isn't a package for Maverick or Natty yet, but you could roll your own.

sudo apt-get install build-essential cmake libqt4-dev qt4-dev-tools libpoppler-qt4-dev kdelibs5-dev pgf preview-latex-style
wget http://www.hackenberger.at/ktikz/ubuntu_lucid/ktikz_0.10-1_i386.deb
sudo dpkg -i ./ktikz_0.10-1_i386.deb


You may need to preface some of the scripts with libraries to load:

\usepackage{tikz}
\usetikzlibrary{calc}
\usetikzlibrary{intersections}
\usetikzlibrary{through}

Friday 3 June 2011

Compressing PDF files

If you have got hold of a PDF file which comprises lots and lots of images and nothing else, it may well be huge if the images are not compressed. You can fix this at the command line in Ubuntu. You do this as follows:

You will need:

sudo apt-get install pdftk imagemagick

First you need to unpack the images from the PDF. Start this from a blank directory because we are going to automatically do things to all the files in this directory with a specific name.

pdfimages /path/to/filename.pdf imageroot

You get lots of:

imageroot-[three digit number].somethings

These somethings are either a ppm or a pbm filetype. These are very, very basic graphic image dumps - like a bmp image. The pbm files are 1-bit black and white (the texty stuff) and the ppm files are colour (the imagy stuff). I therefore use these filetypes as wildcards in the next command, but you would need to replace these with the correct image types that are generated by this command if your results are different. Such as if you choose to try and output jpeg files by using the [-j] option.
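That variant just looks like this; note that (as far as I can tell) [-j] only writes .jpg for images that were JPEG-encoded inside the PDF in the first place, and still falls back to ppm/pbm for the rest:

pdfimages -j /path/to/filename.pdf imageroot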

Next you compress each image to a pdf, one page long. This will not work properly if you did not start with a blank directory because we are going to command changes to be made to EVERY file in this directory with the extensions produced by the last command.

For colour sources, you need jpg compression:

for file in *.{ppm,pbm}; do convert -compress jpeg -quality 50 $file  ${file%.???}.pdf; echo $file; done

That applies the commands between the [do] and the [done] bits to all [file]s with the extension of either [{,}] [ppm] or [pbm]. The commands in the middle [convert] the image files into pdf files using [jpeg] [compress]ion with [quality] [50]%. You can obviously change the quality percentage to get the best results depending on your source material. The result is sent to a file whose name is constructed from the input [file] variable [$] less whatever three letter [???] extension [.] it has, plus the characters [.pdf]. The next bit just prints out the last filename processed so you can make sure it is doing something if you are processing LOTS of files.

For black and white sources you need fax compression:

for file in *.{ppm,pbm}; do convert -alpha off -monochrome -compress Group4 -quality 100 $file  ${file%.???}.pdf; echo $file; done

This is basically the same approach as last time, just with a change to the [convert] command. This time the [jpeg] stuff is gone, and we have the [-alpha off -monochrome -compress Group4 -quality 100] bit instead. I can't get the quality setting to do anything here, which makes sense: Group4 is a lossless scheme, so there is nothing for a quality percentage to trade away. The Group4 refers to the particular brand of fax compression which is applied.

Finally we take all of those individual pdfs and we combine them into one big one. Again, this will not work properly if you did not start with a blank directory. This copies EVERY pdf which matches the search string (the imageroot*.pdf bit where * means anything) into the final pdf. The classic error here would be making your image root name too similar to your original pdf name, with the result that the built pdf incorporates the original - hardly reducing the file size!

pdftk imageroot*.pdf cat output name_of_final_file.pdf

That uses the [pdf] [t]ool[k]it program to take every [*] file that starts with [imageroot] and ends with [.pdf] and con[cat]enates them into the [output] file named [name_of_final_file.pdf].

Simples.

Friday 27 May 2011

Testing Natty on Nvidia Hardware

The BIG THING about the Natty version of Ubuntu is that it comes with a prettified GUI. It has a pop out panel on the left hand side of the screen containing your launcher icons. This is absolutely NOT the same as Windows 7's task bar icons OR OSX's panel'o'icons because both of them are on the BOTTOM of the screen. Ahem.

Want to see what this looks like with the live cd? Have nvidia hardware? Tough shit. The nvidia driver installed on the live cd is a) open source (yay!) and b) does not support all the bells and whistles that this flashy new panel needs (boo!). So you will have to install the proper nvidia drivers. Which means rebooting. Which is pointless on a live cd.

Oh dear. This is going to be a pain in the fucking arse isn't it? Yes, yes it is.

For a start a couple of useful commands for this kind of fiddling are:

sudo service gdm stop

This stops dead the GUI leaving you to happily fiddle with graphics driver settings.

To restart the GUI you would just restart the gdm service yes? Yes.

sudo service gdm start

OK. So we have shut down the GUI. Now we want to download the nvidia drivers. I am presuming you have internet access from the LiveCD. If not, see my earlier posts on that hilarious situation. Download the drivers with this command:

wget http://uk.download.nvidia.com/XFree86/Linux-x86/270.41.19/NVIDIA-Linux-x86-270.41.19.run

Make the driver file executable:

sudo chmod +x NVIDIA-Linux-x86-270.41.19.run

And run it:

./NVIDIA-Linux-x86-270.41.19.run

You will now get some whining messages about nouveau bollicksing up the whole plan. This is the open source nvidia software we are trying to replace. So let's just [r]e[m]ove the nouveau [mod]ule:

rmmod nouveau

Doesn't fucking work.

sudo rmmod nouveau

Still doesn't fucking work. Google.

sudo rmmod --force nouveau

Oh for fucks sake. It still doesn't fucking work. It turns out the nouveau driver is deeply embedded in the OS. It ain't possible to just remove it. It has taken over something called the framebuffer. Without getting horrendously technical (shorthand for I have no fucking clue) this is the thing that gives you the text command line that you are using. So ubuntu is trying to stop you doing the I.T. equivalent of sawing off the branch you are standing on. If you turn off the framebuffer you will get a blank screen. Which is not great as far as user interfaces go, but still probably a better experience than Windows ME.

At this point you have two choices. First you can disable the in-depth use of nouveau, which will then allow you to remove it without fucking up the framebuffer. This needs to be done at boot time, each and every time you boot the LiveCD. Secondly, you can write a script that forces ubuntu to remove nouveau (fucking up the framebuffer in the process) and then immediately installs the nvidia drivers with the least input necessary.

The benefit with the first option is that you never get left with a blank screen, but it is a pain. The benefit with the second option is that it can all be scripted, but leaves you with a blank screen for several minutes while it installs the drivers. If it fails for any reason during the blank screen period, you're fucked.

So, option one first. Copy your downloaded driver file somewhere safe (USB Key). Or just download it again next time, what do I care. If you are using a LiveCD, reboot your LiveCD. When it starts back up it flashes up a symbol of a keyboard and a hand. Hit any key at that stage and you will get a boot menu. You want to choose to Test Ubuntu. You then want to hit F6 for other boot options.

If you are using the LiveCD image from a USB Key via UNetbootin, then just highlight the "Test..." option and hit the tab key to edit the boot command.

You should now be able to edit the boot command (it is a long string of options. See the joys of grub elsewhere in this blog for info on what this does). At the end of the boot command type in:

nomodeset

Now complete the boot as normal. Copy back, or re-download the nvidia driver to the home directory. Now, hit CTRL+ALT+F1 to get to a text interface, and make sure the driver file is executable:

sudo chmod +x NVIDIA-Linux-x86-270.41.19.run

:and shut down the GUI:

sudo service gdm stop

:and run:

sudo rmmod nouveau

It should now work. You can now install the driver:

sudo ./NVIDIA-Linux-x86-270.41.19.run

And it should eventually install happily. Answer the questions sensibly, ignoring any errors. Then just run this command to restart the GUI:

sudo service gdm start

Bask in the awe of the flashy bar.
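
If you want proof that the proprietary driver actually loaded, rather than just admiring the prettier pixels, these should confirm it (the /proc file only exists once the nvidia module is in):

lsmod | grep nvidia
cat /proc/driver/nvidia/version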

Option 2 is handled like this. Boot as normal - no need for any fancy boot commands. Copy back, or re-download the nvidia driver to the home directory. Open a terminal window and paste this code into it to set up your script:

sudo cat > inst_nv.sh << "EOF"
echo 0 > /sys/class/vtconsole/vtcon1/bind
rmmod nouveau
/etc/init.d/consolefont restart
rmmod ttm
rmmod drm_kms_helper
rmmod drm
chmod +x ./NVIDIA-Linux-x86-270.41.19.run
./NVIDIA-Linux-x86-270.41.19.run --accept-license --no-questions --no-nouveau-check --run-nvidia-xconfig --ui=none
service gdm start
EOF
chmod +x ./inst_nv.sh

(What does this script do? Well, the first line is a command which disconnects the frame buffer from the graphics driver. I think. The second command removes the graphics driver. The third command attempts, but I think fails, to restore some sort of text interface. The next three commands remove more unneeded modules connected with the graphics driver. The [chmod] command makes sure that the nvidia file is executable, and the command after that executes it. All the options added onto the execution of the driver package are designed so that it should just install without asking any stupid questions which you will be unable to read. The last command restarts the GUI.)

You now shut down gdm as before from the CTRL+ALT+F1 text interface:

sudo service gdm stop

:and run the script from the text command prompt:

sudo ./inst_nv.sh

Your screen will go blank. Your cd/dvd drive will hopefully spin up and down. After 5 minutes or so, hopefully, you will get a GUI. If it doesn't work, reboot and try option 1.

And hey fucking presto, there is the snazzy new bar.

Admittedly, after all that, it may be a touch underwhelming.

Friday 20 May 2011

Manually Add DNS Servers

My work machine is an Ubuntu machine on a Microsoft Small Business Server network. It gets all of its network settings by way of DHCP from the SBS Server. Sometimes the SBS Server has to be restarted. Most of the software I use can run offline, so that is not normally a problem.

What is a problem is that I immediately lose internet access because the SBS Server operates as the network's [D]omain [N]ame [S]erver. This is frustrating. Helpfully, Ubuntu has an option to manually add some extra DNS servers which SHOULD take over the burden from the SBS Server if it goes down.

To achieve this (in Lucid), go to System -> Administration -> Network.

From the program that pops up, select the Network Device drop down menu. Choose your network device. This will be ethX if you are connected by a wired connection. Then click on the Configure button next to the device name. From the new window that pops up, choose the DNS tab. You then need to click on the button between the Help and Close buttons at the bottom of the window, and put in your admin password to allow you to change the settings.

Now, just click on Add, and type in your new DNS IP address.
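
If you would rather do it from the command line, you can also stick a fallback nameserver straight into /etc/resolv.conf - bear in mind this is temporary, as the DHCP client will happily overwrite it at the next lease renewal. The 8.8.8.8 below is Google's public DNS, just as an example:

echo "nameserver 8.8.8.8" | sudo tee -a /etc/resolv.conf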

Hopefully this change means that the next time the server goes offline, I can still use the internet.

Friday 13 May 2011

Testing and Fixing SD Cards

You can test for bad blocks in a [n]on-destructive way by running:

sudo badblocks -snp # /dev/sdxy

where # is the number of passes to run. This report[s] the progress to you as it runs.
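
So a two-pass, non-destructive check of an SD card that turned up as /dev/sdc with a single partition would look like this - the device name is only an example, so check dmesg after plugging the card in, because pointing this at the wrong disk tests the wrong disk:

sudo badblocks -snp 2 /dev/sdc1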

You can then run:

sudo fsck.vfat /dev/sdxy -a -w

This [a]utomatically fixes problems by [w]riting changes to the disk. Any dodgy files you had will be renamed.

Finally, if you have cleared the disk of any remaining stuff, you can run a deep [w]rite and read test by running:

sudo badblocks -swp # /dev/sdxy

If you do not find any physical bad blocks, that suggests that the problem was a software one, and the card remains safe to use.

Friday 6 May 2011

Making a Natty Live CD

This is a further update of my Maverick and Lucid instructions.

I am assuming that you have got to a terminal window by following the start of the Lucid instructions either on a Windows or Ubuntu system.

First of all we are going to need some extra packages for our running system to manipulate the CD image. To install these, run this command from a terminal window.

sudo aptitude install squashfs-tools genisoimage

The CD image contains one large archive file which stores all of the disk environment for the LiveCD. The squashfs-tools handles this archive. Basically, we unpack everything, add our extra packages, and then repack everything. The genisoimage [gen]erates the final [iso] [image] which we put onto the USB Key in due course.

First, make a working directory in [~] home:

mkdir ~/live
cd ~/live

Next we are going to actually mount the CD image so we can use it as if we had burned it to a CD and inserted it. We need to create the mount point first of all:

mkdir mnt
sudo mount -o loop /media/[whatever]/ubuntu-11.04-desktop-i386.iso mnt

The [whatever] will differ depending on exactly where your system mounts the USB Key you just plugged in. The loop [o]ption allows us to mount the image file as a folder.
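
If you are not sure what to put in place of [whatever], just have a look at what appears under /media once the key is plugged in:

ls /media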

Now we need to copy all of the files off the mounted image APART from the large squashed file. Again we want a separate folder to store these files:

mkdir extract-cd
rsync --exclude=/casper/filesystem.squashfs -a mnt/ extract-cd

Next we need to extract the big archive file:

sudo unsquashfs mnt/casper/filesystem.squashfs

It uncompresses to a folder name we want to change by the typical linux method of [m]o[v]ing it.

sudo mv squashfs-root edit

We are going to be chrooting into the filesystem we just unpacked, and we want to use some of the files on our existing machine to point the way to the internet, and to give us access to our current hardware:

sudo cp /etc/resolv.conf edit/etc/
sudo cp /etc/hosts edit/etc/
sudo mount --bind /dev/ edit/dev

Now we chroot in and mount some virtual filesystems:

sudo chroot edit
mount -t proc none /proc
mount -t sysfs none /sys
mount -t devpts none /dev/pts

We also need to set some system variables and create a symbolic link for some reason or another.

export HOME=/root
export LC_ALL=C
dbus-uuidgen > /var/lib/dbus/machine-id
dpkg-divert --local --rename --add /sbin/initctl
ln -s /bin/true /sbin/initctl

I think the dbus command generates a code for this specific machine that some installation stuff may need. The [initctl] stuff, as far as I can make out, diverts initctl to /bin/true so that any package which tries to start a service during installation quietly succeeds at doing nothing - you really do not want services springing to life inside a chroot.

Excellent. We are now at the point where we can start to install stuff. First of all, and this was fucking frustrating trying to work this out, we need to enable the universe and multiverse repositories if we want to install packages from them. We need to do this in the command line:

sudo nano /etc/apt/sources.list
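
What you are looking for in there are the deb lines. On the Natty image they should end up looking something like this - the mirror URL may well differ from mine, the important bit is that universe and multiverse appear at the end of each line:

deb http://archive.ubuntu.com/ubuntu natty main restricted universe multiverse
deb http://archive.ubuntu.com/ubuntu natty-updates main restricted universe multiverse
deb http://security.ubuntu.com/ubuntu natty-security main restricted universe multiverse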

Let's add the PPA's that I covered in a previous post so we can install the useful software they include:

sudo add-apt-repository ppa:stebbins/handbrake-releases
sudo add-apt-repository ppa:jd-team/jdownloader

This next command adds an external repository (which is a bigger and more complex type of PPA) to the list of places we can download stuff from. It's a biggie but it basically is just a series of commands that run in sequence.

sudo wget http://www.medibuntu.org/sources.list.d/`lsb_release -cs`.list --output-document=/etc/apt/sources.list.d/medibuntu.list; sudo apt-get -q update; sudo apt-get --yes -q --allow-unauthenticated install medibuntu-keyring; sudo apt-get -q update

Before installing stuff, it might be a good idea to clean out some stuff we are not going to use. First of all have a look at all the installed packages in order of size:

dpkg-query -W --showformat='${Installed-Size} ${Package}\n' | sort -nr | less

The first on the list looks like a massive package, but what you have to understand is that this is showing you the UNCOMPRESSED sizes. Yeah, thanks for that. If you check on http://packages.ubuntu.com/lucid for the ubuntu-docs information, you find it is only taking up a few hundred Kb when squashed.

So what can you remove? Evolution is a prime candidate. Not much use on a LiveCD. You will be using webmail from a LiveCD.

dpkg-query -W --showformat='${Installed-Size} ${Package}\n' | sort -nr | grep evolution

This will show you all packages which have evolution in the title. It should look like this:

9824 evolution-common
5560 libevolution
2148 evolution-exchange
1580 evolution-data-server
1084 evolution
764 evolution-plugins
380 evolution-data-server-common
124 evolution-indicator
88 evolution-webcal

You can remove all, apart from evolution-data-server-common which is needed by other applications, by running this command:

apt-get remove --purge evolution-common libevolution evolution-exchange evolution-data-server evolution-webcal evolution evolution-plugins evolution-indicator

The language packs also take up a lot of space, and I do not need anything but English. Find these by running:

dpkg-query -W --showformat='${Installed-Size} ${Package}\n' | sort -nr | grep language-

and then remove the ones we do not want:

apt-get remove --purge language-pack-gnome-xh-base language-pack-xh-base language-pack-gnome-zh-hans-base language-pack-zh-hans-base language-pack-gnome-es-base language-pack-gnome-pt-base language-pack-gnome-de-base language-pack-pt-base language-pack-es-base language-pack-de-base

There are other language packs, but they are too small to worry about clearing up unless you are intent on getting this image as small as possible. The final obvious low hanging fruit are foreign font sets. Do a search for [t]rue[t]ype[f]ont packages:

dpkg-query -W --showformat='${Installed-Size} ${Package}\n' | sort -nr | grep ttf

12580 ttf-unfonts-core - Korean
6196 ttf-takao-pgothic - Japanese
5456 ttf-thai-tlwg - Thai
5184 ttf-wqy-microhei - Chinese
4204 ttf-freefont - Latin, keep
2628 ttf-indic-fonts-core - Indian
2592 ttf-dejavu-core - Latin, keep
2336 ttf-ubuntu-font-family - These are new to Natty, and are used widely in ubuntu, so best keep them.
1724 ttf-liberation - Latin, keep
592 ttf-khmeros-core - Cambodian
500 ttf-opensymbol - Symbols, needed for OpenOffice, keep
220 ttf-punjabi-fonts - Punjabi
144 ttf-lao - Lao, wherever Lao is
116 ttf-kacst-one - Arabic

apt-get remove --purge ttf-unfonts-core ttf-takao-pgothic ttf-thai-tlwg ttf-wqy-microhei ttf-indic-fonts-core ttf-khmeros-core ttf-punjabi-fonts ttf-lao ttf-kacst-one

I am not going to be printing from the LiveCD so I can remove the software and drivers as follows. Do not worry about the ubuntu-desktop metapackage - it is just an easy way of ensuring that all the ubuntu basic stuff is installed. As soon as printing is removed, the installation no longer qualifies as an ubuntu desktop, but it doesn't actually remove the rest of the stuff.

apt-get remove --purge cups hplip-data libgutenprint2

I am also going to remove banshee, the rhythmbox replacement, because it does take up a lot of space, and like evolution it is really the type of application you need to set up on an installed machine rather than running from a LiveCD. I have no intention of playing sudoku.

apt-get remove --purge banshee gnome-sudoku

Even after all of those deletions, once I recompressed the image I found I had saved a paltry 90Mb or so. Ho hum. Once you have carried out all the removals, a good strategy is to upgrade all your remaining packages to the latest versions. You do NOT want to upgrade the Kernel or Grub because that causes bad things to happen and will stop the Live USB stick booting. So, create the following files to 'pin' those packages to their current versions:

sudo cat > hold_back_kernel << "EOF"
Package: linux-generic linux-headers-generic linux-image-generic linux-restricted-modules-generic
Pin: version 2.6.38.8.22
Pin-Priority: 1001
EOF
sudo mv hold_back_kernel /etc/apt/preferences.d/
sudo cat > hold_back_grub << "EOF"
Package: grub-common
Pin: version 1.99~rc1-13ubuntu3
Pin-Priority: 1001
EOF
sudo mv hold_back_grub /etc/apt/preferences.d/
sudo apt-get update

You can check that the version numbers are correct (they should be for the Live CD you have downloaded) and check the system knows about the new rules by running these commands:

sudo apt-cache policy
dpkg -l linux-generic
dpkg -l grub-common

The last few lines of all of that should look like this (you can see the version numbers match up):

Pinned packages:
     linux-headers-generic -> 2.6.38.8.22
     linux-image-generic -> 2.6.38.8.22
     grub-common -> 1.99~rc1-13ubuntu3
     linux-generic -> 2.6.38.8.22
root@Dellbuntu:/# dpkg -l linux-generic
Desired=Unknown/Install/Remove/Purge/Hold
| Status=Not/Inst/Conf-files/Unpacked/halF-conf/Half-inst/trig-aWait/Trig-pend
|/ Err?=(none)/Reinst-required (Status,Err: uppercase=bad)
||/ Name                                          Version                                       Description
+++-=============================================-=============================================-==========================================================================================================
ii  linux-generic                                 2.6.38.8.22                                   Complete Generic Linux kernel
root@Dellbuntu:/# dpkg -l grub-common
Desired=Unknown/Install/Remove/Purge/Hold
| Status=Not/Inst/Conf-files/Unpacked/halF-conf/Half-inst/trig-aWait/Trig-pend
|/ Err?=(none)/Reinst-required (Status,Err: uppercase=bad)
||/ Name                                          Version                                       Description
+++-=============================================-=============================================-==========================================================================================================
ii  grub-common                                   1.99~rc1-13ubuntu3                            GRand Unified Bootloader, version 2 (common files)

You should now be able to run the upgrade excluding those troublesome packages.

sudo apt-get upgrade

We can now run a massive install command to add the extra packages that we want.

sudo apt-get install \
\
libgtk2.0-dev bison texinfo \
\
flashplugin-installer openjdk-6-jre icedtea6-plugin \
\
libreoffice-java-common libreoffice-l10n-en-gb libreoffice-help-en-gb openoffice.org-hyphenation-en-us openoffice.org-thesaurus-en-us \
\
ttf-mscorefonts-installer \
\
smplayer vlc avidemux audacity \
\
totem-plugins-extra gstreamer0.10-pitfdll gstreamer0.10-ffmpeg gstreamer0.10-plugins-bad gstreamer0.10-plugins-bad-multiverse gstreamer0.10-plugins-ugly gstreamer0.10-plugins-ugly-multiverse \
\
non-free-codecs libavcodec-extra-52 libdvdcss2 libdvdread4 libdvdnav4 \
\
lame mjpegtools twolame mpeg2dec liba52-0.7.4-dev ffmpeg ffmpeg2theora w32codecs \
\
keepassx handbrake-gtk jdownloader \
\
xchm comix gqview pdfmod \
\
wine \
\
transmission \
\
dcraw gimp gimp-data-extras gimp-help-en \
\
celestia celestia-common celestia-common-nonfree stellarium googleearth-package lsb-core \
\
git-core subversion libssl-dev \
\
monodevelop

To install Google Earth from the package downloaded above, you run:

make-googleearth-package --force

You then need to install the package the previous command creates. At the time of writing the version numbering creates a package which can be installed with this command (but this is likely to change):

sudo dpkg -i googleearth_6.0.2.2074+0.6.0-1_i386.deb

I also want a bit of software which is not available in the repositories. Truecrypt is a handy encryption system. It is also a pain in the arse to get hold of, since they want you to accept their stupid licence instead of just releasing under the GPL.

cd /tmp
wget http://www.truecrypt.org/download/truecrypt-7.0a-linux-x86.tar.gz
tar -xzvf truecrypt-7.0a-linux-x86.tar.gz
./truecrypt-7.0a-setup-x86

When the setup program runs, choose the option to extract the package file rather than install - it automatically stores the package in /tmp. We then need to extract that package to /, and it auto installs to the correct folders. So:

cd /
tar -xzvf /tmp/truecrypt_7.0a_i386.tar.gz

And that is that for Truecrypt.

If you were stupid enough to upgrade the kernel package, the kernel on the disk will no longer match the one the CD boots with. To ensure that the new kernel is actually used, you need to go into the ...
~/live/edit/boot
...folder and copy the latest versions of the vmlinuz compressed kernel and the initrd.img files to the ...
~/live/extract-cd/casper
...folder. You then need to delete the existing initrd.lz file, and rename the initrd.img file you just copied over to replace it. Do the same with the vmlinuz files. This has NEVER worked for me, so I do not upgrade the kernel package.

If you want the clock in the LiveCD machine to show the proper time, take a moment and set your time zone and keyboard for UK use:

sudo setxkbmap gb
sudo cp -v --remove-destination /usr/share/zoneinfo/Europe/London /etc/localtime

Of course, the keyboard command does not fucking work - setxkbmap talks to a running X server, and there is no X server running inside the chroot. I now have no fucking idea how to change the keyboard map from the command line. Brilliant. I mean there must be a file somewhere, anywhere, that actually stores these settings (possibly /etc/default/keyboard, but I have not tested it). Moving on:

We now need to clean up some user account stuff in case any of the installed packages made changes:

awk -F: '$3 > 999' /etc/passwd
usermod -u 500 $hit # where $hit is the name of any user the first command finds with an ID greater than 999
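
So if the awk command spits out a line like this (a made-up example):

gamer:x:1001:1001:,,,:/home/gamer:/bin/bash

then the fix is:

usermod -u 500 gamer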

Right, and we are good to do a general clean up:

apt-get clean
rm -rf /tmp/* ~/.bash_history
rm /etc/resolv.conf
rm /var/lib/dbus/machine-id
rm /sbin/initctl
dpkg-divert --rename --remove /sbin/initctl
umount -l /proc
umount /sys
umount /dev/pts
exit
sudo umount edit/dev

The [rm] commands obviously [r]e[m]ove stuff, and the remaining commands undo the setup commands we used to get into the [chroot] environment. Virtually every time I do this I get a fucking annoying error telling me that it can't umount these things because they are in use. Well, you know what I say to that? Hello Mr. Reboot.

Once the system has come back on, fire up a terminal and:

cd ~/live

Now, update the .manifest file which is a list of installed packages:

chmod +w extract-cd/casper/filesystem.manifest
sudo chroot edit dpkg-query -W --showformat='${Package} ${Version}\n' > extract-cd/casper/filesystem.manifest
sudo cp extract-cd/casper/filesystem.manifest extract-cd/casper/filesystem.manifest-desktop
sudo sed -i '/ubiquity/d' extract-cd/casper/filesystem.manifest-desktop
sudo sed -i '/casper/d' extract-cd/casper/filesystem.manifest-desktop

Now delete the existing squashed archive and replace it. Don't worry if it cannot find one to delete - there shouldn't be one the first time you run through these instructions.

sudo rm extract-cd/casper/filesystem.squashfs
sudo mksquashfs edit extract-cd/casper/filesystem.squashfs

That last command will take the most time of anything here, as it (re)compresses everything we kept. If you are feeling particularly narcissistic you can edit the disk details (change the image name) that pop up on boot:

sudo nano extract-cd/README.diskdefines
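
While you are at it, it is worth checking that the new archive has not ballooned past what will fit on an actual CD - the finished iso needs to stay under roughly 700Mb, though a USB Key is more forgiving:

ls -lh extract-cd/casper/filesystem.squashfs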

Now we need to rebuild the md5sum check file:

cd extract-cd
sudo rm md5sum.txt
find -type f -print0 | sudo xargs -0 md5sum | grep -v isolinux/boot.cat | sudo tee md5sum.txt

That will probably take two [sudo] password requests - one for the straight [sudo] and one for the [| sudo] piped version. Presumably both of them start up before either has had the chance to cache your credentials. Irritating as fuck.

And finally we need to build a new iso image. The "$IMAGE_NAME" below is whatever volume name you want the disk to have - either set the variable first or just type a name in between the quotes:

sudo mkisofs -D -r -V "$IMAGE_NAME" -cache-inodes -J -l -b isolinux/isolinux.bin -c isolinux/boot.cat -no-emul-boot -boot-load-size 4 -boot-info-table -o ../ubuntu-11.04-desktop-i386-custom.iso .

Once you have that image you use the unetbootin software (which comes in both windows and linux flavours) to load the image onto the USB Key. You need to use version 494 at least to get it to work with Natty. Job done.

Friday 29 April 2011

Adding a PPA in Maverick

Happy day for those ubuntu users who want the video transcoding application Handbrake. I find this useful because it is the only application that will spit out video that an iPhone or iPad will actually play. Anyway, my live cd instructions contained a breathless explanation of how to build handbrake from source. You no longer need to do that. You just need a ppa instead.

OK, what's a PPA? It stands for Personal Package Archive. It is like an official ubuntu source of packages, except it is not maintained by Canonical. These are useful for all kinds of things. Here it is used for software not tested by the people who make ubuntu, and which doesn't form part of the ubuntu repositories of packages.

In addition, a useful download manager is also now available for ubuntu in a ppa: jdownloader. What we are about to do is add two ppa's to our list of sources of ubuntu applications, then update our list of what applications are contained in all of our sources, and then we will install the two applications in question.

sudo add-apt-repository ppa:stebbins/handbrake-releases
sudo add-apt-repository ppa:jd-team/jdownloader
sudo apt-get update
sudo apt-get install handbrake-gtk jdownloader

Friday 22 April 2011

Batch flv to mp4 conversion

If you have been happily downloading and storing flash videos you may end up with a whole pile of flv files. My instructions above included a command to convert each file into an mp4 file which is much easier to work with.

What about converting a whole folder full at once?

Try this:

for file in *.flv ; do ffmpeg -i "$file" -acodec copy -vcodec copy "${file%.flv}.mp4" ; done

That three stage command does the following (in English):

The first bit before the semi colon means, [for] each [*] file that you find in the current folder with the extension [flv], take the full name of the file forward into the next part of the command as a variable [file]. The next bit says that you [do] the command [ffmpeg] using as an [i]nput the variable [$] called [file]. The options for the ffmpeg command are to [copy] the [a]udio [codec] and the [v]ideo codec, so no recompression. You retain the same data but in a different container. The quotes around the variables are there so that filenames with spaces in them do not blow the whole thing up. The output of the command is a file with a name made up of the variable [$] called [file] minus [%] the characters [.flv] PLUS the characters [.mp4]. The last bit after the second semi colon closes the command sequence started by the [do]. If you felt brave you could stick a [r]e[m]ove command in before the [; done] to get rid of your input files.
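
For the record, here is that braver version - the [&&] means the original only gets [r]e[m]oved if ffmpeg actually exited happily:

for file in *.flv ; do ffmpeg -i "$file" -acodec copy -vcodec copy "${file%.flv}.mp4" && rm "$file" ; done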

Friday 15 April 2011

Listening to BBC Radio Shows on an MP3 player

In the name of bastard fuck, this was far fucking harder than it had any right to be.

You have Windows Media Centre installed on your PC. Either XP, or Vista or 7. Whatever. Fine. You record DVB-T broadcast TV programs and watch them quite happily. You record the odd radio program (also broadcast over DVB-T), and you can play it back on the TV. But who wants to sit around the TV listening to radio programs, what do you think this is, the 1930s?

You have recorded an audio file. You want to listen to the audio on another device. How fucking difficult do you think this is going to be? Oh, boy you are in for a fucking treat.

So, how do we do this? I should interject here and say that there is a fairly straightforward, albeit time-consuming, way to deal with this. You fire up audacity, change the input source to be the sound card audio out, then hit record and play whatever it is you want to record. The problem with this is that you are recording a compressed track so you are going to lose quality. In my view the BBC are the experts when it comes to digital audio compression and transmission, and it is going to spoil whatever they broadcast if I recompress it. It's like sitting in the back row of the cinema with your digital video camera. You are not exactly going to walk out with a Blu-ray quality version of the film are you?

First thing. If using Vista or Windows 7, media centre spits out files with the extension .wtv. These are unreadable by anything but windows, so we need to convert them into a more common format. Helpfully, if you right click on the file you will find an option to convert it to a dvr-ms file.

What I would then ordinarily do is fire the file up in Video Redo. This is an excellent product for taking in a dvr-ms file, cutting out all the bits you do not want (ads and so forth) and then spitting out the original video and audio streams in an mpg file. You can then happily load that mpg file into avidemux, or your editor of choice and convert it to whatever format you desire.

However, Video Redo shits itself when it is given a dvr-ms file without any video. It just cannot handle it. The more recent versions of Video Redo may be different, but I am not paying for an upgrade for something this simple.

So we want to extract the audio in some other fashion. When something is difficult, then make it harder by doing it on the command line. If we want robust command line editing tools then we want Linux. I have tried ffmpeg under ubuntu with a command that looks a bit like this:

ffmpeg -i filename.dvr-ms -acodec copy filename.mp2

Doesn't fucking work. Apparently ffmpeg also cannot handle the earth-shattering reality of a dvr-ms file without a fucking video stream.

How about:

mplayer -dumpaudio -dumpfile audio.mp2 filename.dvr-ms

No fucking joy. Cannot ever find an audio stream. It turns out that vlc is practically the only fucking thing on the planet that will spit the untouched audio out of a BBC Digital TV Broadcast Radio Recording. The command is this gem:

cvlc "[filename].dvr-ms" vlc://quit --sout '#transcode{vcodec=none}:duplicate{dst=std{access=file,mux=raw,dst="[filename].mp2"}}'

That's a fucking little beauty, isn't it? Far more than you could ever realistically want to know about those options can be found here and here.
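
And if you have a backlog of these recordings, the batch for-loop trick from my flv post works here too. An untested sketch - note the quoting gymnastics needed to get the expanded filename inside vlc's --sout string:

for file in *.dvr-ms ; do cvlc "$file" vlc://quit --sout "#transcode{vcodec=none}:duplicate{dst=std{access=file,mux=raw,dst=\"${file%.dvr-ms}.mp2\"}}" ; done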

Friday 8 April 2011

How to Post Maths in Blogspot: MathJax & QuickLatex

Edit your template and add the following before the closing "head" tag:

<script type=\"text/x-mathjax-config\">
MathJax.Hub.Config({
  MMLorHTML: {
    prefer: {Firefox: \"HTML\"}
  }
});
</script>
<script type=\"text/javascript\" src=\"http://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML-full\"></script>

Then you just stick the TeX stuff in backslash square brackets for standalone formulae or backslash rounded brackets for in text symbols.
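
For example, a standalone formula goes in like this:

\[ e^{i\pi} + 1 = 0 \]

while an in-text symbol like \( \pi \) just gets wrapped in the rounded version.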

To get graphics you need to use Quicklatex. You need to drop down the choose options box on that page and then insert this text to activate the Tikz graphics package, with the tikz library that lets you calculate co-ordinates using algebra:

\usepackage{tikz}
\usetikzlibrary{calc}

You then get the URL of an image which you can stick in your blog.
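
As a taste of what the calc library buys you, here is a minimal sketch that draws a line and plonks a dot at its midpoint using coordinate algebra:

\begin{tikzpicture}
\draw (0,0) -- (4,2);
\fill ($(0,0)!0.5!(4,2)$) circle (2pt);
\end{tikzpicture}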

Friday 1 April 2011

Can I Exchange this for something that works?

OK, so there you are, running Ubuntu on your work PC. You can connect to the office Exchange server using Evolution so you can get all your email. You are doing all of this using open source software. You are most likely feeling pretty smug.

Eventually you will want to leave the office, so you want to fire on the out of office notifications in Evolution. You read about how to do so. You find out you need to follow this chain of menus and buttons:

Edit -> Preferences -> Edit -> Account Editor -> Exchange Settings -> I am out of the office.

However, at the second "Edit" you get an "Evolution Error" window informing you that the lovely open source program you were oh so smug about "could not read out-of-office state".

Bastard. The problem is that if you google for help on this error, the search terms "state" and "evolution" tend to pollute the results with pages about the particular subset of Americans who could politely be called reality-challenged. They could also impolitely be described as the type of fucking cretins who would happily watch their friends and family combusting in agonising pain if their sacred text of choice said that humans were impervious to fire.

Even searching in the Ubuntu Forums, I cannot find an answer to this. Or, come to it, why even though I have selected Global Contact List for Auto Completion in the settings, it doesn't fucking auto complete email addresses from the Global Contacts List.

Friday 25 March 2011

A Backup Scheme Described

I wrote a bit about my backup routine in my rant about TrueImage. I wanted to set out in more detail the scheme I have set up because it also incorporates some mklinking referred to in my previous post.

My previous example of mklinking dealt directly with steam applications. There is another type of data that you do not want to have clogging up an SSD. That is any large file that you only ever access sequentially. Or in non-computer science lingo, music, videos and photos. These can be large files (a modern SLR camera will happily spit out 10Mb image files), but you do not need the super fast access time of an SSD to use them properly.

Think about it, you could happily watch a movie from a DVD player. That has a PATHETIC data rate and seek time compared to an SSD, but it works fine for movie files. Movie files are also LARGE. My Sanyo camcorder mentioned elsewhere spits out files at up to 20mins/Gb. They take up a lot of space. If a 64Gb SSD (with usable space of say 60Gb) was used exclusively for these video files you would fit no more than 20 hours of video on it. For £60. That's £3/hour for storage of video. That makes no sense. Not when at the time of writing you can pick up a 2Tb classic drive for the same £60. OK, that 2Tb drive is only actually about 1.8 real Terabytes. Still, that's 620 hours of video for £60. That's £0.09/hour.

So for large data files we want the 9p slow and steady storage, not the £3 super fast storage. What I put on the classic disk is basically the contents of My Videos, My Music, My Pictures, and My Documents. To do this I create a directory on the classic disk with appropriately named folders in it. If your OS is on the SSD, then your classic disk is probably mounted at d:. You can do this with the mouse and gui, but the dos prompt commands (windows key + r, type cmd, hit enter) look like this:

cd /d d:\
mkdir Data
cd Data
mkdir Videos
mkdir Music
mkdir Pictures
mkdir Documents

For the benefit of the hard of thinking, those commands [c]hange [d]irectory to the [\] root of the [:] disk assigned the letter [d] (the extra /d flag tells cmd to switch drive at the same time, which a plain cd will not do). They next [m]a[k]e a new [dir]ectory, [c]hange into that new [d]irectory, and [m]ake some other [dir]ectories.

You then fill up those folders with the data of choice. Copy this from your non-SSD old system drive, or from your up to date back up. Then, if using windows 7, you need to delete the Videos, Music, Pictures, Documents from your user folder. There should be nothing in those folders on a fresh install - but check anyway, and if there is then move it to the folders on the classic disk we just made.

cd /d C:\Users\[your username here]
xcopy Videos D:\Data\Videos /E /H /K
rmdir Videos /S
xcopy Music D:\Data\Music /E /H /K
rmdir Music /S
xcopy Pictures D:\Data\Pictures /E /H /K
rmdir Pictures /S
xcopy Documents D:\Data\Documents /E /H /K
rmdir Documents /S

[xcopy] is a special dos command that is used to copy large collections of files at once. It is basically more flexible and powerful than the standard dos [copy] command. For our purposes it is exactly the same as Control+c'ing a folder and Control+v'ing that folder somewhere else. The [/E] flag tells it to copy all subfolders even if they are empty, the [/H] flag makes sure it copies hidden or system files, and the [/K] flag copies the attributes of the files being copied (read only etc) as well. The other command [r]e[m]oves the specified [dir]ectory and all its [/S]ubdirectories.

Finally you want to set up the mklinks:

mklink /J "C:\Users\[your username here]\Documents" "D:\Data\Documents"
mklink /J "C:\Users\[your username here]\Videos" "D:\Data\Video"
mklink /J "C:\Users\[your username here]\Pictures" "D:\Data\Pictures"
mklink /J "C:\Users\[your username here]\Music" "D:\Data\Music"

You should now be up and running. If you go to the start button and click on Music, the folder that should open is the one on the classic disk. The benefit to using mklink is that it appears to the OS and all software running on it (such as iTunes) that all this data is in your user folder on the SSD, so you do not need to specially configure any other software. It should just work.

Now you have set that up, backups are a doddle. Buy a large external 3.5" disk that comes with its own power supply. Yes, it is bulky. Yes, it is a pain to have to carry around a data and power cable. BUT it is much cheaper per Mb than a 2.5" USB powered external disk. And all it is going to do is sit in the vicinity of your main machine ready to be used as a backup device at a moment's notice. You then just need Allway Sync, or a similar folder syncing application, to sync the d:\data folder onto your external drive. Make sure you sync that ONE folder every so often (i.e. when you have bought new iTunes music, written a new document, or uploaded a camera card to your desktop) and you have a backup copy of ALL of your data.

I find the advantages of this system are:

Simplicity. The backup is just a copy of all your data files. You can plug the external disk into another machine running virtually any desktop OS and you have instant access to your files.
Cost. Folder Synchronisation software is inexpensive (Allway Sync is nagware).
Speed. All I do is open Allway Sync, choose the "Data" profile I have set up, run an analysis just to check, and then run a sync. It does not take a long time unless you have added a crap load of stuff.
Flexibility. Running dual OS's? Simply link Ubuntu's similarly named data folders to the ones on the windows disk (see the sketch below). Hey presto, same data, two different OS's.
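
Here is what that dual-OS linking might look like from the Ubuntu side. A sketch only - I am assuming the windows data disk mounts at /media/windows, which will differ on your machine, and the folder being replaced must be empty or rmdir will refuse:

rmdir ~/Music
ln -s /media/windows/Data/Music ~/Music

Repeat for Videos, Pictures and Documents.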

There are disadvantages to this system:

Versioning. The backup is a simple mirror of your files. It does not keep multiple copies of your files, so you cannot undo a change, or deletion, once you have synced. You could implement this, but you would need more expensive dedicated backup software. As far as I am concerned, I do not want to lose family media. I am not overly exercised about restoring last Tuesday's copy of a 1Gb video file rather than last Wednesday's. Generally the data in these folders is NOT going to change, it is going to be added to or deleted, but not altered.
Onsite. The backup disk is in the same room as the desktop with the original data, which is not going to help if the equipment is stolen, or the building burns down. For that protection, you need to pay for an online backup solution - or get an ISP who provides one.

I prefer the advantages to the disadvantages. I do see a need for a versioning backup system, which is why I bought TrueImage Home. But I use that pretty exclusively for the OS disk. With that you absolutely want to be able to go back to last Tuesday, or "whenever I didn't have this bastard virus".

You may also want to move some other folders from your user folder on the SSD to the classic disk. For me a priority was the Downloads folder. You delete the original folder as normal and then:

mkdir d:\Data\Downloads
mklink /J "C:\Users\[your username here]\Downloads" "D:\Data\Downloads"

You probably do not want to backup the downloads folder - I use mine as a scratch area for downloading loads of rubbish. If I want to keep stuff I download, I shift it to an appropriate data folder. So you will now want to edit the profile you have set up in Allway Sync to EXCLUDE the Downloads folder from the backup. This saves space on your external disk.

You can also get cute with the exclusion filters INSIDE the folders you are syncing. So if you have a Temp scratch area in Music or Videos, you can exclude that from the backup as well.