This one predicts (okay, procedurally generates) text word by word.  Others do it letter by letter, but then you have to deal with misspellings.  src. Inspiration.

Dependencies

I'm basing this off an installation script referenced in the github readme (here).  Instead of blindly running the script, I checked its contents and discovered that it fails on Linux Mint.  So I simplified it and changed/added a thing or two, since it wasn't working anyway.  In case anyone wants to blindly run my version, here it is, all cleaned up.

Updating your sources is always a good idea before installing a bunch of software.  The package below also makes it easy to add new repositories later.
sudo apt-get update
sudo apt-get install -y python-software-properties
Now things are getting serious.  This batch includes GCC 4.9, since GCC 5 doesn't seem to be compatible with Torch 7.  There was a weird error where cmake wasn't installed when I put all the packages into a single apt-get install command, so I split each package onto its own line, ran my install script again, and it worked.  Very odd.  Hence the inefficient code below.
sudo apt-get install -y build-essential 
sudo apt-get install -y gcc
sudo apt-get install -y g++
sudo apt-get install -y curl
sudo apt-get install -y cmake
sudo apt-get install -y libreadline-dev
sudo apt-get install -y git-core
sudo apt-get install -y libqt4-core
sudo apt-get install -y libqt4-gui
sudo apt-get install -y libqt4-dev
sudo apt-get install -y libjpeg-dev
sudo apt-get install -y libpng-dev
sudo apt-get install -y ncurses-dev
sudo apt-get install -y imagemagick
sudo apt-get install -y libzmq3-dev
sudo apt-get install -y gfortran
sudo apt-get install -y unzip
sudo apt-get install -y gnuplot
sudo apt-get install -y gnuplot-x11
sudo apt-get install -y ipython
sudo apt-get install -y libpcre3-dev
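If you'd rather not maintain twenty near-identical lines, the same installs can be driven from a list.  A minimal sketch that just prints each command so you can eyeball it first (drop the echo to actually install):

```shell
# All the packages from above, in one list.
packages="build-essential gcc g++ curl cmake libreadline-dev git-core \
libqt4-core libqt4-gui libqt4-dev libjpeg-dev libpng-dev ncurses-dev \
imagemagick libzmq3-dev gfortran unzip gnuplot gnuplot-x11 ipython libpcre3-dev"

# Print one install command per package (remove `echo` to run them for real).
for pkg in $packages; do
  echo sudo apt-get install -y "$pkg"
done
```

This keeps the one-package-per-invocation behavior that worked around the cmake glitch, without the copy-paste.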
These are for OpenBLAS, an open-source implementation of BLAS (Basic Linear Algebra Subprograms), the standard interface for low-level vector and matrix operations.  Here's a bit more on the concept.
sudo apt-get install -y libopenblas-dev liblapack-dev
Install Torch (finally).  Note that when running ./install.sh, I'm piping in a "Y" so the prompt gets answered automatically.  The script is asking for permission to add Torch to the system PATH in your .bashrc: essentially, creating a shortcut that gets set up each time a bash terminal starts.
git clone https://github.com/torch/distro.git ~/torch --recursive
cd ~/torch
echo "Y" | ./install.sh
source ~/.bashrc
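To see why the echo "Y" pipe works: a script that prompts for input reads its answer from stdin, so anything you pipe in gets consumed as the response.  A tiny stand-in demo (prompt_demo.sh is made up for illustration; the real prompt lives inside Torch's install.sh):

```shell
# Create a toy script that asks a yes/no question, like install.sh does.
cat > /tmp/prompt_demo.sh <<'EOF'
printf "Add torch to PATH? (Y/n) "
read ans
echo "answered: $ans"
EOF

# Piping "Y" in answers the prompt with no keyboard input needed.
echo "Y" | sh /tmp/prompt_demo.sh
```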
These tools are installed with LuaRocks, the package manager for Lua, the programming language Torch is built on.
/home/$USER/torch/install/bin/luarocks install nngraph
/home/$USER/torch/install/bin/luarocks install nninit
/home/$USER/torch/install/bin/luarocks install optim
/home/$USER/torch/install/bin/luarocks install nn
/home/$USER/torch/install/bin/luarocks install underscore.lua --from=http://marcusirven.s3.amazonaws.com/rocks/
sudo /home/$USER/torch/install/bin/luarocks install lrexlib-pcre PCRE_DIR=/lib/x86_64-linux-gnu/ PCRE_LIBDIR=/lib/x86_64-linux-gnu/
Make sure to add Torch to your PATH.  This ensures that when you run the th command, your shell can actually find the program.
to_path="/home/$USER/torch/install/bin"
echo "PATH=\$PATH:$to_path" >> /home/$USER/.bashrc
source ~/.bashrc
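One quoting subtlety worth knowing here: inside double quotes, an unescaped $PATH expands immediately, so the line written to .bashrc would contain a frozen copy of whatever your PATH was at write time rather than a reference to the variable.  Escaping the dollar sign keeps it literal.  A sketch against a scratch file (/tmp/bashrc_demo stands in for the real .bashrc):

```shell
to_path="/home/$USER/torch/install/bin"

# Unescaped: $PATH expands right now, baking today's PATH into the file.
echo "PATH=$PATH:$to_path" > /tmp/bashrc_demo

# Escaped: the literal text $PATH is written, so it expands fresh at each shell startup.
echo "PATH=\$PATH:$to_path" >> /tmp/bashrc_demo

cat /tmp/bashrc_demo
```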
With that, all dependencies should be satisfied. Time to install.

Installation

Clone the software from github.
mkdir ~/projects
cd ~/projects
git clone https://github.com/larspars/word-rnn.git
That's actually all there is to it.  Now cd into the word-rnn directory to run the test stuff.  Before the tests and tools, though, there's one fix you have to make.

Extra Fixes

I'm running Torch on a GPU-less computer.  There's a glitch that occurs when running the test script in that scenario.  To avert it, you have to change a tensor constructor from CudaTensor (the GPU type) to Tensor (the CPU type).  See here for details.
cd word-rnn/util
nano SharedDropout.lua
The third line should look like this:
SharedDropout_noise = torch.CudaTensor()
Change it to this:
SharedDropout_noise = torch.Tensor()
Then save and exit:
CTRL + O, then Enter
CTRL + X
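If you'd rather script the edit than open nano, a one-line sed does the same substitution.  Sketched below on a scratch copy (swap /tmp/SharedDropout_demo.lua for the real util/SharedDropout.lua):

```shell
# Scratch copy standing in for util/SharedDropout.lua.
printf 'SharedDropout_noise = torch.CudaTensor()\n' > /tmp/SharedDropout_demo.lua

# Replace the CUDA tensor constructor with the CPU one, editing the file in place.
sed -i 's/torch\.CudaTensor()/torch.Tensor()/' /tmp/SharedDropout_demo.lua

cat /tmp/SharedDropout_demo.lua
```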

Running word-rnn

Now you can run the training script (the -gpuid -1 flag tells it to use the CPU instead of a GPU).  Be aware that on a 4-core 2.8 GHz i7 processor, this command took 21 hours to complete.  BUT it's a very cool command, and the result is amazing.
th train.lua -gpuid -1
To be continued...