One metric for determining how moved in I am is whether my server is set up. I have two servers and two reel-to-reel machines, and my office set-up generally includes both. I basically took them down the week of June 9th, when we bought our house in Eugene. Now, 244 days later, I was able to turn my MacMini server back on. It has been a while. Hello MacMini. Now just to get you up and running with the latest software...
Dreamhost Troubles
[More recent graph, taken on 23. December 2013]
[Graph from 22. December 2013]
Network Language Documentation File Management
This post is an open draft! It might be updated at any time…
Meta-data is not just for Archives
Bringing the usefulness of meta-data to the language project workflow
It has recently come to my attention that there is a challenge in providing a network-accessible file management solution for a language documentation project. This realization comes with my first introduction to linguistic field experience and my first field setting for a language documentation project.

The project I was involved with was documenting four languages in the same language family, in Mexico. We had high-speed Internet and a Local Area Network, and the electricity was stable (more often than not). The heart of the language communities was a 2-3 hour drive from where we were staying, so we could make trips to different villages in the language community, and there were also language consultants coming to us from various villages. The consultants who came to us were computer literate and were capable of writing in their language.

The methods of the documentation project were motivated along the lines of: “we want to know ‘xyz’ so we can write a paper about ‘xyz’, so let’s elicit things about ‘xyz’”. In a sense, the project was product oriented rather than (anthropological) framework oriented. We had a recording booth. Our consultants could log into a Google Doc and fill out a paradigm; we could run the list of words given to us through the Google Doc to a word processor, create a list to be recorded, give that list to the recording technician, and then produce a recorded list. Our consultants could also create a story, and often did, and then we would help them revise and record it. We had geo-social data from the Mexican government census. We had geospatial data from our own GPS units.

During the course of the project, massive amounts of data were created in a wide variety of formats. Additionally, in this project language description was happening concurrently with language documentation, so additional data was desired and generated. That is, language documentation and language description feed each other in a symbiotic relationship: description helps us understand why the language is so important to document and which data to collect, and documenting it gives us the data for the analysis needed to describe the language.

The challenge has been: how do we organize the data in ways that are meaningful and useful for current work and for future work (archiving)? People are evidently doing it, all over the world… maybe I just need to know how they are doing it. In our project there were two opposing needs for the data:
- Data organization for archiving.
- Data organization for current use in analysis and in evaluating what else to document.

It could be argued that a well-planned corpus would eliminate, or at least reduce, the need for flexibility in deciding what else to document. This line of thought does have its merits. But flexibility is needed by those people who do not try to implement detailed plans.
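To make the tension concrete, here is one hypothetical sketch of what meta-data living alongside the working files might look like. Every file name, session code, and field below is invented purely for illustration, not a recommendation:

recordings/
    ses042_wordlist_village-A.wav
    ses042_wordlist_village-A.eaf
    ses042.meta.txt    (speaker, village, date, genre, equipment, rights)
texts/
    ses043_story_village-B.txt
    ses043.meta.txt

The idea is that the same meta-data an archive will eventually demand can also answer the working questions (“what have we recorded from village B?”, “which wordlists still need checking?”) while the project is still running.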
SSH and Terminal
I used an ssh connection from the Terminal today for the first time!
I feel like a real man now.

I needed to transfer a 106 MB folder from one subdomain to another subdomain on my DreamHost webserver. It has been my experience that whenever I copy or move folders with a lot of sub-folders, something does not always get copied, or does not get copied all the way. So I needed to archive my files and move them as a single object. But I do not think it is possible to zip files with an FTP client (at least not with Interarchy). For a solution I turned to SSH and a lot of googling.
So to SSH into my webhost I first had to enable shell access for a user from the DreamHost panel.
Then I had to open Terminal and create a key. I found some sensible directions in the DreamHost knowledge base.
To generate a public/private key pair so that you can log in securely, and without a password (if you want):
- In Terminal type:
ssh-keygen -d
Hit the “enter” key three times.
Replacing “username” and “yourdomain” with your FTP username and your domain,
- copy & paste/type the following into Terminal:
ssh username@ftp.yourdomain.com 'test -d .ssh || mkdir -m 0700 .ssh ; cat >> .ssh/authorized_keys && chmod 0600 .ssh/*' < ~/.ssh/id_dsa.pub
Press the return/enter key again.

Wait for it to ask for the password. Enter the password of the FTP user whose username you inserted in place of “username” in the example above. If it asks you for the password multiple times, type the same correct password each time. (What that long command does: it pipes your public key to the server, creates the remote .ssh directory with the right permissions if it does not exist, and appends the key to .ssh/authorized_keys.) Then you will be back at the prompt in your local Terminal window.
- type:
ssh username@ftp.yourdomain.com
You're logged in!
Now any time you want to log in using SSH you can just repeat
ssh username@ftp.yourdomain.com
from the command line (Terminal), no need to repeat the other steps.
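As a convenience, OpenSSH also reads a per-user configuration file at ~/.ssh/config. With an entry like the following (the alias “dreamhost” is just something I made up; use whatever name you like), the whole command shrinks to ssh dreamhost:

Host dreamhost
    HostName ftp.yourdomain.com
    User username

$ ssh dreamhost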
So from here on I was in my webhost but still didn't know how to get around. Evidently I needed to use full paths, so
$ cd /home/username/directory
would move me from directory to directory. I could not just
$ cd /directory
because a path starting with “/” is measured from the server's root, not from my home directory. (A relative path without the leading slash, like cd directory, would also have worked from inside /home/username.)
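To make that concrete, a quick sketch (the directory names here are made up):

$ pwd
/home/username
$ cd /home/username/logs
$ pwd
/home/username/logs
$ cd ../mysite
$ pwd
/home/username/mysite

The first cd uses an absolute path from the server's root; the second is relative, resolved from wherever you currently are.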
Once I was able to get to the directory I needed to archive, I still needed the archive commands.
I thought I wanted to use zip as my archive utility. The zip command to do that would be:
$ zip -r folder.zip folder
Though my friend Daniel said that I might have been better off using tar plus gzip (a .tar.gz) instead of the zip command: "Zip compresses each file separately and then archives. Tar+gzip or tar+bzip2 archives first and then compresses." (Compressing after archiving lets the compressor exploit redundancy across files, so the result is usually smaller.)
The commands for the tools Daniel suggested would look like the following:
tar+gzip
$ tar -cf blah.tar folder/
$ gzip -9 blah.tar
gzip-compressed tar, in one step. This combines the two commands above: the z flag tells tar to pipe the archive through gzip itself. (I didn't try it.)
$ tar czvf folder.tgz folder
bzip2
$ tar jcvf filename.tbz folder
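For completeness (I only needed to unzip a zip file, below, so I have not run these here), the matching extraction commands would be:

$ tar xzvf folder.tgz
$ tar xjvf filename.tbz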
After the file was compressed I used Interarchy to move the single zip file to its new location. I also needed to unzip the file. (I also read this.)
To unzip the file I navigated to the directory where the file was located and then used this command:
$ unzip folder.zip
I had to use the long path too. So it was really:
$ unzip /home/username/directory/folder.zip
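If you want the contents to land somewhere other than the current directory, unzip also takes a -d flag with a destination (the paths here are made up):

$ unzip /home/username/directory/folder.zip -d /home/username/otherplace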
What a sense of accomplishment!