Symfony and Mac

After getting this beautiful error:

Warning: PDO::__construct(): [2002] No such file or directory (trying to connect via unix:///var/mysql/mysql.sock) in /some/path/here/symfony/lib/plugins/sfDoctrinePlugin/lib/vendor/doctrine/Doctrine/Connection.php on line 470

PDO Connection Error: SQLSTATE[HY000] [2002] No such file or directory

Some solutions say that first you need to find php.ini (which is like looking for the Holy Grail or something similar!). So, the best way is to «join your enemy» by creating a symlink to the folder where mysql.sock is actually created:

cd /var; 
sudo ln -s /Applications/MAMP/tmp/mysql mysql
(Taken from Stack Overflow)

In particular, I used MAMP.
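
A quick sanity check, assuming MAMP's default paths (adjust them if your installation differs), is to confirm that the symlinked socket resolves and that MySQL answers through it; the client path below is the usual MAMP location, not something guaranteed:

# Confirm the symlink points at a live socket (MAMP default location assumed)
ls -l /var/mysql/mysql.sock

# Connect through that socket with MAMP's bundled client to verify MySQL responds
/Applications/MAMP/Library/bin/mysql --socket=/var/mysql/mysql.sock -u root -p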



Clear the terminal, the ANSI way

After searching for a way to clear the terminal (or console, as you prefer), I found the post «Language Agnostic Clearing of Console Screen: Without clear() or cls()». It's a pretty good option: easy, and without complicated tricks.

In Java:

public class ansiclrscr {
    public static void main(String[] argv) {
        // Print something so there is content to clear
        System.out.print("foo\nbar\nbaz\nquux");
        // ESC[2J clears the screen, ESC[;H moves the cursor to row 1, column 1
        System.out.print("\033[2J\033[;H");
        System.out.print("I am at cursor position 0,0 on a clean screen.");
    }
}
Simple, isn't it?
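
The same escape sequence can be tried straight from the shell (assuming an ANSI-capable terminal), which is a quick way to check that your terminal actually honors it:

# ESC[2J clears the screen, ESC[;H moves the cursor to row 1, column 1
printf '\033[2J\033[;H'
echo "I am at cursor position 0,0 on a clean screen."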

ERROR 4010: Cannot find hadoop configurations in classpath

Setting up Pig for e2e tests isn't an everyday task; it requires patience, and sometimes even more patience.

Let me first explain the situation: Hadoop 1.0.1-SNAPSHOT, Apache Pig 0.10.0-SNAPSHOT (r1156275) and Pig 0.8.1 (the e2e tests need an older release to compare results against).

One common exception thrown while configuring the old Pig was this:

ERROR 4010: Cannot find hadoop configurations in classpath

Pig needs some way to find core-site.xml or hadoop-site.xml (for older versions). I solved this by adding an environment variable, PIG_CONF_DIR, pointing to the «hadoop configuration directory».
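
For reference, a minimal sketch of that fix in shell form; the conf path is a placeholder for wherever core-site.xml actually lives on your machine:

# Point Pig at the directory containing core-site.xml (or hadoop-site.xml on older Hadoop)
export PIG_CONF_DIR=/path/to/hadoop/conf
# Some setups put the conf directory on Pig's classpath instead; shown only as an alternative
# export PIG_CLASSPATH=/path/to/hadoop/conf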

And that's it. As usual: it should work!


ERROR 2999: Unexpected internal error. Failed to create DataStorage

After some days spent looking into how to solve this annoying exception:

ERROR 2999: Unexpected internal error. Failed to create DataStorage

I think this is the best solution (after all, it is in the Apache FAQ): Apache – FAQ – Q: What shall I do if I saw «Failed to create DataStorage»?

In a few words (see the shell sketch below):

  1. build with ant: «ant»
  2. copy your Hadoop jars (core and test) to /path/to/pig/build/ivy/lib/Pig
  3. I did something else as well: modify the Hadoop version (core and test) in libraries.properties. This file is at /path/to/pig/ivy/libraries.properties.
  4. build with ant again: «ant»
  5. copy your pig.jar and rename it pig-<version>-core.jar

and it should work!
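
A rough shell version of those steps, with placeholder paths and jar names that you will need to adapt to your own checkout and Hadoop version:

cd /path/to/pig
ant                                   # first build

# Replace the bundled Hadoop jars with the ones from your cluster (core and test)
cp /path/to/hadoop/hadoop-core-*.jar build/ivy/lib/Pig/
cp /path/to/hadoop/hadoop-test-*.jar build/ivy/lib/Pig/

# Optionally pin the same Hadoop version in ivy/libraries.properties, then rebuild
ant

# Rename the resulting jar so it is picked up as the core jar (replace <version> accordingly)
cp pig.jar pig-<version>-core.jar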

P.S.: This issue will be solved in Pig 0.9.1 and beyond.


Hadoop can’t replicate to nodes

«org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:aavendan cause:java.io.IOException: File /usr/local/hadoop/tmp/hadoop-aavendan/mapred/system/jobtracker.info could only be replicated to 0 nodes, instead of 1»

This exception was thrown twice during my tests, which looks weird (I'm working on a single local node, so there is no obvious reason for it!).

After looking around for a solution, I found this key reply:

Make sure you erase any existing input file/folder on the dfs first.

So I did that, and that was it! Also make sure you have formatted your namenode and that HDFS is not in safe mode.
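
In shell terms (Hadoop 1.x commands, with an example DFS path that you should replace with whatever you actually uploaded):

# Remove any stale input already sitting on the DFS
hadoop fs -rmr /user/aavendan/input

# If the filesystem is still in a bad state, reformat the namenode (this wipes HDFS!)
hadoop namenode -format

# Make sure HDFS is not stuck in safe mode
hadoop dfsadmin -safemode leave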
