Environment:
————

* Scala compiler version 2.10.2
* spark-1.2.0-bin-hadoop2.3
* Hadoop 2.3.0-cdh5.0.3

HDFS Input:
———–

[ramisetty@node1 stack]$ hadoop fs -ls /vijay/mywordcount/
Found 2 items
-rw-r--r--   2 ramisetty supergroup         86 2015-05-13 01:30 /vijay/mywordcount/file1.txt
-rw-r--r--   2 ramisetty supergroup         88 2015-05-13 01:30 /vijay/mywordcount/file2.txt

[ramisetty@node1 stack]$ hadoop fs -cat /vijay/mywordcount/file1.txt

vijay kumar vijay kumar
apple orange vijay kumar
test hello test test test
hello test

[ramisetty@node1 stack]$ hadoop fs -cat /vijay/mywordcount/file2.txt
vijay vijay
test file file test
hello hai test test vijay kumar
vijay vijay kuamr test

SimpleApp.scala
—————

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

/* hadoop */

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable
import org.apache.hadoop.mapred.lib.MultipleTextOutputFormat

/* java */
import java.io.Serializable;

import org.apache.log4j.Logger
import org.apache.log4j.Level

/* Custom TextOutput Format */
class RDDMultipleTextOutputFormat extends MultipleTextOutputFormat[Any, Any] {
  override def generateActualKey(key: Any, value: Any): Any =
    NullWritable.get()

  override def generateFileNameForKeyValue(key: Any, value: Any, name: String): String =
    key.asInstanceOf[String] + "-" + name   // for output hdfs://Output_dir/inputFilename-part-****
  //key.asInstanceOf[String] + "/" + name   // for output hdfs://Output_dir/inputFilename/part-**** [inputFilename as a directory of its part files]
}

/* Spark Context */
object Spark {
  val sc = new SparkContext(new SparkConf().setAppName("test").setMaster("local[*]"))
}

/* WordCount Processing */

object Process extends Serializable {
  def apply(filename: String): org.apache.spark.rdd.RDD[(String, String)] = {
    println("i am called.....")
    val simple_path = filename.split('/').last
    val lines = Spark.sc.textFile(filename)
    val counts = lines.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey(_ + _) // (word, count)
    val fname_word_counts = counts.map(x => (simple_path, x._1 + "\t" + x._2))                    // (filename, word\tcount)
    fname_word_counts
  }
}

object SimpleApp {

  def main(args: Array[String]) {

    //Logger.getLogger("org").setLevel(Level.OFF)
    //Logger.getLogger("akka").setLevel(Level.OFF)

    // input and output paths
    val INPUT_PATH = "hdfs://master:8020/vijay/mywordcount/"
    val OUTPUT_PATH = "hdfs://master:8020/vijay/mywordcount/output/"

    // context
    val context = Spark.sc
    val data = context.wholeTextFiles(INPUT_PATH)

    // final output RDD
    var output: org.apache.spark.rdd.RDD[(String, String)] = context.emptyRDD

    // files to process
    val files = data.map { case (filename, content) => filename }

    // Apply wordcount processing on each file received from wholeTextFiles.
    files.collect.foreach(filename => {
      output = output.union(Process(filename))
    })

    //output.saveAsTextFile(OUTPUT_PATH)   // this would save output as (filename,word\tcount)
    output.saveAsHadoopFile(OUTPUT_PATH, classOf[String], classOf[String], classOf[RDDMultipleTextOutputFormat]) // custom output format

    // close context
    context.stop()

  }
}
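
Note: collecting the file names on the driver and union-ing one RDD per file works, but it launches a separate textFile() read for every input file. A minimal alternative sketch (untested here; it reuses the RDDMultipleTextOutputFormat class defined above, and the object name is just illustrative) builds the same (filename, word\tcount) pairs directly from wholeTextFiles in a single pass:

/* PerFileWordCount.scala -- illustrative sketch, not part of the original run */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._   // pair-RDD functions (reduceByKey, saveAsHadoopFile) in Spark 1.2
import org.apache.spark.SparkConf

object PerFileWordCount {
  def main(args: Array[String]) {
    val sc = new SparkContext(new SparkConf().setAppName("per-file-wordcount").setMaster("local[*]"))

    val INPUT_PATH  = "hdfs://master:8020/vijay/mywordcount/"
    val OUTPUT_PATH = "hdfs://master:8020/vijay/mywordcount/output/"

    // wholeTextFiles gives (fullPath, fileContent); key every word by its source file name
    val counts = sc.wholeTextFiles(INPUT_PATH)
      .flatMap { case (path, content) =>
        val fname = path.split('/').last
        content.split("\\s+").filter(_.nonEmpty).map(word => ((fname, word), 1))
      }
      .reduceByKey(_ + _)                                            // ((filename, word), count)
      .map { case ((fname, word), n) => (fname, word + "\t" + n) }   // (filename, "word\tcount")

    // same custom output format as above: one part file (or directory) per input file name
    counts.saveAsHadoopFile(OUTPUT_PATH, classOf[String], classOf[String], classOf[RDDMultipleTextOutputFormat])

    sc.stop()
  }
}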

Compile & create jar :
———————-
/home/ramisetty/scala-2.10.2/bin/scalac -cp /usr/lib/spark-1.2.0-bin-hadoop2.3/lib/spark-assembly-1.2.0-hadoop2.3.0.jar SimpleApp.scala
jar -cvf SimpleApp.jar *.class
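
If you prefer sbt to calling scalac against the Spark assembly jar, a minimal build.sbt along these lines should also work (a sketch, assuming sbt 0.13 and the versions from the environment above; not part of the original steps):

// build.sbt (hypothetical alternative to the manual scalac + jar steps)
name := "SimpleApp"

version := "1.0"

scalaVersion := "2.10.2"

// "provided" because spark-submit supplies the Spark classes at run time
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" % "provided"

Running sbt package then produces a jar under target/scala-2.10/ that can be submitted the same way as below.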

Submit Jar to Spark:
——————–

/usr/lib/spark-1.2.0-bin-hadoop2.3/bin/spark-submit --class SimpleApp SimpleApp.jar

HDFS Output:
————

[ramisetty@node-1 stack]$ hadoop fs -ls /vijay/mywordcount/output
Found 5 items
-rw-r--r--   3 ramisetty supergroup          0 2015-06-09 03:49 /vijay/mywordcount/output/_SUCCESS
-rw-r--r--   3 ramisetty supergroup         40 2015-06-09 03:49 /vijay/mywordcount/output/file1.txt-part-00000
-rw-r--r--   3 ramisetty supergroup          8 2015-06-09 03:49 /vijay/mywordcount/output/file1.txt-part-00001
-rw-r--r--   3 ramisetty supergroup         44 2015-06-09 03:49 /vijay/mywordcount/output/file2.txt-part-00002
-rw-r--r--   3 ramisetty supergroup          8 2015-06-09 03:49 /vijay/mywordcount/output/file2.txt-part-00003

verify results:
—————

[ramisetty@node-1 stack]$ hadoop fs -cat /vijay/mywordcount/output/file1.txt-part-*

orange  1
kumar   3
hello   2
apple   1
test    5
vijay   3

[ramisetty@node-1 stack]$ hadoop fs -cat /vijay/mywordcount/output/file2.txt-part-*

kumar   1
hello   1
hai     1
file    2
kuamr   1
test    5
vijay   5

Neo4j
—–
Graph database.

download :
———-
URL     : http://neo4j.com/download/
package  : neo4j-community-2.3.0-M01-unix.tar.gz

Unzip:
——-

tar -xf neo4j-community-2.3.0-M01-unix.tar.gz

Change config values :
———————-

Change the server port values as per your needs:

cat conf/neo4j-server.properties | grep port

#org.neo4j.server.webserver.port=7474   # ---- default port
org.neo4j.server.webserver.port=7475    # ---- changed by me
# Turn https-support on/off
# https port (for all data, administrative, and UI access)
#org.neo4j.server.webserver.https.port=7473 # ----- default port
org.neo4j.server.webserver.https.port=7476  # ----- changed by me

Start Server
————
NEO4J_HOME/bin/neo4j start

[ramisetty@cluster-01 neo4j-community-2.3.0-M01]$ bin/neo4j start
Starting Neo4j Server…WARNING: not changing user
process [20390]… waiting for server to be ready…….. OK.
http://localhost:7475/ is ready.

[ramisetty@cluster-01 neo4j-community-2.3.0-M01]$ tail -f  conf/neo4j-server.properties

#*****************************************************************
# Administration client configuration
#*****************************************************************

# location of the servers round-robin database directory. Possible values:
# - absolute path like /var/rrd
# - path relative to the server working directory like data/rrd
# - commented out, will default to the database data directory.
org.neo4j.server.webadmin.rrdb.location=data/rrd

Start Client Shell:
——————-

NEO4J_HOME/bin/neo4j-shell

[ramisetty@cluster-01 neo4j-community-2.3.0-M01]$ bin/neo4j-shell
Welcome to the Neo4j Shell! Enter 'help' for a list of commands
NOTE: Remote Neo4j graph database service 'shell' at port 1337

neo4j-sh (?)$ help
Available commands: alias begin cd commit create cypher dbinfo drop dump env explain export gsh help index jsh load ls man match merge mknode mkrel mv optional paths planner profile pwd return rm rmnode rmrel rollback runtime schema set start trav unwind using with
Use man for info about each command.
neo4j-sh (?)$

Create some data :
——————

# create Movie Nodes

CREATE (matrix:Movie { title:"The Matrix",released:1997 })
CREATE (cloudAtlas:Movie { title:"Cloud Atlas",released:2012 })
CREATE (forrestGump:Movie { title:"Forrest Gump",released:1994 })

# create Person nodes

CREATE (keanu:Person { name:"Keanu Reeves", born:1964 })
CREATE (robert:Person { name:"Robert Zemeckis", born:1951 })
CREATE (tom:Person { name:"Tom Hanks", born:1956 })

# create RELATIONs

CREATE (tom)-[:ACTED_IN { roles: ["Forrest"]}]->(forrestGump)
CREATE (tom)-[:ACTED_IN { roles: ['Zachry']}]->(cloudAtlas)
CREATE (robert)-[:DIRECTED]->(forrestGump)

In Shell
——–

neo4j-sh (?)$ CREATE (matrix:Movie { title:"The Matrix",released:1997 })
> CREATE (cloudAtlas:Movie { title:"Cloud Atlas",released:2012 })
> CREATE (forrestGump:Movie { title:"Forrest Gump",released:1994 })
> ;
+-------------------+
| No data returned. |
+-------------------+
Nodes created: 3
Properties set: 6
Labels added: 3
1214 ms
neo4j-sh (?)$ CREATE (keanu:Person { name:"Keanu Reeves", born:1964 })
> CREATE (robert:Person { name:"Robert Zemeckis", born:1951 })
> CREATE (tom:Person { name:"Tom Hanks", born:1956 })
> ;
+-------------------+
| No data returned. |
+-------------------+
Nodes created: 3
Properties set: 6
Labels added: 3
76 ms
neo4j-sh (?)$ CREATE (tom)-[:ACTED_IN { roles: ["Forrest"]}]->(forrestGump)
> CREATE (tom)-[:ACTED_IN { roles: ['Zachry']}]->(cloudAtlas)
> CREATE (robert)-[:DIRECTED]->(forrestGump)
> ;
+-------------------+
| No data returned. |
+-------------------+
Nodes created: 4
Relationships created: 3
Properties set: 2
117 ms
neo4j-sh (?)$

Note: the last statement reports "Nodes created: 4" because identifiers such as tom, robert, forrestGump and cloudAtlas from the earlier statements are not visible in a new statement, so CREATE makes fresh empty nodes and attaches the relationships to those instead of to the Person/Movie nodes created above (they show up later as the empty Node[6]-Node[9]). To link the existing nodes, MATCH them by name/title first and then CREATE the relationship.

Complex query:
--------------
neo4j-sh (?)$ MATCH (p:Person)
> RETURN p, p.name AS name, upper(p.name), coalesce(p.nickname,"n/a") AS nickname, { name: p.name,
>   label:head(labels(p))} AS person;
+-----------------------------------------------------------------------------------------------------------------------------------------------+
| p                                         | name              | upper(p.name)     | nickname | person                                         |
+-----------------------------------------------------------------------------------------------------------------------------------------------+
| Node[3]{name:"Keanu Reeves",born:1964}    | "Keanu Reeves"    | "KEANU REEVES"    | "n/a"    | {name -> "Keanu Reeves", label -> "Person"}    |
| Node[4]{name:"Robert Zemeckis",born:1951} | "Robert Zemeckis" | "ROBERT ZEMECKIS" | "n/a"    | {name -> "Robert Zemeckis", label -> "Person"} |
| Node[5]{name:"Tom Hanks",born:1956}       | "Tom Hanks"       | "TOM HANKS"       | "n/a"    | {name -> "Tom Hanks", label -> "Person"}       |
+-----------------------------------------------------------------------------------------------------------------------------------------------+
3 rows
213 ms
neo4j-sh (?)$

Loading graph db from csv:
————————————–

[ramisetty@cluster-01 neo4j-community-2.3.0-M01]$ mkdir examples
[ramisetty@cluster-01 neo4j-community-2.3.0-M01]$ cd examples/
[ramisetty@cluster-01 examples]$ pwd
/home/ramisetty/vijay/neo4j/neo4j-community-2.3.0-M01/examples
[ramisetty@cluster-01 examples]$ vim movies.csv
[ramisetty@cluster-01 examples]$ vim persons.csv
[ramisetty@cluster-01 examples]$ vim roles.csv

[ramisetty@cluster-01 examples]$ cat movies.csv
id,title,country,year
1,Wall Street,USA,1987
2,The American President,USA,1995
3,The Shawshank Redemption,USA,1994

[ramisetty@cluster-01 examples]$ cat persons.csv
id,name
1,Charlie Sheen
2,Oliver Stone
3,Michael Douglas
4,Martin Sheen
5,Morgan Freeman

[ramisetty@cluster-01 examples]$ cat roles.csv
personId,movieId,role
1,1,Bud Fox
4,1,Carl Fox
3,1,Gordon Gekko
4,2,A.J. MacInerney
3,2,President Andrew Shepherd
5,3,Ellis Boyd 'Red' Redding
[ramisetty@cluster-01 examples]$

==== Note: for the graph, refer to http://neo4j.com/docs/2.2.2/cypherdoc-loading-data.html

Load the csv files:
————————–

LOAD CSV WITH HEADERS FROM "file:///home/ramisetty/vijay/neo4j/neo4j-community-2.3.0-M01/examples/movies.csv" AS line
CREATE (m:Movie { id:line.id,title:line.title, released:toInt(line.year)});

LOAD CSV WITH HEADERS FROM "file:///home/ramisetty/vijay/neo4j/neo4j-community-2.3.0-M01/examples/persons.csv" AS line
MERGE (a:Person { id:line.id })
ON CREATE SET a.name=line.name;

LOAD CSV WITH HEADERS FROM "file:///home/ramisetty/vijay/neo4j/neo4j-community-2.3.0-M01/examples/roles.csv" AS line
MATCH (m:Movie { id:line.movieId })
MATCH (a:Person { id:line.personId })
CREATE (a)-[:ACTED_IN { roles: [line.role]}]->(m);

neo4j-sh (?)$ LOAD CSV WITH HEADERS FROM "file:///home/ramisetty/vijay/neo4j/neo4j-community-2.3.0-M01/examples/movies.csv" AS line
> CREATE (m:Movie { id:line.id,title:line.title, released:toInt(line.year)});
+-------------------+
| No data returned. |
+-------------------+
Nodes created: 3
Properties set: 9
Labels added: 3
158 ms
neo4j-sh (?)$

neo4j-sh (?)$ LOAD CSV WITH HEADERS FROM "file:///home/ramisetty/vijay/neo4j/neo4j-community-2.3.0-M01/examples/persons.csv" AS line
> MERGE (a:Person { id:line.id })
> ON CREATE SET a.name=line.name;
+-------------------+
| No data returned. |
+-------------------+
Nodes created: 5
Properties set: 10
Labels added: 5
210 ms

neo4j-sh (?)$ LOAD CSV WITH HEADERS FROM "file:///home/ramisetty/vijay/neo4j/neo4j-community-2.3.0-M01/examples/roles.csv" AS line
> MATCH (m:Movie { id:line.movieId })
> MATCH (a:Person { id:line.personId })
> CREATE (a)-[:ACTED_IN { roles: [line.role]}]->(m);
+-------------------+
| No data returned. |
+-------------------+
Relationships created: 6
Properties set: 6
216 ms

Example Query:
--------------

Refer: http://neo4j.com/docs/2.2.2/cypherdoc-movie-database.html

Let's list all persons and the movies they acted in.

neo4j-sh (?)$ MATCH (person:Person)-[:ACTED_IN]->(movie:Movie)
> RETURN person.name, movie.title;
+------------------------------------------------+
| person.name       | movie.title                |
+------------------------------------------------+
| "Michael Douglas" | "Wall Street"              |
| "Martin Sheen"    | "Wall Street"              |
| "Charlie Sheen"   | "Wall Street"              |
| "Michael Douglas" | "The American President"   |
| "Martin Sheen"    | "The American President"   |
| "Morgan Freeman"  | "The Shawshank Redemption" |
+------------------------------------------------+
6 rows
65 ms
neo4j-sh (?)$

neo4j-sh (?)$ MATCH (movie:Movie)
> RETURN movie.title;
+----------------------------+
| movie.title                |
+----------------------------+
| "The Matrix"               |
| "Cloud Atlas"              |
| "Forrest Gump"             |
| "Wall Street"              |
| "The American President"   |
| "The Shawshank Redemption" |
+----------------------------+
6 rows
194 ms

neo4j-sh (?)$ MATCH (n)-[r]->(m)
> RETURN n AS FROM , r AS `->`, m AS to;
+---------------------------------------------------------------------------------------------------------------------------------------------------------------+
| FROM                                    | ->                                                | to                                                              |
+---------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Node[6]{}                               | :ACTED_IN[1]{roles:["Zachry"]}                    | Node[8]{}                                                       |
| Node[6]{}                               | :ACTED_IN[0]{roles:["Forrest"]}                   | Node[7]{}                                                       |
| Node[9]{}                               | :DIRECTED[2]{}                                    | Node[7]{}                                                       |
| Node[13]{name:"Charlie Sheen",id:"1"}   | :ACTED_IN[3]{roles:["Bud Fox"]}                   | Node[10]{title:"Wall Street",released:1987,id:"1"}              |
| Node[15]{name:"Michael Douglas",id:"3"} | :ACTED_IN[7]{roles:["President Andrew Shepherd"]} | Node[11]{id:"2",title:"The American President",released:1995}   |
| Node[15]{name:"Michael Douglas",id:"3"} | :ACTED_IN[5]{roles:["Gordon Gekko"]}              | Node[10]{title:"Wall Street",released:1987,id:"1"}              |
| Node[16]{name:"Martin Sheen",id:"4"}    | :ACTED_IN[6]{roles:["A.J. MacInerney"]}           | Node[11]{id:"2",title:"The American President",released:1995}   |
| Node[16]{name:"Martin Sheen",id:"4"}    | :ACTED_IN[4]{roles:["Carl Fox"]}                  | Node[10]{title:"Wall Street",released:1987,id:"1"}              |
| Node[17]{name:"Morgan Freeman",id:"5"}  | :ACTED_IN[8]{roles:["Ellis Boyd 'Red' Redding"]}  | Node[12]{id:"3",title:"The Shawshank Redemption",released:1994} |
+---------------------------------------------------------------------------------------------------------------------------------------------------------------+
9 rows
78 ms
neo4j-sh (?)$

Ambari Setup – Ubuntu


Note: Ambari Hadoop installation officially supports only Ubuntu 12.04 LTS. The steps below are the same as for 12.04 LTS (Ubuntu 14.04 is not recommended, though it is used here).

PC & OS Information:
———————

vijay@vijay:~$ uname -a
Linux vijay.cluster.node1 3.16.0-33-generic #44~14.04.1-Ubuntu SMP Fri Mar 13 10:33:29 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux

vijay@vijay:~$ lsb_release -a
No LSB modules are available.
Distributor ID:    Ubuntu
Description:    Ubuntu 14.04.2 LTS
Release:    14.04
Codename:    trusty
vijay@vijay:~$

RAM      : 4 GB (3.3 GiB available)
Processor : AMD A8-6410 APU with AMD Radeon R5 Graphics × 4
OS type      : 64 bit
Disk      : 109.6 GB

Installation docs:
——————–

http://hortonworks.com/hdp/downloads/

Mode of installation chosen: Ambari
------------------------------------
- Installation using Ambari
- URL: http://docs.hortonworks.com/HDPDocuments/Ambari-2.0.0.0/Ambari_Doc_Suite/Ambari_Install_v20.pdf

Login to root:
————–
vijay@hp-15-notebook-pc:~$ sudo -s
[sudo] password for vijay:

Edit /etc/hosts with FQDN:
————————-
vijay@hp-15-notebook-pc:~$ cat /etc/hosts
127.0.0.1    localhost
127.0.1.1    hp-15-notebook-pc
192.168.1.102    vijay.cluster.node1       # FQDN
# The following lines are desirable for IPv6 capable hosts
::1     ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters

Set up password-less SSH:
————————–

vijay@vijay:~$ hostname -f
vijay.cluster.node1
vijay@vijay:~$ sudo -s
[sudo] password for vijay:
root@vijay:~# ssh-keygen
Generating public/private rsa key pair.
Enter file in which to save the key (/root/.ssh/id_rsa):
/root/.ssh/id_rsa already exists.
Overwrite (y/n)? n
root@vijay:~# cp .ssh/id_rsa .ssh/id_rsa.pub
cp: cannot stat ‘.ssh/id_rsa’: No such file or directory
root@vijay:~# cp /root/.ssh/id_rsa /root/.ssh/id_rsa.pub
root@vijay:~# cat /root/.ssh/id_rsa.pub >> /root/.ssh/authorized_keys
root@vijay:~# chmod 700 ^C
root@vijay:~# chmod 700 /root/.ssh/
root@vijay:~# chmod 600 /root/.ssh/authorized_keys
root@vijay:~#

add ambari.list:
—————-

wget -nv http://public-repo-1.hortonworks.com/ambari/ubuntu12/2.x/updates/2.0.0/ambari.list -O /etc/apt/sources.list.d/ambari.list

root@hp-15-notebook-pc:~# wget -nv http://public-repo-1.hortonworks.com/ambari/ubuntu12/2.x/updates/2.0.0/ambari.list -O /etc/apt/sources.list.d/ambari.list
2015-05-17 18:26:17 URL:http://public-repo-1.hortonworks.com/ambari/ubuntu12/2.x/updates/2.0.0/ambari.list [87/87] -> “/etc/apt/sources.list.d/ambari.list” [1]
root@hp-15-notebook-pc:~# ls /etc/apt/sources.list.d/ambari.list
/etc/apt/sources.list.d/ambari.list

check:
——
root@hp-15-notebook-pc:~# cat /etc/apt/sources.list.d/ambari.list
deb http://public-repo-1.hortonworks.com/ambari/ubuntu12/2.x/updates/2.0.0 Ambari main
root@hp-15-notebook-pc:~#

import keys:
————-

apt-key adv --recv-keys --keyserver keyserver.ubuntu.com B9733A7A07513CAD

root@hp-15-notebook-pc:~# apt-key adv --recv-keys --keyserver keyserver.ubuntu.com B9733A7A07513CAD
Executing: gpg --ignore-time-conflict --no-options --no-default-keyring --homedir /tmp/tmp.ZNL1crp0fx --no-auto-check-trustdb --trust-model always --keyring /etc/apt/trusted.gpg --primary-keyring /etc/apt/trusted.gpg --keyring /etc/apt/trusted.gpg.d/ubuntu-wine-ppa.gpg --recv-keys --keyserver keyserver.ubuntu.com B9733A7A07513CAD
gpg: requesting key 07513CAD from hkp server keyserver.ubuntu.com
gpg: key 07513CAD: public key "Jenkins (HDP Builds) <jenkin@hortonworks.com>" imported
gpg: Total number processed: 1
gpg:               imported: 1  (RSA: 1)
root@hp-15-notebook-pc:~#

do apt-get update:
——————
root@hp-15-notebook-pc:~# apt-get update
Get:1 http://public-repo-1.hortonworks.com Ambari InRelease [3,205 B]
Ign http://extras.ubuntu.com trusty InRelease
Get:2 http://public-repo-1.hortonworks.com Ambari/main amd64 Packages [903 B]
Ign http://dl.google.com stable InRelease
Get:3 http://public-repo-1.hortonworks.com Ambari/main i386 Packages [374 B]
Get:4 http://extras.ubuntu.com trusty Release.gpg [71 B]
Ign http://ppa.launchpad.net trusty InRelease
Get:5 http://dl.google.com stable Release.gpg [198 B]
Ign http://in.archive.ubuntu.com trusty InRelease
Hit http://extras.ubuntu.com trusty Release
Hit http://ppa.launchpad.net trusty Release.gpg
Ign http://in.archive.ubuntu.com trusty-updates InRelease
Ign http://security.ubuntu.com trusty-security InRelease
Get:6 http://dl.google.com stable Release [1,347 B]
Ign http://in.archive.ubuntu.com trusty-backports InRelease
Hit http://extras.ubuntu.com trusty/main Sources
Get:7 http://security.ubuntu.com trusty-security Release.gpg [933 B]
Hit http://ppa.launchpad.net trusty Release
Hit http://in.archive.ubuntu.com trusty Release.gpg
Get:8 http://dl.google.com stable/main amd64 Packages [1,180 B]
Hit http://extras.ubuntu.com trusty/main amd64 Packages
Get:9 http://security.ubuntu.com trusty-security Release [63.5 kB]
Get:10 http://in.archive.ubuntu.com trusty-updates Release.gpg [933 B]
Hit http://ppa.launchpad.net trusty/main amd64 Packages
Hit http://extras.ubuntu.com trusty/main i386 Packages
Get:11 http://dl.google.com stable/main i386 Packages [1,206 B]
Get:12 http://in.archive.ubuntu.com trusty-backports Release.gpg [933 B]
Hit http://ppa.launchpad.net trusty/main i386 Packages
Hit http://in.archive.ubuntu.com trusty Release
Get:13 http://in.archive.ubuntu.com trusty-updates Release [63.5 kB]
Hit http://ppa.launchpad.net trusty/main Translation-en
Get:14 http://security.ubuntu.com trusty-security/main Sources [80.6 kB]
Ign http://public-repo-1.hortonworks.com Ambari/main Translation-en_US
Get:15 http://in.archive.ubuntu.com trusty-backports Release [63.5 kB]
Ign http://public-repo-1.hortonworks.com Ambari/main Translation-en
Ign http://extras.ubuntu.com trusty/main Translation-en_US
Ign http://extras.ubuntu.com trusty/main Translation-en
Ign http://dl.google.com stable/main Translation-en_US
Ign http://dl.google.com stable/main Translation-en
Get:16 http://security.ubuntu.com trusty-security/restricted Sources [2,061 B]
Get:17 http://security.ubuntu.com trusty-security/universe Sources [24.9 kB]
Hit http://in.archive.ubuntu.com trusty/main Sources
Hit http://in.archive.ubuntu.com trusty/restricted Sources
Get:18 http://security.ubuntu.com trusty-security/multiverse Sources [2,335 B]
Hit http://in.archive.ubuntu.com trusty/universe Sources
Get:19 http://security.ubuntu.com trusty-security/main amd64 Packages [268 kB]
Hit http://in.archive.ubuntu.com trusty/multiverse Sources
Hit http://in.archive.ubuntu.com trusty/main amd64 Packages
Hit http://in.archive.ubuntu.com trusty/restricted amd64 Packages
Hit http://in.archive.ubuntu.com trusty/universe amd64 Packages
Hit http://in.archive.ubuntu.com trusty/multiverse amd64 Packages
Hit http://in.archive.ubuntu.com trusty/main i386 Packages
Hit http://in.archive.ubuntu.com trusty/restricted i386 Packages
Hit http://in.archive.ubuntu.com trusty/universe i386 Packages
Hit http://in.archive.ubuntu.com trusty/multiverse i386 Packages
Hit http://in.archive.ubuntu.com trusty/main Translation-en
Hit http://in.archive.ubuntu.com trusty/multiverse Translation-en
Hit http://in.archive.ubuntu.com trusty/restricted Translation-en
Hit http://in.archive.ubuntu.com trusty/universe Translation-en
Get:20 http://in.archive.ubuntu.com trusty-updates/main Sources [202 kB]
Get:21 http://security.ubuntu.com trusty-security/restricted amd64 Packages [8,875 B]
Get:22 http://security.ubuntu.com trusty-security/universe amd64 Packages [103 kB]
Get:23 http://in.archive.ubuntu.com trusty-updates/restricted Sources [2,564 B]
Get:24 http://in.archive.ubuntu.com trusty-updates/universe Sources [117 kB]
Get:25 http://security.ubuntu.com trusty-security/multiverse amd64 Packages [3,680 B]
Get:26 http://security.ubuntu.com trusty-security/main i386 Packages [257 kB]
Get:27 http://in.archive.ubuntu.com trusty-updates/multiverse Sources [5,161 B]
Get:28 http://in.archive.ubuntu.com trusty-updates/main amd64 Packages [517 kB]
Get:29 http://security.ubuntu.com trusty-security/restricted i386 Packages [8,846 B]
Get:30 http://security.ubuntu.com trusty-security/universe i386 Packages [104 kB]
Get:31 http://security.ubuntu.com trusty-security/multiverse i386 Packages [3,828 B]
Get:32 http://security.ubuntu.com trusty-security/main Translation-en [136 kB]
Get:33 http://in.archive.ubuntu.com trusty-updates/restricted amd64 Packages [9,238 B]
Get:34 http://in.archive.ubuntu.com trusty-updates/universe amd64 Packages [279 kB]
Get:35 http://security.ubuntu.com trusty-security/multiverse Translation-en [1,679 B]
Hit http://security.ubuntu.com trusty-security/restricted Translation-en
Get:36 http://security.ubuntu.com trusty-security/universe Translation-en [58.5 kB]
Get:37 http://in.archive.ubuntu.com trusty-updates/multiverse amd64 Packages [12.0 kB]
Get:38 http://in.archive.ubuntu.com trusty-updates/main i386 Packages [505 kB]
Get:39 http://in.archive.ubuntu.com trusty-updates/restricted i386 Packages [9,256 B]
Get:40 http://in.archive.ubuntu.com trusty-updates/universe i386 Packages [280 kB]
Get:41 http://in.archive.ubuntu.com trusty-updates/multiverse i386 Packages [12.1 kB]
Get:42 http://in.archive.ubuntu.com trusty-updates/main Translation-en [245 kB]
Get:43 http://in.archive.ubuntu.com trusty-updates/multiverse Translation-en [6,148 B]
Hit http://in.archive.ubuntu.com trusty-updates/restricted Translation-en
Get:44 http://in.archive.ubuntu.com trusty-updates/universe Translation-en [146 kB]
Get:45 http://in.archive.ubuntu.com trusty-backports/main Sources [5,851 B]
Get:46 http://in.archive.ubuntu.com trusty-backports/restricted Sources [28 B]
Get:47 http://in.archive.ubuntu.com trusty-backports/universe Sources [24.8 kB]
Get:48 http://in.archive.ubuntu.com trusty-backports/multiverse Sources [1,898 B]
Get:49 http://in.archive.ubuntu.com trusty-backports/main amd64 Packages [6,256 B]
Get:50 http://in.archive.ubuntu.com trusty-backports/restricted amd64 Packages [28 B]
Get:51 http://in.archive.ubuntu.com trusty-backports/universe amd64 Packages [28.5 kB]
Get:52 http://in.archive.ubuntu.com trusty-backports/multiverse amd64 Packages [1,245 B]
Get:53 http://in.archive.ubuntu.com trusty-backports/main i386 Packages [6,285 B]
Get:54 http://in.archive.ubuntu.com trusty-backports/restricted i386 Packages [28 B]
Get:55 http://in.archive.ubuntu.com trusty-backports/universe i386 Packages [28.5 kB]
Get:56 http://in.archive.ubuntu.com trusty-backports/multiverse i386 Packages [1,249 B]
Hit http://in.archive.ubuntu.com trusty-backports/main Translation-en
Hit http://in.archive.ubuntu.com trusty-backports/multiverse Translation-en
Hit http://in.archive.ubuntu.com trusty-backports/restricted Translation-en
Hit http://in.archive.ubuntu.com trusty-backports/universe Translation-en
Ign http://in.archive.ubuntu.com trusty/main Translation-en_US
Ign http://in.archive.ubuntu.com trusty/multiverse Translation-en_US
Ign http://in.archive.ubuntu.com trusty/restricted Translation-en_US
Ign http://in.archive.ubuntu.com trusty/universe Translation-en_US
Fetched 3,717 kB in 2min 20s (26.4 kB/s)
Reading package lists… Done
root@hp-15-notebook-pc:~#

check package listing:
———————-
root@hp-15-notebook-pc:~# apt-cache pkgnames | grep ambari
ambari-agent
ambari-server
ambari-metrics-assembly
root@hp-15-notebook-pc:~#

Install ambari server:
————————
apt-get install ambari-server

root@hp-15-notebook-pc:~# apt-get install ambari-server
Reading package lists… Done
Building dependency tree
Reading state information… Done
The following packages were automatically installed and are no longer required:
libepoxy0 libevdev2 libllvm3.5 signon-keyring-extension
Use ‘apt-get autoremove’ to remove them.
The following extra packages will be installed:
libpq5 postgresql postgresql-9.3 postgresql-client-9.3
postgresql-client-common postgresql-common
Suggested packages:
oidentd ident-server locales-all postgresql-doc-9.3
The following NEW packages will be installed:
ambari-server libpq5 postgresql postgresql-9.3 postgresql-client-9.3
postgresql-client-common postgresql-common
0 upgraded, 7 newly installed, 0 to remove and 163 not upgraded.
Need to get 93.8 MB of archives.
After this operation, 132 MB of additional disk space will be used.
Do you want to continue? [Y/n] Y
Get:1 http://in.archive.ubuntu.com/ubuntu/ trusty-updates/main postgresql-client-common all 154ubuntu1 [25.4 kB]
Get:2 http://public-repo-1.hortonworks.com/ambari/ubuntu12/2.x/updates/2.0.0/ Ambari/main ambari-server amd64 2.0.0-151 [90.1 MB]
Err http://in.archive.ubuntu.com/ubuntu/ trusty-updates/main libpq5 amd64 9.3.6-0ubuntu0.14.04
Could not resolve ‘mlife.mtsindia.in?isdn=918925639656&old_url=in.archive.ubuntu.com’
Get:3 http://in.archive.ubuntu.com/ubuntu/ trusty-updates/main postgresql-client-9.3 amd64 9.3.6-0ubuntu0.14.04 [782 kB]
Get:4 http://security.ubuntu.com/ubuntu/ trusty-security/main libpq5 amd64 9.3.6-0ubuntu0.14.04 [80.1 kB]
Get:5 http://in.archive.ubuntu.com/ubuntu/ trusty-updates/main postgresql-common all 154ubuntu1 [103 kB]
Get:6 http://in.archive.ubuntu.com/ubuntu/ trusty-updates/main postgresql-9.3 amd64 9.3.6-0ubuntu0.14.04 [2,683 kB]
Get:7 http://in.archive.ubuntu.com/ubuntu/ trusty-updates/main postgresql all 9.3+154ubuntu1 [5,038 B]
Fetched 93.8 MB in 23min 38s (66.1 kB/s)
Preconfiguring packages …
Selecting previously unselected package libpq5.
(Reading database … 234893 files and directories currently installed.)
Preparing to unpack …/libpq5_9.3.6-0ubuntu0.14.04_amd64.deb …
Unpacking libpq5 (9.3.6-0ubuntu0.14.04) …
Selecting previously unselected package postgresql-client-common.
Preparing to unpack …/postgresql-client-common_154ubuntu1_all.deb …
Unpacking postgresql-client-common (154ubuntu1) …
Selecting previously unselected package postgresql-client-9.3.
Preparing to unpack …/postgresql-client-9.3_9.3.6-0ubuntu0.14.04_amd64.deb …
Unpacking postgresql-client-9.3 (9.3.6-0ubuntu0.14.04) …
Selecting previously unselected package postgresql-common.
Preparing to unpack …/postgresql-common_154ubuntu1_all.deb …
Adding ‘diversion of /usr/bin/pg_config to /usr/bin/pg_config.libpq-dev by postgresql-common’
Unpacking postgresql-common (154ubuntu1) …
Selecting previously unselected package postgresql-9.3.
Preparing to unpack …/postgresql-9.3_9.3.6-0ubuntu0.14.04_amd64.deb …
Unpacking postgresql-9.3 (9.3.6-0ubuntu0.14.04) …
Selecting previously unselected package postgresql.
Preparing to unpack …/postgresql_9.3+154ubuntu1_all.deb …
Unpacking postgresql (9.3+154ubuntu1) …
Selecting previously unselected package ambari-server.
Preparing to unpack …/ambari-server_2.0.0-151_amd64.deb …
Unpacking ambari-server (2.0.0-151) …
Processing triggers for man-db (2.6.7.1-1ubuntu1) …
Processing triggers for ureadahead (0.100.0-16) …
ureadahead will be reprofiled on next reboot
Setting up libpq5 (9.3.6-0ubuntu0.14.04) …
Setting up postgresql-client-common (154ubuntu1) …
Setting up postgresql-client-9.3 (9.3.6-0ubuntu0.14.04) …
update-alternatives: using /usr/share/postgresql/9.3/man/man1/psql.1.gz to provide /usr/share/man/man1/psql.1.gz (psql.1.gz) in auto mode
Setting up postgresql-common (154ubuntu1) …
Adding user postgres to group ssl-cert

Creating config file /etc/logrotate.d/postgresql-common with new version
Building PostgreSQL dictionaries from installed myspell/hunspell packages…
en_au
en_ca
en_gb
en_us
en_za
Removing obsolete dictionary files:
* No PostgreSQL clusters exist; see “man pg_createcluster”
Processing triggers for ureadahead (0.100.0-16) …
Setting up postgresql-9.3 (9.3.6-0ubuntu0.14.04) …
Creating new cluster 9.3/main …
config /etc/postgresql/9.3/main
data   /var/lib/postgresql/9.3/main
locale en_US.UTF-8
port   5432
update-alternatives: using /usr/share/postgresql/9.3/man/man1/postmaster.1.gz to provide /usr/share/man/man1/postmaster.1.gz (postmaster.1.gz) in auto mode
* Starting PostgreSQL 9.3 database server                                                                                                 [ OK ]
Setting up postgresql (9.3+154ubuntu1) …
Setting up ambari-server (2.0.0-151) …
update-rc.d: warning: /etc/init.d/ambari-server missing LSB information
update-rc.d: see <http://wiki.debian.org/LSBInitScripts>
Adding system startup for /etc/init.d/ambari-server …
/etc/rc0.d/K20ambari-server -> ../init.d/ambari-server
/etc/rc1.d/K20ambari-server -> ../init.d/ambari-server
/etc/rc6.d/K20ambari-server -> ../init.d/ambari-server
/etc/rc2.d/S20ambari-server -> ../init.d/ambari-server
/etc/rc3.d/S20ambari-server -> ../init.d/ambari-server
/etc/rc4.d/S20ambari-server -> ../init.d/ambari-server
/etc/rc5.d/S20ambari-server -> ../init.d/ambari-server
Processing triggers for libc-bin (2.19-0ubuntu6.6) …
root@hp-15-notebook-pc:~#

Ambari Server Setup:
———————-

vijay@vijay:~$ sudo -s
[sudo] password for vijay:

root@vijay:~# ssh root@localhost
Welcome to Ubuntu 14.04.2 LTS (GNU/Linux 3.16.0-33-generic x86_64)

* Documentation:  https://help.ubuntu.com/

Last login: Sun May 17 19:06:45 2015 from vijay.cluster.node1

root@vijay:~# ambari-server status
Using python  /usr/bin/python2.7
Ambari-server status
Ambari Server not running. Stale PID File at: /var/run/ambari-server/ambari-server.pid

------------------------------------ See the ERROR ------------------------------------
root@vijay:~# ambari-server start
Using python  /usr/bin/python2.7
Starting ambari-server
ERROR: Exiting with exit code -1.
REASON: DB Name property not set in config file.
- If this is a new setup, then run the "ambari-server setup" command to create the user
- If this is an upgrade of an existing setup, run the "ambari-server upgrade" command.
Refer to the Ambari documentation for more information on setup and upgrade.

Ambari Server Setup Command: (This will download Oracle JDK ~ 135.8 MB)
—————————-

root@vijay:~# ambari-server setup
Using python  /usr/bin/python2.7
Setup ambari-server
Checking SELinux…
WARNING: Could not run /usr/sbin/sestatus: OK
Customize user account for ambari-server daemon [y/n] (n)? n
Adjusting ambari-server permissions and ownership…
Checking iptables…
Checking JDK…
[1] Oracle JDK 1.7
[2] Oracle JDK 1.6
[3] – Custom JDK
==============================================================================
Enter choice (1): 1
To download the Oracle JDK and the Java Cryptography Extension (JCE) Policy Files you must accept the license terms found at http://www.oracle.com/technetwork/java/javase/terms/license/index.html and not accepting will cancel the Ambari Server setup and you must install the JDK and JCE files manually.
Do you accept the Oracle Binary Code License Agreement [y/n] (y)? y
Downloading JDK from http://public-repo-1.hortonworks.com/ARTIFACTS/jdk-7u67-linux-x64.tar.gz to /var/lib/ambari-server/resources/jdk-7u67-linux-x64.tar.gz
jdk-7u67-linux-x64.tar.gz… 100% (135.8 MB of 135.8 MB)
Successfully downloaded JDK distribution to /var/lib/ambari-server/resources/jdk-7u67-linux-x64.tar.gz
Installing JDK to /usr/jdk64/
Successfully installed JDK to /usr/jdk64/
Downloading JCE Policy archive from http://public-repo-1.hortonworks.com/ARTIFACTS/UnlimitedJCEPolicyJDK7.zip to /var/lib/ambari-server/resources/UnlimitedJCEPolicyJDK7.zip
UnlimitedJCEPolicyJDK7.zip… 100%
Successfully downloaded JCE Policy archive to /var/lib/ambari-server/resources/UnlimitedJCEPolicyJDK7.zip
Installing JCE policy…
Completing setup…
Configuring database…
Enter advanced database configuration [y/n] (n)? n
Configuring database…
Default properties detected. Using built-in database.
Configuring ambari database…
Checking PostgreSQL…
About to start PostgreSQL
Configuring local database…
Connecting to local database…done.
Configuring PostgreSQL…
Extracting system views…
.ambari-admin-2.0.0.151.jar
..
Adjusting ambari-server permissions and ownership…
Ambari Server ‘setup’ completed successfully.
root@vijay:~#

————————–Note ——————————

The default PostgreSQL database name is
ambari. The default user name and password are ambari/bigdata.
————————————————————-

Start Ambari Server:
——————–
root@vijay:~# ambari-server status
Using python  /usr/bin/python2.7
Ambari-server status
Ambari Server not running. Stale PID File at: /var/run/ambari-server/ambari-server.pid

root@vijay:~# ambari-server start
Using python  /usr/bin/python2.7
Starting ambari-server
Ambari Server running with administrator privileges.
About to start PostgreSQL
Organizing resource files at /var/lib/ambari-server/resources…
Server PID at: /var/run/ambari-server/ambari-server.pid
Server out at: /var/log/ambari-server/ambari-server.out
Server log at: /var/log/ambari-server/ambari-server.log
Waiting for server start………………..
Ambari Server ‘start’ completed successfully.

root@vijay:~# ambari-server status
Using python  /usr/bin/python2.7
Ambari-server status
Ambari Server running
Found Ambari Server PID: 10262 at: /var/run/ambari-server/ambari-server.pid
root@vijay:~#

------------------------------------ Ambari Started ------------------------------------

Access Ambari web UI:
———————-

root@vijay:~# hostname
vijay.cluster.node1
root@vijay:~#

From Browser:
————–
http://vijay.cluster.node1:8080
http://vijay.cluster.node1:8080/#/login

—————————————————Note—————————-
Log in to the Ambari Server using the default user name/password: admin/admin. You
can change these credentials later
———————————————————————————–

Ambari will guide you through the rest of the HDP setup from the UI.

Cassandra Setup – Ubuntu

PC & OS Information

vijay@hp-15-notebook-pc:~$ uname -a
Linux vijay.cluster.node1 3.16.0-33-generic #44~14.04.1-Ubuntu SMP Fri Mar 13 10:33:29 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux

vijay@hp-15-notebook-pc:~$ lsb_release -a
No LSB modules are available.
Distributor ID:    Ubuntu
Description:    Ubuntu 14.04.2 LTS
Release:    14.04
Codename:    trusty

RAM      : 4 GB (3.3 GiB available)
Processor : AMD A8-6410 APU with AMD Radeon R5 Graphics × 4
OS type      : 64 bit
Disk      : 109.6 GB

Download

http://apache.cs.utah.edu/cassandra/2.0.15/apache-cassandra-2.0.15-bin.tar.gz

Make Sure Java is available

vijay@hp-15-notebook-pc:~/developer/installations$ java -version
java version "1.7.0_45"
Java(TM) SE Runtime Environment (build 1.7.0_45-b18)
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)

Unzip

vijay@hp-15-notebook-pc:~/developer/installations$ tar -xf apache-cassandra-2.0.15-bin.tar.gz

vijay@hp-15-notebook-pc:~/developer/installations$ ls -l
total 17704
drwxrwxr-x 9 vijay vijay     4096 May 20 21:24 apache-cassandra-2.0.15
-rw-rw-r-- 1 vijay vijay 18111795 May 20 21:22 apache-cassandra-2.0.15-bin.tar.gz
drwxr-xr-x 8 vijay vijay     4096 Oct  8  2013 jdk1.7.0_45
drwxr-xr-x 5 vijay vijay     4096 Apr 19 20:27 sublime
vijay@hp-15-notebook-pc:~/developer/installations$ rm -r apache-cassandra-2.0.15-bin.tar.gz
vijay@hp-15-notebook-pc:~/developer/installations$ ls
apache-cassandra-2.0.15  jdk1.7.0_45  sublime
vijay@hp-15-notebook-pc:~/developer/installations$ cd apache-cassandra-2.0.15/
vijay@hp-15-notebook-pc:~/developer/installations/apache-cassandra-2.0.15$ ls
bin  CHANGES.txt  conf  interface  javadoc  lib  LICENSE.txt  NEWS.txt  NOTICE.txt  pylib  tools
vijay@hp-15-notebook-pc:~/developer/installations/apache-cassandra-2.0.15$ ls -l
total 320
drwxr-xr-x 2 vijay vijay   4096 May 20 21:24 bin
-rw-r--r-- 1 vijay vijay 220336 May 13 21:17 CHANGES.txt
drwxr-xr-x 3 vijay vijay   4096 May 20 21:24 conf
drwxr-xr-x 2 vijay vijay   4096 May 20 21:24 interface
drwxr-xr-x 4 vijay vijay   4096 May 20 21:24 javadoc
drwxr-xr-x 3 vijay vijay   4096 May 20 21:24 lib
-rw-r--r-- 1 vijay vijay  11609 May 13 21:17 LICENSE.txt
-rw-r--r-- 1 vijay vijay  60141 May 13 21:17 NEWS.txt
-rw-r--r-- 1 vijay vijay   2030 May 13 21:17 NOTICE.txt
drwxr-xr-x 3 vijay vijay   4096 May 20 21:24 pylib
drwxr-xr-x 4 vijay vijay   4096 May 13 21:17 tools
vijay@hp-15-notebook-pc:~/developer/installations/apache-cassandra-2.0.15$ pwd
/home/vijay/developer/installations/apache-cassandra-2.0.15
vijay@hp-15-notebook-pc:~/developer/installations/apache-cassandra-2.0.15$ cd bin/
vijay@hp-15-notebook-pc:~/developer/installations/apache-cassandra-2.0.15/bin$ ls
cassandra          cassandra.in.sh  debug-cql.bat     nodetool.bat      sstablekeys.bat    sstablescrub.bat  sstableupgrade.bat
cassandra.bat      cqlsh            json2sstable      sstable2json      sstableloader      sstablesplit      stop-server
cassandra-cli      cqlsh.bat        json2sstable.bat  sstable2json.bat  sstableloader.bat  sstablesplit.bat
cassandra-cli.bat  debug-cql        nodetool          sstablekeys       sstablescrub       sstableupgrade
vijay@hp-15-notebook-pc:~/developer/installations/apache-cassandra-2.0.15/bin$

Change Directory Permissions

Next, make sure that the folders Cassandra accesses, such as the log folder, exist and that Cassandra has the right to write to them:

sudo mkdir /var/lib/cassandra
sudo mkdir /var/log/cassandra
sudo chown -R $USER:$GROUP /var/lib/cassandra
sudo chown -R $USER:$GROUP /var/log/cassandra

Set Cassandra Path and Home

vijay@cluster-01:~$ sudo vim ~/.bashrc
vijay@cluster-01:~$ source ~/.bashrc
vijay@cluster-01:~$ tail ~/.bashrc

#--------- Cassandra

export CASSANDRA_HOME=/home/vijay/developer/installations/apache-cassandra-2.0.15
export PATH=$PATH:$CASSANDRA_HOME/bin
vijay@cluster-01:~$

Start Cassandra:

Restart the terminal session and run the command below.

vijay@cluster-01:~$ cassandra -p

STOP Cassandra:

vijay@cluster-01:~$ jps
4180 CassandraDaemon
4414 Jps
vijay@cluster-01:~$ sudo kill -9 4180   (or) pkill -f CassandraDaemon
[sudo] password for vijay:
vijay@cluster-01:~$ jps
4439 Jps

Start Cassandra and start CLI – cqlsh

vijay@cluster-01:~$ cqlsh
Connected to Test Cluster at localhost:9160.
[cqlsh 4.1.1 | Cassandra 2.0.15 | CQL spec 3.1.1 | Thrift protocol 19.39.0]
Use HELP for help.

Create a keyspace — a namespace of tables.
cqlsh> CREATE KEYSPACE mykeyspace
… WITH REPLICATION = { 'class' : 'SimpleStrategy', 'replication_factor' : 1 };

Help
cqlsh> help

Documented shell commands:
===========================
CAPTURE      COPY  DESCRIBE  EXPAND  SHOW    TRACING
CONSISTENCY  DESC  EXIT      HELP    SOURCE

CQL help topics:
================
ALTER                        CREATE_TABLE_OPTIONS  SELECT
ALTER_ADD                    CREATE_TABLE_TYPES    SELECT_COLUMNFAMILY
ALTER_ALTER                  CREATE_USER           SELECT_EXPR
ALTER_DROP                   DELETE                SELECT_LIMIT
ALTER_RENAME                 DELETE_COLUMNS        SELECT_TABLE
ALTER_USER                   DELETE_USING          SELECT_WHERE
ALTER_WITH                   DELETE_WHERE          TEXT_OUTPUT
APPLY                        DROP                  TIMESTAMP_INPUT
ASCII_OUTPUT                 DROP_COLUMNFAMILY     TIMESTAMP_OUTPUT
BEGIN                        DROP_INDEX            TRUNCATE
BLOB_INPUT                   DROP_KEYSPACE         TYPES
BOOLEAN_INPUT                DROP_TABLE            UPDATE
COMPOUND_PRIMARY_KEYS        DROP_USER             UPDATE_COUNTERS
CREATE                       GRANT                 UPDATE_SET
CREATE_COLUMNFAMILY          INSERT                UPDATE_USING
CREATE_COLUMNFAMILY_OPTIONS  LIST                  UPDATE_WHERE
CREATE_COLUMNFAMILY_TYPES    LIST_PERMISSIONS      USE
CREATE_INDEX                 LIST_USERS            UUID_INPUT
CREATE_KEYSPACE              PERMISSIONS
CREATE_TABLE                 REVOKE

cqlsh> USE mykeyspace;
cqlsh:mykeyspace>

Create table

cqlsh> USE mykeyspace;
cqlsh:mykeyspace> CREATE TABLE users (
…   user_id int PRIMARY KEY,
…   fname text,
…   lname text
… );

 

Insert the data

cqlsh:mykeyspace> INSERT INTO users (user_id,  fname, lname)
…   VALUES (1745, 'john', 'smith');
cqlsh:mykeyspace> INSERT INTO users (user_id,  fname, lname)
…   VALUES (1744, 'john', 'doe');
cqlsh:mykeyspace> INSERT INTO users (user_id,  fname, lname)
…   VALUES (1746, 'john', 'smith');

Check the entries

cqlsh:mykeyspace> SELECT * FROM users;

 user_id | fname | lname
---------+-------+-------
    1745 |  john | smith
    1744 |  john |   doe
    1746 |  john | smith

(3 rows)

cqlsh:mykeyspace>

Indexing

You can retrieve data about users whose last name is smith by creating an index, then querying the table as follows:

cqlsh:mykeyspace> CREATE INDEX ON users (lname);
cqlsh:mykeyspace> SELECT * FROM users WHERE lname = 'smith';

 user_id | fname | lname
---------+-------+-------
    1745 |  john | smith
    1746 |  john | smith
(2 rows)

cqlsh:mykeyspace>

Node tool

vijay@cluster-01:~$ nodetool -h localhost  status
Note: Ownership information does not include topology; for complete information, specify a keyspace
Datacenter: datacenter1
=======================
Status=Up/Down
|/ State=Normal/Leaving/Joining/Moving
--  Address    Load       Tokens  Owns   Host ID                               Rack
UN  127.0.0.1  117.04 KB  256     100.0%  971c1a4f-4e49-4400-801a-2e27d46af032  rack1
vijay@cluster-01:~$

For GUI Installation 

http://downloads.datastax.com/community/opscenter-5.0.tar.gz

free : https://github.com/tomekkup/helenos