Monday 30 December 2013

New technology to learn and earn - a future job creator and service provider - step into a new technology

Are you bored with your current IT job, or thinking about where to go from here? Worried about your future and your job, and wanting to learn something new? Then stay on this page and read on.
 
In the past we learnt Linux administration, VMware, Windows, SAN, networking, backup technologies and server hardware, and worked in a specific field. But sometimes your mind wanders, thinking about how to earn more, what technology would fetch a better package, a new designation and so on.
Today I would like to introduce you to a new technology which will really be around for a long time.
Seriously, when I learnt this technology and saw its future, I felt that yes, this is where I want to be.

The lines above are not meant to influence you; this is simply what I understand and feel about this technology. So what is it? What is its future? What job can I get? What package? What will be my next mission? Let's try to find answers to these questions.

The new technology is known as Big Data, and the major player in this field is Hadoop, now managed by the Apache Software Foundation. Of course, Big Data analytics can be implemented with other technologies, so why Apache Hadoop? You will find the answer to this as we go. :)


What Is Apache Hadoop?

The Apache™ Hadoop® project develops open-source software for reliable, scalable, distributed computing.
The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage. Rather than rely on hardware to deliver high-availability, the library itself is designed to detect and handle failures at the application layer, so delivering a highly-available service on top of a cluster of computers, each of which may be prone to failures.
The project includes these modules:
  • Hadoop Common: The common utilities that support the other Hadoop modules.
  • Hadoop Distributed File System (HDFS™): A distributed file system that provides high-throughput access to application data.
  • Hadoop YARN: A framework for job scheduling and cluster resource management.
  • Hadoop MapReduce: A YARN-based system for parallel processing of large data sets.
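To give you a feel for what that "simple programming model" looks like, below is the classic word-count job written against the Hadoop MapReduce Java API, very close to the example shipped with the Hadoop tutorial. Treat it as a sketch: the class names and the input/output HDFS paths (passed in as arguments) are only for illustration.

// Classic Hadoop MapReduce word count (a sketch, based on the standard tutorial example).
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emits (word, 1) for every word in each input line.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer: sums up the counts emitted for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

You would package this into a jar and submit it to the cluster with something like "hadoop jar wordcount.jar WordCount /user/you/input /user/you/output", where the two paths are HDFS directories of your choice. The framework takes care of splitting the input, scheduling the map and reduce tasks across the cluster, and re-running them if a node fails.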

Don't get tense, just read on...

Other Hadoop-related projects at Apache include:

  • Ambari™: A web-based tool for provisioning, managing, and monitoring Apache Hadoop clusters, which includes support for Hadoop HDFS, Hadoop MapReduce, Hive, HCatalog, HBase, ZooKeeper, Oozie, Pig and Sqoop. Ambari also provides a dashboard for viewing cluster health (such as heatmaps) and the ability to view MapReduce, Pig and Hive applications visually, along with features to diagnose their performance characteristics in a user-friendly manner.
  • Avro™: A data serialization system.
  • Cassandra™: A scalable multi-master database with no single points of failure.
  • Chukwa™: A data collection system for managing large distributed systems.
  • HBase™: A scalable, distributed database that supports structured data storage for large tables.
  • Hive™: A data warehouse infrastructure that provides data summarization and ad hoc querying.
  • Mahout™: A scalable machine learning and data mining library.
  • Pig™: A high-level data-flow language and execution framework for parallel computation.
  • ZooKeeper™: A high-performance coordination service for distributed applications.
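Just to make one of these projects a little more concrete, here is a small sketch of writing and reading a single cell with the HBase Java client API of that era. The table name "users" and column family "info" are only assumptions for illustration; you would create the table first in the HBase shell (create 'users', 'info').

// Minimal HBase client sketch: put one cell and read it back.
// Assumes a table 'users' with column family 'info' already exists.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseQuickTest {
  public static void main(String[] args) throws Exception {
    // Picks up hbase-site.xml from the classpath to locate the cluster.
    Configuration conf = HBaseConfiguration.create();
    HTable table = new HTable(conf, "users");

    // Write one cell: row key "row1", column info:name, value "hadoop".
    Put put = new Put(Bytes.toBytes("row1"));
    put.add(Bytes.toBytes("info"), Bytes.toBytes("name"), Bytes.toBytes("hadoop"));
    table.put(put);

    // Read the same cell back.
    Result result = table.get(new Get(Bytes.toBytes("row1")));
    byte[] value = result.getValue(Bytes.toBytes("info"), Bytes.toBytes("name"));
    System.out.println("info:name = " + Bytes.toString(value));

    table.close();
  }
}

Don't worry if this means nothing to you yet; the point is only that these projects sit on top of the Hadoop cluster you will be administering.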

Now, maybe you are thinking: oh god, yet another new technology, there is so much to learn; or maybe, what the HELL is this again? Or, do I really need this, blah blah blah. Some Linux admins may even think that maybe China has launched some new product. :(

But frankly speaking, you will get the hang of it; it's pretty simple. At least for me it was easier than learning programming in C or learning VMware.

In my opinion, initially you just need to know basic Linux administration. But I would recommend the skills below to feel more confident:
1) Linux administration, at least at L2 level
2) Knowledge of at least launching a VM and installing applications
3) Good to have hardware/server knowledge such as racks, switches, blade servers, rack-mounted servers etc. Don't worry, you can and will learn this as you go ahead in your career.
4) Monitoring software like Nagios, Ganglia, Cacti, IT360 or ManageEngine; preferably the first two.
5) Java is what you should know if you really want to be at the top of the job hunters' list.

You should already possess at least the first four of these skills; if not, it's not too late!
So many new terms, scary, right? Don't worry, just read on and you will find it actually very easy to understand.


So the different roles you can play, or aim to work in, are:

* Hadoop - Development (anyone with programming knowledge can take up this course)
* Hadoop - Administration (any Linux, Unix, system or network admin can take up this course)
* ETL - Hadoop (any data warehouse, database or testing professional can take up this course)
* Hadoop - BI Analytics (anyone with basic Hadoop knowledge can take up this course)



Stay tuned, this was only the introduction; more to come in my next post...