| Course or Certification Name | Category | Location | Mode of Learning |
| --- | --- | --- | --- |
| Apache Storm Introduction | Big Data | | Online self-study |
| Spark, Scala and Storm Combo | Big Data | | Online self-study |
| Real-Time Analytics with Apache Storm | Analytics & Statistics Tools | | Online self-study |
| The Ultimate Hands-On Hadoop - Tame your Big Data! | Big Data | | Online self-study |
| Big Data Bootcamp | Big Data | Noida, Delhi, Gurgaon, Chandigarh, Bangalore, Hyderabad, Chennai, Ernakulam | Online Classroom |
| Big Data Hadoop Architect Masters Program | Hadoop Administration | Noida, Delhi, Gurgaon, Chandigarh, Bangalore, Hyderabad, Chennai, Ernakulam | Online Classroom |
| Senior Software Architect Masters - 200+ courses | Emerging Web Technologies | | Online self-study |
| Software Masters Program - 200+ courses | Emerging Web Technologies | | Online self-study |
| Software Master Subscription - 12 months | Emerging Web Technologies | | Online self-study |
| Software Master Subscription - 6 months | Emerging Web Technologies | | Online self-study |
| Analytics & Data Science Master Subscription | Data Science | | Online self-study |
| Big Data Hadoop Expert | Big Data | | Online self-study |
| Learn Hadoop and Big Data by Building Projects | Big Data | | Online self-study |
| Postgraduate Diploma in Data Science | Data Science | | Classroom |
| Apache Spark Fundamentals | Big Data | | Online self-study |
Apache Storm is a fast, scalable, open-source distributed system that drives real-time computations. Aspiring big data and data analysis professionals can learn this tool to advance their careers.

- This Apache Storm Introduction course has been designed to provide the fundamental knowledge and training learners need for an in-depth understanding of the concepts.
- Candidates will also be introduced to various integrations over the course.
- Developed by a group of experts in the field, the course offers candidates hands-on experience in deploying the Storm architecture.
- Along with quality content, the course provides a course-completion certificate.
This all-in-one course is designed to give a 360-degree overview of real-time processing of unbounded data streams using Apache Storm and of creating Spark applications in the Scala programming language. The major topics include core concepts of the Big Data world, batch analysis, types of analytics, the use of Apache Storm for real-time Big Data analytics, a comparison between Spark and Hadoop, and techniques to increase your application's performance, enabling high-speed processing.
In your final project, you will follow real-time trending topics by implementing a data pipeline that visualizes only tweets containing the top worldwide hashtags. You can extend the project by exploring the Twitter API, or any other data source, alongside Hackathon participants as they design their own ideas, receive feedback from Karthik, and open-source a final project that calculates real-time tweet sentiment and geolocation to drive a U.S. map.
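The two core steps of the hashtag pipeline described above can be sketched in miniature. The snippet below is a plain-Python illustration, not the course's actual Storm topology; the function names and sample tweets are invented for the example. It counts hashtags across a batch of tweets and then keeps only the tweets that mention a trending tag.

```python
from collections import Counter

def top_hashtags(tweets, n=2):
    """Count hashtags across the tweet stream and return the n most common."""
    counts = Counter(tag for t in tweets for tag in t.split() if tag.startswith("#"))
    return [tag for tag, _ in counts.most_common(n)]

def trending_only(tweets, n=2):
    """Keep only tweets that mention at least one top-n hashtag."""
    top = set(top_hashtags(tweets, n))
    return [t for t in tweets if any(tag in top for tag in t.split())]

tweets = [
    "#spark is fast",
    "learning #storm and #spark",
    "#hadoop at scale",
    "no tags here",
]
print(top_hashtags(tweets))   # ['#spark', '#storm']
print(trending_only(tweets))  # the first two tweets
```

In a real Storm deployment this filtering would run continuously over an unbounded stream rather than over a finished list, but the grouping-and-counting logic is the same.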
The world of Hadoop and "Big Data" can be intimidating: hundreds of different technologies with cryptic names form the Hadoop ecosystem. With this Hadoop tutorial, you'll not only understand what those systems are and how they fit together, but you'll also go hands-on and learn how to use them to solve real business problems! Learn and master the most popular big data technologies in this comprehensive course, taught by a former engineer and senior manager from Amazon and IMDb. We'll go way beyond Hadoop itself and dive into all sorts of distributed systems you may need to integrate with:

- Install and work with a real Hadoop installation right on your desktop with Hortonworks (now part of Cloudera) and the Ambari UI
- Manage big data on a cluster with HDFS and MapReduce
- Write programs to analyze data on Hadoop with Pig and Spark
- Store and query your data with Sqoop, Hive, MySQL, HBase, Cassandra, MongoDB, Drill, Phoenix, and Presto
- Design real-world systems using the Hadoop ecosystem
- Learn how your cluster is managed with YARN, Mesos, Zookeeper, Oozie, Zeppelin, and Hue
- Handle streaming data in real time with Kafka, Flume, Spark Streaming, Flink, and Storm

Understanding Hadoop is a highly valuable skill for anyone working at companies with large amounts of data. Almost every large company you might want to work at uses Hadoop in some way, including Amazon, eBay, Facebook, Google, LinkedIn, IBM, Spotify, Twitter, and Yahoo! And it's not just technology companies that need Hadoop; even the New York Times uses Hadoop for processing images. This course is comprehensive, covering over 25 different technologies in over 14 hours of video lectures. It's filled with hands-on activities and exercises, so you get real experience using Hadoop, not just theory. You'll find a range of activities in this course for people at every level.
If you're a project manager who just wants to learn the buzzwords, there are web UIs for many of the activities in the course that require no programming knowledge. If you're comfortable with command lines, we'll show you how to work with them too. And if you're a programmer, I'll challenge you to write real scripts on a Hadoop system using Scala, Pig Latin, and Python. You'll walk away from this course with a real, deep understanding of Hadoop and its associated distributed systems, and you'll be able to apply Hadoop to real-world problems. Plus, a valuable completion certificate is waiting for you at the end! Please note that the focus of this course is on application development, not Hadoop administration, although you will pick up some administration skills along the way. Knowing how to wrangle "big data" is an incredibly valuable skill for today's top tech employers. Don't be left behind - enroll now!

"The Ultimate Hands-On Hadoop... was a crucial discovery for me. I supplemented your course with a bunch of literature and conferences until I managed to land an interview. I can proudly say that I landed a job as a Big Data Engineer around a year after I started your course. Thanks so much for all the great content you have generated and the crystal clear explanations." - Aldo Serrano

"I honestly wouldn't be where I am now without this course. Frank makes the complex simple by helping you through the process every step of the way. Highly recommended and worth your time, especially the Spark environment. This course helped me achieve a far greater understanding of the environment and its capabilities." - Tyler Buck
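The programming exercises described above build on the MapReduce pattern that underlies Hadoop. As a rough, framework-free illustration, here is the classic word count expressed as map, shuffle, and reduce phases in plain Python; this is a conceptual sketch, not the Hadoop or Pig API, and every name in it is invented for the example.

```python
from collections import defaultdict

def map_phase(lines):
    """Map step: emit a (word, 1) pair for every word in every line."""
    for line in lines:
        for word in line.lower().split():
            yield (word, 1)

def shuffle(pairs):
    """Shuffle step: group values by key, as Hadoop does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce step: sum the counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data is big", "hadoop handles big data"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["big"])   # 3
print(counts["data"])  # 2
```

On a real cluster, the map and reduce phases run in parallel across many machines and the shuffle moves data over the network, but the logical structure is exactly this.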
The Big Data Bootcamp will help you build the technical skills, mindset, and networks that launch amazing careers. Jigsaw's innovative curriculum features an extensive range of valuable assets, including preparatory skills support and an in-person, in-class immersive experience.
The Big Data Hadoop Architect Masters Program is a certification course that helps you build a strong skill set in areas like Hadoop development, real-time processing using Spark, and NoSQL database technology, and transforms you into a Hadoop Architect expert. You will also gain practical experience by implementing real-life industry projects in the required Hadoop technologies.
It is necessary to have expert software skills to become a highly valued professional in the IT industry. As the industry evolves fast, professionals need to stay up to date with the latest technologies while remaining proficient in the popular ones. With a comprehensive skill set, you can get the chance to work with some of the top organisations in the industry.

- This 12-month Software Master Subscription provides a holistic learning approach for candidates who want to become experts in the IT industry.
- The course provides high-quality learning modules that cover major technologies and tools across multiple domains in the IT industry.
- Some of the prominent course areas are cloud computing, programming languages, mobile app development, graphic design, networking, data analytics and business skills.
- Candidates will have complete access to all the included courses for the whole course duration.
Software development is a significant part of the IT industry. IT professionals need to keep their knowledge and skills updated to remain relevant in today's rapidly evolving technological environment, and organisations pay attractive packages to professionals who are skilled in the latest technologies and tools.

- This 6-month Software Master Subscription offers the right learning platform for IT professionals and software developers to improve their skills.
- The course has been expertly designed to encompass all major domains and will help candidates become highly valued IT professionals.
- Some of the top areas the course covers are cloud computing, programming languages, graphic design, networking, data analytics and business soft skills.
- You will have unlimited access to all the included high-quality courses for the whole course duration.
It is necessary to have expert analytics skills to become a highly valued professional in the Analytics & KPO industry. As the industry evolves fast, professionals need to stay up to date with the latest technologies while remaining proficient in the popular ones. With a comprehensive skill set, you can get the chance to work with some of the top organisations in the industry.

- This Analytics & Data Science Master Subscription provides a holistic learning approach for candidates who want to become experts in the IT and analytics industry. The course provides high-quality learning modules that cover major technologies and tools across multiple domains in the analytics industry.
- Some of the prominent course areas are Big Data, Data Science, Machine Learning, Deep Learning, Power BI, Microsoft Excel, Data Modelling and more.
- Candidates will have complete access to all the included courses for the whole course duration.
Spark Core provides basic I/O functionality, distributed task dispatching, and scheduling. Resilient Distributed Datasets (RDDs) are logical collections of data partitioned across machines; RDDs can be created by referencing datasets in external storage systems, or by applying transformations to existing RDDs. In this course, you will learn how to improve Spark's performance and work with DataFrames and Spark SQL. Spark Streaming leverages Spark's language-integrated API to perform streaming analytics; this design enables the same application code written for batch processing to join streams against historical data, or to run ad-hoc queries on stream state. You will learn how to work with different input streams, perform transformations on streams, and tune performance. MLlib is Spark's machine learning library, GraphX is Spark's API for graphs and graph-parallel computation, and SparkR exposes the Spark API so users can run jobs from the R shell on a cluster. In this course, you will learn how to work with each of these libraries.
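The RDD model described above has two key properties: the data is split into partitions, and transformations are lazy, recorded but not executed until an action asks for a result. Since running real Spark requires a cluster or a local Spark installation, here is a deliberately tiny plain-Python model of that evaluation scheme; `ToyRDD` is an invented name and this is not the PySpark API, only an illustration of the idea.

```python
# A toy model of RDD-style lazy evaluation: transformations only record
# what to do; nothing runs until an action (collect) is called.
class ToyRDD:
    def __init__(self, partitions, ops=None):
        self.partitions = partitions   # data split across "machines"
        self.ops = ops or []           # recorded transformations

    def map(self, f):
        # Returns a new ToyRDD; the original is untouched (RDDs are immutable).
        return ToyRDD(self.partitions, self.ops + [("map", f)])

    def filter(self, p):
        return ToyRDD(self.partitions, self.ops + [("filter", p)])

    def collect(self):
        """Action: apply the recorded ops to every partition and gather results."""
        out = []
        for part in self.partitions:
            items = list(part)
            for kind, f in self.ops:
                if kind == "map":
                    items = [f(x) for x in items]
                else:
                    items = [x for x in items if f(x)]
            out.extend(items)
        return out

rdd = ToyRDD([[1, 2, 3], [4, 5, 6]])  # two partitions
result = rdd.map(lambda x: x * 10).filter(lambda x: x > 20).collect()
print(result)  # [30, 40, 50, 60]
```

Real Spark adds fault tolerance (lost partitions are recomputed from the recorded lineage of transformations) and runs each partition's work on a different executor, but the lazy pipeline-per-partition shape is the same.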
Learn Hadoop and master how to organize and monetize your big data with this unique project-based online Hadoop training. Created by industry experts, our course is designed not only to provide you with a comprehensive guide to learning Hadoop and Big Data, but also to break the related concepts and technologies down into meaningful tasks and concepts. With our video course, you gain not only theoretical knowledge of the technologies but also hands-on experience through the projects.
The PG Diploma in Data Science program is designed to get learners employed by providing them with a broad understanding of the basic and advanced concepts of Data Science. The training will enable learners to implement Big Data techniques using tools such as R, Excel, Tableau, SQL, NoSQL, Hadoop, Pig, Hive, Apache Spark and Storm. The program is spread over 11 months and is designed so that students will be job-ready by the end of it. After completing the Data Science diploma, you will be a strong and competent data scientist.
The Apache Spark Fundamentals course introduces you to the various components of the Spark framework for efficiently processing, visualizing and analyzing data. The course takes you through Spark applications using Python, Scala and Java. You will also learn Apache Spark programming fundamentals such as Resilient Distributed Datasets (RDDs) and which operations to use to transform an RDD. The course also shows you how to save and load data from different data sources, such as files of various types, RDBMS databases and NoSQL stores. At the end of the course, you will build an effective Spark application and execute it on a Hadoop cluster to make informed business decisions.
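A recurring theme in the description above is moving the same records between different storage formats. As a small, Spark-free sketch of that round-trip idea using only Python's standard library (the record contents are made up for the example), here is one dataset written out as both JSON Lines and CSV and read back:

```python
import csv
import io
import json

records = [{"name": "spark", "stars": 5}, {"name": "storm", "stars": 4}]

# Save as JSON Lines: one JSON object per line, a format Spark can also ingest.
json_blob = "\n".join(json.dumps(r) for r in records)

# Save the same records as CSV with a header row.
csv_buf = io.StringIO()
writer = csv.DictWriter(csv_buf, fieldnames=["name", "stars"])
writer.writeheader()
writer.writerows(records)

# Load both back and confirm the round trips agree on the names.
from_json = [json.loads(line) for line in json_blob.splitlines()]
from_csv = list(csv.DictReader(io.StringIO(csv_buf.getvalue())))
print([r["name"] for r in from_json])  # ['spark', 'storm']
print([r["name"] for r in from_csv])   # ['spark', 'storm']
```

One detail worth noticing: the CSV round trip returns every field as a string (`"5"` rather than `5`), which is exactly why schema-aware loaders like Spark's DataFrame readers matter when formats carry no type information.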