Vikas Kumar Dwivedi

IT professional specializing in Big Data (Hadoop) and Cloud (AWS). Seeking an opportunity to work with a growth-oriented organization on real-time software development projects; a position in a high-quality engineering environment where my experience and academic skills will add value to the organization's operations while also helping me further improve my own skills.

Key Skills

Computer Literacy
Continuous Learner
Analytical Skills
Team Management
Interpersonal Skills

Professional Experience

Jun 2018
Present
Senior System Engineer
Infosys Gurgaon, IN
Infosys Limited is an Indian multinational corporation that provides business consulting, information technology, and outsourcing services. It is the second-largest Indian IT company.
 Technical Expertise: Hadoop – HBase, HDFS, Hive, Spark, Storm, ZooKeeper, Kafka, Oozie, Impala, Sqoop; Java
Responsibilities:
  • The project was divided into three phases: analysis of messages using the Kafka component, streaming using Spark jobs, and finally storage into the Hadoop database HBase.
  • Kafka: Used publisher, consumer, and console-consumer commands to produce and fetch messages for separate pages of the GSTN portal.
  • Spark: Used streaming summary jobs to inspect logs and events per active batch.
  • Storm: Analyzed topologies through worker logs and distributed bolts across executors based on the tasks assigned to them.
  • Hive: Ran Hive queries over the schema designed for the GSTN portal database, analyzing the data and delivering the desired results.
  • HBase & HDFS: Storage components for data in the Hadoop database, managed using HBase shell commands.
  • Oozie: Automated Hive scripts as Oozie jobs for GSTN records.
  • Impala: Ran Impala queries to validate the records fetched by Hive scripts.
Jun 2017
Jun 2018
System Engineer
Infosys Bangalore Urban, IN
Infosys Limited is an Indian multinational corporation that provides business consulting, information technology, and outsourcing services. It is the 596th-largest public company in the world.
 Technical Expertise: AWS – S3, IAM, EBS, EC2, DynamoDB, API Gateway, Snow Family; Python
Responsibilities: 
  • The project was divided into three phases: data integrity, data compilation, and data pipelining. The major focus was News Corp's plan to migrate its data centers to AWS, using Terraform and Jenkins over the GitHub platform.
  • PoC on a text-to-speech application using the Polly service.
  • PoC on Auto Scaling to increase the number of remote service instances.
  • PoV on bulk data transfer using the Snow Family.
Dec 2016
May 2017
System Engineer Trainee
Infosys Mysore, IN
The Infosys training program in Mysuru is one of the best of its kind, covering Python, databases, and basic IT concepts. It gives freshers great opportunities and a memorable training experience.
 Technical Expertise: Basic J2EE, Hibernate, HTML, AngularJS, CSS, Bootstrap
Responsibilities:
  • This project was implemented during my internship at Infosys.
  • The project focused on maintaining information to calculate the effort contributed by each member as well as overall project growth.
  • The project's modules include a super-admin portal, project manager(s), and team member(s).
  • The prime objective of the portal is to create a common platform where managers and team members can easily see their individual growth, as well as the project's growth, based on stories completed.
Oct 2016
Dec 2016
Hadoop Developer
Besant Technology Chennai, IN
As a leader in the IT software training sector, Besant Technologies has earned a strong reputation in a short time.
 Technical Expertise: Basic Hadoop components, Java
Responsibilities:
  • The project used Hadoop, a cluster-computing framework for the data world.
  • Apache Hadoop is a software framework that supports data-intensive distributed applications under a free license; here it was used for data analysis of employees.
  • Everything we do on the internet is becoming a source of business information for organizations across the globe.
  • The project focused on live updates of records, analyzed using MapReduce and stored in HDFS.

Education

Jun 2013
Jun 2017
Bachelor's Degree in Information Technology at University Institute of Engineering and Technology
Panjab University
Mar 2011
Mar 2013
Higher Secondary in Science at Kendriya Vidyalaya, Chandigarh
CBSE

Certifications

2017
AWS Technical Professional
AWS Partner with Infosys

Achievements

2020
Employee of the year
Nominated and rewarded as the "Top Employee" for the current production project, with a Letter of Appreciation (LOA) from my manager and immediate lead.
2017
Top performer in Bachelor's Degree
Secured 1st position in the B.E. degree and was rewarded with a memento and a bonus cheque from the university.

Quote

The harder you work for something, the greater you’ll feel when you achieve it.

Debra DiPietro

Hobbies & Interests

  • Adventure Sports
  • Basketball
  • Computers
  • Cricket

Languages

English
(Fluent)
Hindi
(Native)
German
(Basic)

Career Aspiration

I have always set clear priorities in life, and today those priorities have rewarded me with the opportunity to be part of a reputed organization. I look forward to working here with a positive attitude, attaining new heights in my career while advancing the goals of this organization.

Get in touch with Vikas Kumar