Posted: 5 days ago
Work from Office
Full Time
Work Location : Bangalore (CV Raman Nagar)
Notice Period : Immediate - 30 days
Mandatory Skills : Big Data, Python, SQL, Spark/PySpark, AWS Cloud

JD and Required Skills & Responsibilities :
- Actively participate in all phases of the software development lifecycle, including requirements gathering, functional and technical design, development, testing, roll-out, and support.
- Solve complex business problems using a disciplined development methodology.
- Produce scalable, flexible, efficient, and supportable solutions using appropriate technologies.
- Analyse the source and target system data, and map the transformations that meet the requirements.
- Interact with the client and onsite coordinators during different phases of a project.
- Design and implement product features in collaboration with business and technology stakeholders.
- Anticipate, identify, and resolve data management issues to improve data quality.
- Clean, prepare, and optimize data at scale for ingestion and consumption.
- Support the implementation of new data management projects and restructure the current data architecture.
- Implement automated workflows and routines using workflow scheduling tools.
- Understand and use continuous integration, test-driven development, and production deployment frameworks.
- Participate in design, code, and test-plan reviews and dataset implementations performed by other data engineers, in support of maintaining data engineering standards.
- Analyze and profile data for the purpose of designing scalable solutions.
- Troubleshoot straightforward data issues and perform root cause analysis to proactively resolve product issues.

Required Skills :
- 5+ years of relevant experience developing data and analytics solutions.
- Experience building data lake solutions leveraging one or more of AWS EMR, S3, Hive, and PySpark.
- Experience with relational SQL.
- Experience with scripting languages such as Python.
- Experience with source control tools such as GitHub and the related development process.
- Experience with workflow scheduling tools such as Airflow.
- In-depth knowledge of AWS Cloud (S3, EMR, Databricks).
- A passion for data solutions.
- A strong problem-solving and analytical mindset.
- Working experience in the design, development, and testing of data pipelines.
- Experience working with Agile teams.
- Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.
- Able to quickly pick up new programming languages, technologies, and frameworks.
- Bachelor's degree in Computer Science.
Qcentrio
Bengaluru | 8.0 - 14.0 Lacs P.A.