Posted: 21 hours ago
Work from Office
Full Time
Key Responsibilities
- Design and develop high-volume data engineering solutions for mission-critical systems with quality.
- Make enhancements to various applications that meet business and auditing requirements.
- Research and evaluate alternative solutions and make recommendations on improving the product to meet business and information-risk requirements.
- Evaluate service-level issues and suggested enhancements to diagnose and address underlying system problems and inefficiencies.
- Participate in full development lifecycle activities for the product (coding, testing, release activities).
- Support release activities on weekends as required.
- Support any application issues reported during weekends.
- Coordinate day-to-day activities for multiple projects with onshore and offshore team members.
- Ensure the availability of the platform in lower environments.

Required Qualifications
- 7+ years of overall IT experience, including hands-on experience in Big Data technologies.
- Mandatory: hands-on experience in Python and PySpark.
- Experience building PySpark applications using Spark DataFrames in Python, with Jupyter Notebook and PyCharm (IDE).
- Experience optimizing Spark jobs that process huge volumes of data.
- Hands-on experience with version control tools such as Git.
- Experience with Amazon's analytics services: Amazon EMR, Amazon Athena, AWS Glue.
- Experience with Amazon's compute services (AWS Lambda, Amazon EC2), storage service (Amazon S3), and other services such as SNS.
- Experience/knowledge of bash/shell scripting is a plus.
- Has built ETL processes that ingest, copy, and structurally transform data across a wide variety of formats, such as CSV, TSV, XML, and JSON.
- Experience working with fixed-width, delimited, and multi-record file formats.
- Good to have: knowledge of data warehousing concepts (dimensions, facts, and schemas such as snowflake and star).
- Experience with columnar storage formats: Parquet, Avro, ORC.
- Well versed in compression techniques such as Snappy and Gzip.
- Good to have: knowledge of at least one AWS database (Aurora, RDS, Redshift, ElastiCache, DynamoDB).
- Hands-on experience with tools such as Jenkins to build, test, and deploy applications.
- Awareness of DevOps concepts and the ability to work in an automated release-pipeline environment.
- Excellent debugging skills.

Preferred Qualifications
- Experience working with US clients and business partners.
- Knowledge of front-end frameworks.
- Exposure to the BFSI domain is good to have.
- Hands-on experience with any API gateway and management platform.

AWMPO AWMPS Presidents Office Technology
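The ETL skill described above (ingesting delimited files and structurally transforming them into formats such as JSON) can be sketched in plain Python; in the role itself this kind of transform would typically be done with PySpark DataFrames at much larger scale. The field names and sample data below are purely hypothetical.

```python
import csv
import io
import json

def csv_to_json_records(csv_text, delimiter=","):
    """Parse delimited text (CSV or, with delimiter='\t', TSV) into JSON-ready dicts."""
    reader = csv.DictReader(io.StringIO(csv_text), delimiter=delimiter)
    return [dict(row) for row in reader]

# Hypothetical sample input for illustration only.
raw = "id,city\n1,Bengaluru\n2,Hyderabad\n"
records = csv_to_json_records(raw)
print(json.dumps(records))  # a JSON array of row objects
```

The equivalent PySpark pattern would read the same data with `spark.read.csv(..., header=True)` and write it out with `df.write.json(...)`, letting Spark distribute the transform across partitions.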
Ameriprise Financial