
79 Step Functions Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be part of a team responsible for developing a next-generation Data Analytics Engine that converts raw market and historical data into actionable insights for the electronics supply chain industry. This platform processes high-volume data from suppliers, parts, and trends to provide real-time insights and ML-driven applications. We are seeking an experienced Lead or Staff Data Engineer to help shape and expand our core data infrastructure. The ideal candidate has a strong background in designing and implementing scalable ETL pipelines and real-time data systems on AWS and in open-source environments such as Airflow, Spark, and Kafka. The role involves taking technical ownership, providing leadership, improving our architecture, enforcing best practices, and mentoring junior engineers.

Your responsibilities will include designing, implementing, and optimizing scalable ETL pipelines using AWS-native tools, migrating existing pipelines to open-source orchestration tools, leading data lake and data warehouse architecture design, managing CI/CD workflows, implementing data validation and quality checks, contributing to Infrastructure as Code, and offering technical mentorship and guidance on architectural decisions.

To qualify, you should have at least 8 years of experience as a Data Engineer or in a similar role with production ownership, expertise in AWS tools, deep knowledge of the open-source data stack, strong Python programming skills, expert-level SQL proficiency, experience with CI/CD tools, familiarity with Infrastructure as Code, and the ability to mentor engineers and drive architectural decisions. Preferred qualifications include a background in ML/AI pipelines, experience with serverless technologies and containerized deployments, and familiarity with data observability tools and alerting systems. A Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field is preferred.

In return, you will have the opportunity to work on impactful supply chain intelligence problems, receive mentorship from experienced engineers and AI product leads, work in a flexible and startup-friendly environment, and enjoy competitive compensation with opportunities for career growth.
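As a purely illustrative companion to the orchestration stack named above (Airflow, Spark, Kafka), here is a minimal Airflow DAG sketch in Python; the DAG id, schedule, and task logic are hypothetical placeholders, not part of the original listing.

    # Minimal Airflow 2.x DAG sketch: three placeholder ETL steps run in sequence.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder: pull raw supplier/parts records from a source system.
        print("extracting raw records")

    def transform():
        # Placeholder: clean and enrich the extracted records.
        print("transforming records")

    def load():
        # Placeholder: write curated records to the warehouse or data lake.
        print("loading curated records")

    with DAG(
        dag_id="supply_chain_etl",        # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> transform_task >> load_task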

Posted 17 hours ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You should have 3-5 years of experience writing and debugging intermediate- to advanced-level Python code, with a good understanding of OOP, APIs, and SQL databases. You should also have a strong grasp of the fundamentals of Generative AI, large language model (LLM) pipelines such as RAG, OpenAI GPT models, and experience in NLP and LangChain. Familiarity with the AWS environment and services like S3, Lambda, Step Functions, and CloudWatch is essential. You should have excellent analytical and problem-solving skills and be capable of working independently as well as collaboratively in a team-oriented environment. An analytical mind and business acumen are also important qualities for this role. You should demonstrate the ability to engage with client stakeholders at multiple levels and provide consultative solutions across different domains. Familiarity with Python libraries and frameworks such as Pandas, scikit-learn, PyTorch, and TensorFlow, and with models such as BERT and GPT, along with experience in deep learning and machine learning, would be beneficial.
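To ground the AWS services named above, the following is a small, hedged Python sketch that kicks off a Step Functions execution for a document stored in S3; the state machine ARN, bucket, key, and region are hypothetical placeholders.

    import json

    import boto3

    # Region, ARN, and bucket below are illustrative assumptions, not values from the listing.
    sfn = boto3.client("stepfunctions", region_name="ap-south-1")

    STATE_MACHINE_ARN = "arn:aws:states:ap-south-1:123456789012:stateMachine:doc-pipeline"

    def start_document_pipeline(document_key: str) -> str:
        """Start one execution of a document-processing state machine and return its ARN."""
        response = sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"bucket": "example-input-bucket", "key": document_key}),
        )
        return response["executionArn"]

    if __name__ == "__main__":
        print(start_document_pipeline("incoming/report.pdf"))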

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

This role requires you to be adept at troubleshooting, debugging, and working within a cloud environment, and familiar with Agile and other development methodologies. Your responsibilities will include creating Lambda functions with all the necessary security measures in place using AWS Lambda. You must demonstrate proficiency in Java and Node.js by developing services and conducting unit and integration testing. A strong understanding of security best practices such as IAM roles, KMS, and pseudonymization is essential. You should be able to define services on SwaggerHub and implement serverless approaches using AWS Lambda, including the Serverless Application Model (AWS SAM). Hands-on experience with RDS, Kafka, ELB, Secrets Manager, S3, API Gateway, CloudWatch, and EventBridge services is required. You should also be able to write unit test cases using the Mocha framework and have experience with encryption and decryption of PII data and files in transit and at rest. Familiarity with the CDK (Cloud Development Kit) and creating SQS/SNS, DynamoDB, and API Gateway resources using CDK is preferred. You will be working on a serverless stack involving Lambda, API Gateway, and Step Functions, and coding in Java / Node.js. Advanced networking concepts like Transit Gateway, VPC endpoints, and multi-account connectivity are also part of the role. Strong troubleshooting and debugging skills are essential, along with excellent problem-solving abilities and attention to detail. Effective communication skills and the ability to work in a team-oriented, collaborative environment are crucial for success in this role.

Virtusa is a company that values teamwork, quality of life, and professional and personal development. By joining Virtusa, you become part of a global team that focuses on your growth and provides exciting projects, opportunities, and exposure to state-of-the-art technologies throughout your career. Collaboration and fostering excellence are at the core of Virtusa's values, offering a dynamic environment for great minds to thrive and innovate.
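The posting itself is a Java / Node.js role, but as a hedged, language-agnostic illustration of the PII encryption requirement it mentions, here is a short Python sketch that encrypts and decrypts a field with AWS KMS via boto3; the key alias is a hypothetical placeholder.

    import base64

    import boto3

    kms = boto3.client("kms")

    KEY_ID = "alias/pii-data-key"  # hypothetical customer-managed KMS key alias

    def encrypt_pii(plaintext: str) -> str:
        """Encrypt a PII field and return the ciphertext as base64 text."""
        resp = kms.encrypt(KeyId=KEY_ID, Plaintext=plaintext.encode("utf-8"))
        return base64.b64encode(resp["CiphertextBlob"]).decode("ascii")

    def decrypt_pii(ciphertext_b64: str) -> str:
        """Decrypt the field; KMS infers the key from the ciphertext itself."""
        resp = kms.decrypt(CiphertextBlob=base64.b64decode(ciphertext_b64))
        return resp["Plaintext"].decode("utf-8")

    if __name__ == "__main__":
        token = encrypt_pii("customer@example.com")
        print(decrypt_pii(token))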

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As an AWS Senior Data Engineer (SDE) at Infosys in India, you will be responsible for working on various technologies and tools related to cloud data engineering. Your role will involve expertise in SQL, PySpark, API endpoint ingestion, Glue, S3, Redshift, Step Functions, Lambda, CloudWatch, AppFlow, CloudFormation, and administrative tasks related to cloud services. Additionally, you will be expected to have knowledge of SDLF & OF frameworks and S3 ingestion patterns, and exposure to Git, JFrog, ADO, SNOW, Visual Studio, DBeaver, and SF inspector. Your primary focus will be on leveraging these technologies to design, develop, and maintain data pipelines, ensuring efficient data processing and storage on the cloud platform. The ideal candidate for this position should have a strong background in cloud data engineering, familiarity with AWS services, and a proactive attitude towards learning and implementing new technologies. Excellent communication skills and the ability to work effectively within a team are essential for success in this role.
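As a hedged illustration of the PySpark-on-S3 work described above, the sketch below reads raw CSV data from S3, applies a simple transformation, and writes partitioned Parquet for downstream querying; the bucket names and columns are hypothetical.

    from pyspark.sql import SparkSession, functions as F

    # In an AWS Glue job the session would typically come from GlueContext; a plain
    # SparkSession keeps this sketch self-contained.
    spark = SparkSession.builder.appName("s3-ingest-example").getOrCreate()

    # Hypothetical raw zone: CSV files with a header row.
    raw = spark.read.option("header", True).csv("s3://example-raw-bucket/sales/")

    cleaned = (
        raw.withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount").isNotNull())
           .withColumn("load_date", F.current_date())
    )

    # Partitioned Parquet output for Athena / Redshift Spectrum style consumption.
    cleaned.write.mode("overwrite").partitionBy("load_date").parquet(
        "s3://example-curated-bucket/sales/"
    )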

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

Sykatiya Technology Pvt Ltd is a leading semiconductor industry innovator committed to leveraging cutting-edge technology to solve complex problems. We are currently looking for a highly skilled and motivated Data Scientist to join our dynamic team and contribute to our mission of driving innovation through data-driven insights. As the Lead Data Scientist and Machine Learning Engineer at Sykatiya Technology Pvt Ltd, you will play a crucial role in analyzing large datasets to uncover patterns, develop predictive models, and implement AI/ML solutions. Your responsibilities will include working on projects involving neural networks, deep learning, data mining, and natural language processing (NLP) to drive business value and enhance our products and services.

Key Responsibilities:
- Lead the design and implementation of machine learning models and algorithms to address complex business problems.
- Utilize deep learning techniques to enhance neural network models and improve prediction accuracy.
- Conduct data mining and analysis to extract actionable insights from both structured and unstructured data.
- Apply natural language processing (NLP) techniques for advanced text analytics.
- Develop and maintain end-to-end data pipelines, ensuring data integrity and reliability.
- Collaborate with cross-functional teams to understand business requirements and deliver data-driven solutions.
- Mentor and guide junior data scientists and engineers in best practices and advanced techniques.
- Stay updated with the latest advancements in AI/ML, neural networks, deep learning, data mining, and NLP.

Technical Skills:
- Proficiency in Python and its libraries such as NumPy, pandas, scikit-learn, TensorFlow, Keras, and PyTorch.
- Strong understanding of machine learning algorithms and techniques.
- Extensive experience with neural networks and deep learning frameworks.
- Hands-on experience with data mining and analysis techniques.
- Proficiency in natural language processing (NLP) tools and libraries like NLTK, spaCy, and transformers.
- Proficiency in big data technologies including Sqoop, Hadoop, HDFS, Hive, and PySpark.
- Experience with cloud platforms such as AWS, including services like S3, Step Functions, EventBridge, Athena, RDS, Lambda, and Glue.
- Strong knowledge of database management systems like SQL, Teradata, MySQL, PostgreSQL, and Snowflake.
- Familiarity with other tools like ExactTarget, Marketo, SAP BO, Agile, and JIRA.
- Strong analytical skills to analyze large datasets and derive actionable insights.
- Excellent problem-solving skills with the ability to think critically and creatively.
- Effective communication skills and teamwork abilities to collaborate with various stakeholders.

Experience:
- At least 8 to 12 years of experience in a similar role.
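As a hedged, minimal illustration of the NLP skills listed above, here is a tiny scikit-learn text classification pipeline; the corpus and labels are toy assumptions for illustration, not anything from the listing.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import Pipeline

    # Toy corpus: flag free-text notes that describe a supply or quality issue.
    texts = [
        "defective wafer batch received",
        "shipment arrived on time",
        "yield dropped sharply this week",
        "delivery completed without issues",
    ]
    labels = [1, 0, 1, 0]  # 1 = issue, 0 = normal

    model = Pipeline([
        ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),   # word and bigram features
        ("clf", LogisticRegression(max_iter=1000)),
    ])
    model.fit(texts, labels)

    print(model.predict(["wafer defects detected in new batch"]))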

Posted 1 day ago

Apply

10.0 - 14.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role: Cloud Architect, Analytics & Data Products

We're looking for a Cloud Architect / Lead to design, build, and manage scalable AWS infrastructure that powers our analytics and data product initiatives. This role focuses on automating infrastructure provisioning, application/API hosting, and enabling data and GenAI workloads through a modern, secure cloud environment.

Key Responsibilities:
- Design and provision AWS infrastructure using Terraform or AWS CloudFormation to support evolving data product needs.
- Develop and manage CI/CD pipelines using Jenkins, AWS CodePipeline, CodeBuild, or GitHub Actions.
- Deploy and host internal tools, APIs, and applications using ECS, EKS, Lambda, API Gateway, and ELB.
- Provision and support analytics and data platforms using S3, Glue, Redshift, Athena, Lake Formation, and orchestration tools like Step Functions or Apache Airflow (MWAA).
- Implement cloud security, networking, and compliance using IAM, VPC, KMS, CloudWatch, CloudTrail, and AWS Config.
- Collaborate with data engineers, ML engineers, and analytics teams to align infrastructure with application and data product requirements.
- Support GenAI infrastructure, including Amazon Bedrock, SageMaker, or integrations with APIs like OpenAI.

Requirements:
- 10-14 years of experience in cloud engineering, DevOps, or cloud architecture roles.
- Strong hands-on expertise with the AWS ecosystem and tools listed above.
- Proficiency in scripting (e.g., Python, Bash) and infrastructure automation.
- Experience deploying containerized workloads using Docker, ECS, EKS, or Fargate.
- Familiarity with data engineering and GenAI workflows is a plus.
- AWS certifications (e.g., Solutions Architect, DevOps Engineer) are preferred.
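The posting names Terraform and CloudFormation for infrastructure as code; as an adjacent, hedged illustration in Python, here is a minimal AWS CDK (v2) sketch that declares a versioned, encrypted S3 bucket for an analytics landing zone. The stack and bucket names are hypothetical, and CDK itself is an assumption rather than a tool the listing requires.

    import aws_cdk as cdk
    from aws_cdk import aws_s3 as s3

    class AnalyticsStack(cdk.Stack):
        def __init__(self, scope, construct_id, **kwargs):
            super().__init__(scope, construct_id, **kwargs)
            # Versioned, encrypted, fully private bucket for raw analytics data.
            s3.Bucket(
                self,
                "LandingBucket",
                versioned=True,
                encryption=s3.BucketEncryption.S3_MANAGED,
                block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
            )

    app = cdk.App()
    AnalyticsStack(app, "AnalyticsStack")
    app.synth()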

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As an AWS Senior Data Engineer at our organization, you will be responsible for working with various technologies and tools to support data engineering activities. Your primary tasks will include using SQL for data querying and manipulation, developing data processing pipelines using PySpark, and integrating data from API endpoints. Additionally, you will be expected to work with AWS services such as Glue for ETL processes, S3 for data storage, Redshift for data warehousing, Step Functions for workflow automation, Lambda for serverless computing, CloudWatch for monitoring, and AppFlow for data integration. You should have experience with CloudFormation and administrative roles, as well as knowledge of SDLF & OF frameworks for data lifecycle management. Understanding S3 ingestion patterns and version control using Git is essential for this role. Exposure to tools like JFrog, ADO, SNOW, Visual Studio, DBeaver, and SF inspector will help you support your data engineering tasks effectively. Your role will involve collaborating with cross-functional teams to ensure the successful implementation of data solutions within the AWS environment.

Posted 3 days ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Hybrid

Company Description

Epsilon is an all-encompassing global marketing innovator, supporting 15 of the top 20 global brands. We provide unrivaled data intelligence and customer insights, world-class technology including loyalty, email and CRM platforms, and data-driven creative, activation and execution. Epsilon's digital media arm, Conversant, is a leader in personalized digital advertising and insights through its proprietary technology and trove of consumer marketing data, delivering digital marketing with unprecedented scale, accuracy and reach through personalized media programs and through CJ Affiliate by Conversant, one of the world's largest affiliate marketing networks. Together, we bring personalized marketing to consumers across offline and online channels, at moments of interest, that help drive business growth for brands. Recognized by Ad Age as the #1 World's Largest CRM/Direct Marketing Agency Network, #1 Largest U.S. Agency from All Disciplines, #1 Largest U.S. CRM/Direct Marketing Agency Network and #1 Largest U.S. Mobile Marketing Agency, Epsilon employs over 8,000 associates in 70 offices worldwide. Epsilon is part of Alliance Data, a Fortune 500 company and a Fortune 100 Best Places to Work For company. For more information, visit www.epsilon.com and follow us on Twitter @EpsilonMktg.

Job Description

About BU: The Product team forms the crux of our powerful platforms and connects millions of customers to the product magic. This team of innovative thinkers develops and builds products that help Epsilon be a market differentiator. They map the future and set new standards for our products, empowered with industry best practices, ML and AI capabilities. The team passionately delivers intelligent end-to-end solutions and plays a key role in Epsilon's success story.

Why we are looking for you: We are looking for a Senior Software Engineer to work on a groundbreaking multichannel SaaS Digital Marketing Platform that focuses on uniquely identifying customer patterns, effectively interacting with customers across channels, and achieving a positive return on marketing investment (ROMI). The platform consolidates and integrates the features and functionality typically found in stand-alone services and channel-specific messaging platforms to give marketers a tightly integrated, easily orchestrated, insights-driven, cross-channel marketing capability. The primary role of the Senior Software Engineer is to envision and build internet-scale services on the cloud using Java and distributed technologies, with a 60-40 split between backend development with Java and frontend development using Angular. The day-to-day responsibilities are listed under "What you will do" below.

What you will enjoy in this role (tech stack): Our integrated suite of modular products is designed to help deliver personalized experiences and drive meaningful outcomes. Our tech stack caters to a fusion of data and technology, with SaaS offerings developed with a cloud-first approach. Here, a solid understanding of software security practices, including user authentication and authorization, and being data-savvy would be key. You should also come with the ability to leverage best practices in design patterns and design algorithms for software development that focus on high quality and agility. You must also have a good understanding of Agile methodologies like SCRUM.

What you will do:
- Be responsible for development and maintenance of applications with technologies involving Java and distributed technologies.
- Collaborate with developers, product managers, business analysts and business users in conceptualizing, estimating and developing new software applications and enhancements.
- Assist in the development and documentation of software objectives, deliverables, and specifications in collaboration with internal users and departments.
- Collaborate with the QA team to define test cases and metrics, and resolve questions about test results.
- Assist in the design and implementation process for new products; research and create POCs for possible solutions.
- Develop components based on business and/or application requirements.
- Create unit tests in accordance with team policies and procedures.
- Advise and mentor team members in specialized technical areas, as well as fulfill administrative duties as defined by the support process.
- Create value-adds that contribute to cost optimization, scalability, reliability, and secure solutions.

Qualifications:
- Bachelor's degree or equivalent in computer science.
- 6+ years of experience in Java/Angular/SQL/AWS/Microservices.
- Preferred knowledge/experience in the following technologies: 2+ years of UI technologies like Angular 2 or above; 1+ year of experience in cloud computing such as AWS, Azure, GCP, PCF, or OCI.
- Experience with the following tools: Eclipse, Maven, Gradle, DB tools, Bitbucket/JIRA/Confluence.
- Can develop SOA services and has good knowledge of REST APIs and microservice architectures.
- Solid knowledge of web architectural and design patterns.
- Understands software security practices including user authentication and authorization, data validation, and common DoS and SQL injection techniques.
- Familiar with profiling, code coverage, logging, common IDEs and other development tools.
- Familiar with Agile methodologies (SCRUM) and has strong communication skills (verbal and written).
- Ability to work within tight deadlines and effectively prioritize and execute tasks in a high-pressure environment.
- Demonstrated verbal and written communication skills, and ability to interface with Business, Analytics and IT organizations.
- Ability to work effectively in a short-cycle, team-oriented environment, managing multiple priorities and tasks.
- Ability to identify non-obvious solutions to complex problems.

Posted 5 days ago

Apply

9.0 - 12.0 years

14 - 24 Lacs

Gurugram

Remote

We are looking for an experienced Senior Data Engineer to lead the development of scalable AWS-native data lake pipelines with a strong focus on time series forecasting and upsert-ready architectures. This role requires end-to-end ownership of the data lifecycle, from ingestion to partitioning, versioning, and BI delivery. The ideal candidate must be highly proficient in AWS data services, PySpark, and versioned storage formats like Apache Hudi/Iceberg, and must understand the nuances of data quality and observability in large-scale analytics systems.

Role & responsibilities:
- Design and implement data lake zoning (Raw, Clean, Modeled) using Amazon S3, AWS Glue, and Athena.
- Ingest structured and unstructured datasets including POS, USDA, Circana, and internal sales data.
- Build versioned and upsert-friendly ETL pipelines using Apache Hudi or Iceberg.
- Create forecast-ready datasets with lagged, rolling, and trend features for revenue and occupancy modelling.
- Optimize Athena datasets with partitioning, CTAS queries, and metadata tagging.
- Implement S3 lifecycle policies, intelligent file partitioning, and audit logging.
- Build reusable transformation logic using dbt-core or PySpark to support KPIs and time series outputs.
- Integrate robust data quality checks using custom logs, AWS CloudWatch, or other DQ tooling.
- Design and manage a forecast feature registry with metrics versioning and traceability.
- Collaborate with BI and business teams to finalize schema design and deliverables for dashboard consumption.

Preferred candidate profile:
- 9-12 years of experience in data engineering.
- Deep hands-on experience with AWS Glue, Athena, S3, Step Functions, and the Glue Data Catalog.
- Strong command of PySpark, dbt-core, CTAS query optimization, and partition strategies.
- Working knowledge of Apache Hudi, Iceberg, or Delta Lake for versioned ingestion.
- Experience in S3 metadata tagging and scalable data lake design patterns.
- Expertise in feature engineering and forecasting dataset preparation (lags, trends, windows).
- Proficiency in Git-based workflows (Bitbucket), CI/CD, and deployment automation.
- Strong understanding of time series KPIs, such as revenue forecasts, occupancy trends, or demand volatility.
- Data observability best practices, including field-level logging, anomaly alerts, and classification tagging.
- Experience with statistical forecasting frameworks such as Prophet, GluonTS, or related libraries.
- Familiarity with Superset or Streamlit for QA visualization and UAT reporting.
- Understanding of macroeconomic datasets (USDA, Circana) and third-party data ingestion.
- Independent, critical thinker with the ability to design for scale and evolving business logic.
- Strong communication and collaboration with BI, QA, and business stakeholders.
- High attention to detail in ensuring data accuracy, quality, and documentation.
- Comfortable interpreting business-level KPIs and transforming them into technical pipelines.
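Since the posting emphasizes lagged, rolling, and trend features for forecast-ready datasets, here is a hedged PySpark sketch of that kind of feature engineering; the table, column names, and window sizes are hypothetical examples only.

    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.window import Window

    spark = SparkSession.builder.appName("forecast-features").getOrCreate()

    # Hypothetical daily revenue table (store_id, ds, revenue).
    sales = spark.createDataFrame(
        [("A", "2024-01-01", 100.0), ("A", "2024-01-02", 120.0), ("A", "2024-01-03", 90.0)],
        ["store_id", "ds", "revenue"],
    )

    ordered = Window.partitionBy("store_id").orderBy("ds")
    trailing_7d = ordered.rowsBetween(-6, 0)  # trailing seven rows, assuming daily grain

    features = (
        sales.withColumn("revenue_lag_1", F.lag("revenue", 1).over(ordered))
             .withColumn("revenue_7d_avg", F.avg("revenue").over(trailing_7d))
    )
    features.show()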

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Kochi, Kerala

On-site

You should have experience in designing and building serverless data lake solutions using a layered component architecture, including the ingestion, storage, processing, security and governance, data cataloguing and search, and consumption layers. Proficiency in AWS serverless technologies like Lake Formation, Glue, Glue Python, Glue Workflows, Step Functions, S3, Redshift, QuickSight, Athena, AWS Lambda, and Kinesis is required, and hands-on experience with Glue is essential. You must be skilled in designing, building, orchestrating, and deploying multi-step data processing pipelines using Python and Java. Experience in managing source data access security, configuring authentication and authorization, and enforcing data policies and standards is also necessary. Additionally, familiarity with AWS environment setup and configuration is expected. A minimum of 6 years of relevant experience, with at least 3 years building solutions on AWS, is mandatory. The ability to work under pressure and a commitment to meeting customer expectations are essential traits for this role. If you meet these requirements and are ready to take on this challenging opportunity, please reach out to hr@Stanratech.com.
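As a hedged sketch of one ingestion path in such a serverless data lake, the Lambda handler below writes each incoming event into a date-partitioned raw zone on S3; the bucket name and key layout are hypothetical.

    import json
    from datetime import datetime, timezone

    import boto3

    s3 = boto3.client("s3")
    RAW_BUCKET = "example-raw-zone"  # hypothetical landing bucket

    def handler(event, context):
        """Persist the raw event payload, partitioned by ingest date, for later Glue processing."""
        ingest_date = datetime.now(timezone.utc).strftime("%Y-%m-%d")
        key = f"events/ingest_date={ingest_date}/{context.aws_request_id}.json"
        s3.put_object(Bucket=RAW_BUCKET, Key=key, Body=json.dumps(event).encode("utf-8"))
        return {"statusCode": 200, "body": key}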

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

A career at HARMAN Automotive offers you the opportunity to be part of a global, multi-disciplinary team dedicated to harnessing the power of technology to shape the future. We empower you to accelerate your professional growth and make a difference by:
- Engineering cutting-edge audio systems and integrated technology platforms that enhance the driving experience.
- Fostering innovation through collaborative efforts that combine in-depth research, design excellence, and engineering prowess.
- Driving advancements in in-vehicle infotainment, safety, efficiency, and overall enjoyment for users.

About The Role: We are looking for a skilled Python Backend Developer with 3 to 6 years of experience in building scalable and secure backend systems using AWS services. In this role, you will be instrumental in:
- Designing and implementing microservices architecture and cloud-native solutions.
- Integrating diverse data sources into a unified system to ensure data consistency and security.

What You Will Do: Your responsibilities will include:
- Backend Development: Creating scalable backend systems using Python and frameworks like Flask or Django.
- Microservices Architecture: Developing and deploying microservices-based systems with AWS services like SQS, Step Functions, and API Gateway.
- Cloud-Native Solutions: Building cloud-native solutions utilizing AWS services such as Lambda, CloudFront, and IAM.
- Data Integration: Integrating multiple data sources into a single system while maintaining data integrity.
- API Development: Designing and implementing RESTful/SOAP APIs using API Gateway and AWS Lambda.

What You Need To Be Successful: To excel in this role, you should possess:
- Technical Skills: Proficiency in Python backend development, JSON data handling, and familiarity with AWS services.
- AWS Services: Knowledge of various AWS services including SQS, Step Functions, IAM, CloudFront, and API Gateway.
- Security and Authentication: Understanding of identity management and authentication protocols like OAuth 2.0 and OIDC.
- Data Management: Experience with ORM frameworks like SQLAlchemy or Django ORM.
- Collaboration and Testing: Ability to collaborate effectively and work independently when needed, along with familiarity with testing tools.

Bonus Points if You Have: Additional experience with AWS ECS, VPC, serverless computing, and DevOps practices would be advantageous.

What Makes You Eligible: We are looking for individuals with relevant experience in backend development, strong technical expertise, problem-solving abilities, and effective collaboration and communication skills.

What We Offer: Join us for a competitive salary and benefits package, opportunities for professional growth, a dynamic work environment, access to cutting-edge technologies, recognition for outstanding performance, and the chance to collaborate with a renowned German OEM.

You Belong Here: At HARMAN, we value diversity, inclusivity, and empowerment. We encourage you to share your ideas, voice your perspective, and be yourself in a supportive culture that celebrates uniqueness. We are committed to your ongoing learning and development, providing training and education opportunities for you to thrive in your career.

About HARMAN: With a legacy of innovation dating back to the 1920s, HARMAN continues to redefine technology across automotive, lifestyle, and digital transformation solutions. Our portfolio of iconic brands delivers exceptional experiences, setting new standards in engineering and design for our customers and partners worldwide. If you are ready to drive innovation and create lasting impact, we invite you to join our talent community at HARMAN Automotive.
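As a hedged sketch of the Python backend pattern described above (a Flask service handing work to SQS for asynchronous processing), here is a minimal example; the queue URL and endpoint are hypothetical placeholders.

    import json

    import boto3
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    sqs = boto3.client("sqs")

    # Hypothetical queue; in practice this would come from configuration.
    QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/vehicle-events"

    @app.route("/events", methods=["POST"])
    def enqueue_event():
        """Accept a JSON payload and hand it off to SQS for asynchronous processing."""
        payload = request.get_json(force=True)
        sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(payload))
        return jsonify({"status": "queued"}), 202

    if __name__ == "__main__":
        app.run(port=8080)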

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Platform Developer at Barclays, you will play a crucial role in shaping the digital landscape and enhancing customer experiences. Leveraging cutting-edge technology, you will work alongside a team of engineers, business analysts, and stakeholders to deliver high-quality solutions that meet business requirements. Your responsibilities will include tackling complex technical challenges, building efficient data pipelines, and staying updated on the latest technologies to continuously enhance your skills.

To excel in this role, you should have hands-on coding experience in Python, along with a strong understanding of and practical experience in AWS development. Experience with tools such as Lambda, Glue, Step Functions, IAM roles, and various other AWS services will be essential. Additionally, your expertise in building data pipelines using Apache Spark and AWS services will be highly valued. Strong analytical skills, troubleshooting abilities, and a proactive approach to learning new technologies are key attributes for success in this role. Furthermore, experience in designing and developing enterprise-level software solutions, knowledge of different file formats like JSON, Iceberg, and Avro, and familiarity with streaming services such as Kafka, MSK, Kinesis, and Glue Streaming will be advantageous. Effective communication and collaboration skills are essential to interact with cross-functional teams and document best practices.

Your role will involve developing and delivering high-quality software solutions, collaborating with various stakeholders to define requirements, promoting a culture of code quality, and staying updated on industry trends. Adherence to secure coding practices, implementation of effective unit testing, and continuous improvement are integral parts of your responsibilities. As a Data Platform Developer, you will be expected to lead and supervise a team, guide professional development, and ensure the delivery of work to a consistently high standard. Your impact will extend to related teams within the organization, and you will be responsible for managing risks, strengthening controls, and contributing to the achievement of organizational objectives.

Ultimately, you will be part of a team that upholds Barclays' values of Respect, Integrity, Service, Excellence, and Stewardship, while embodying the Barclays Mindset of Empower, Challenge, and Drive in your daily interactions and work ethic.
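Given the emphasis on streaming sources such as Kafka alongside Spark, here is a hedged PySpark Structured Streaming sketch that reads a Kafka topic and lands it as Parquet; the broker, topic, and paths are hypothetical, and the spark-sql-kafka connector package is assumed to be on the classpath.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("kafka-stream-example").getOrCreate()

    # Hypothetical broker and topic names.
    events = (
        spark.readStream.format("kafka")
             .option("kafka.bootstrap.servers", "broker:9092")
             .option("subscribe", "trades")
             .load()
    )

    # Kafka values arrive as bytes; cast to string for downstream parsing.
    parsed = events.select(F.col("value").cast("string").alias("payload"))

    query = (
        parsed.writeStream.format("parquet")
              .option("path", "s3://example-lake/streams/trades/")
              .option("checkpointLocation", "s3://example-lake/checkpoints/trades/")
              .start()
    )
    query.awaitTermination()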

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Choosing Capgemini means selecting a company where you will have the opportunity to shape your career according to your preferences. You will be part of a collaborative global community of colleagues who will support and inspire you, enabling you to rethink what is achievable. Join us in assisting the world's leading organizations in realizing the potential of technology and constructing a more sustainable and inclusive world.

Your role will involve developing, designing, and implementing Enterprise Data Management Consolidation (EDMCS), Enterprise Profitability & Cost Management Cloud Services (EPCM), and Oracle Integration Cloud (OIC). You will be responsible for the full life cycle implementation of Oracle EPM Cloud, including creating forms, OIC integrations, and complex business rules. Understanding the dependencies and interrelationships between various components of Oracle EPM Cloud will be crucial. You will stay updated on the Oracle EPM roadmap and key functionality to identify opportunities for enhancing the current process within the entire Financials ecosystem. Collaboration with FP&A will be essential to facilitate the Planning, Forecasting, and Reporting process for the organization. Additionally, you will be tasked with creating and maintaining system documentation, both functional and technical.

Your profile should include experience in implementing EDMCS modules and a proven ability to collaborate with internal clients in an agile manner, utilizing design thinking approaches. Familiarity with Python and AWS Cloud (Lambda, Step Functions, EventBridge, etc.) is preferred.

At Capgemini, you can customize your career path, as we offer a variety of career opportunities and internal growth within the Capgemini group. You will receive personalized career guidance from our leaders and comprehensive wellness benefits, including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new parent support through flexible work arrangements. Additionally, you will have access to learning opportunities on one of the industry's largest digital learning platforms, with over 250,000 courses and numerous certifications to choose from.

Capgemini is a global partner for business and technology transformation, aiding organizations in accelerating their transition to a digital and sustainable world while delivering tangible impact for enterprises and society. With a diverse team of 340,000 members across more than 50 countries, Capgemini has built a strong reputation over its 55-year history. Clients trust Capgemini to unlock technology's value to address their full range of business needs, offering end-to-end services and solutions that leverage strengths from strategy and design to engineering. Capgemini is recognized for its market-leading capabilities in AI, generative AI, cloud and data, combined with deep industry expertise and a robust partner ecosystem.

Posted 1 week ago

Apply

2.0 - 6.0 years

8 - 18 Lacs

Gurugram

Remote

Role Characteristics: The Analytics team provides analytical support to multiple stakeholders (Product, Engineering, Business Development, Ad Operations) by developing scalable analytical solutions, identifying problems, defining KPIs and monitoring them to measure the impact/success of product improvements and changes, and streamlining processes. This is an exciting and challenging role that will enable you to work with large data sets, expose you to cutting-edge analytical techniques, and let you work with the latest AWS analytics infrastructure (Redshift, S3, Athena) while gaining experience in the use of location data to drive businesses. Working in a dynamic start-up environment will give you significant opportunities for growth within the organization. A successful applicant will be passionate about technology and about developing a deep understanding of human behavior in the real world. They will also have excellent communication skills, be able to synthesize and present complex information, and be a fast learner.

You Will:
- Perform root cause analysis with minimal guidance to figure out reasons for sudden changes or abnormalities in metrics.
- Understand the objective/business context of various tasks and seek clarity by collaborating with different stakeholders (like Product and Engineering).
- Derive insights and put them together to build a story that solves a given problem.
- Suggest process improvements in terms of script optimization and automating repetitive tasks.
- Create and automate reports and dashboards through Python to track metrics based on given requirements.

Technical Skills (Must have):
- B.Tech degree in Computer Science, Statistics, Mathematics, Economics or related fields.
- 4-6 years of experience working with data and conducting statistical and/or numerical analysis.
- Ability to write SQL code.
- Scripting/automation using Python.
- Hands-on experience in a data visualisation tool like Looker, Tableau, or QuickSight.
- Basic to advanced understanding of statistics.

Other Skills (Must have):
- Willing and able to quickly learn about new businesses, database technologies and analysis techniques.
- Strong oral and written communication.
- Understanding of patterns/trends and the ability to draw insights from them.

Preferred Qualifications (Nice to have):
- Experience working with large datasets.
- Experience with AWS analytics infrastructure (Redshift, S3, Athena, Boto3).
- Hands-on experience with AWS services like Lambda, Step Functions, Glue, and EMR, plus exposure to PySpark.

What we offer: At GroundTruth, we want our employees to be comfortable with their benefits so they can focus on doing the work they love. Benefits include parental leave (maternity and paternity); flexible time off (earned leaves, sick leaves, birthday leave, bereavement leave and company holidays); in-office daily catered lunch; fully stocked snacks/beverages; health cover for any hospitalization, covering both nuclear family and parents; tele-med for free doctor consultation, with discounts on health checkups and medicines; wellness/gym reimbursement; pet expense reimbursement; childcare expenses and reimbursements; employee assistance program; employee referral program; education reimbursement program; skill development program; cell phone reimbursement (mobile subsidy program); internet reimbursement; birthday treat reimbursement; Employee Provident Fund Scheme offering different tax saving options such as VPF and employee and employer contribution up to 12% Basic; creche reimbursement; co-working space reimbursement; NPS employer match; meal card for tax benefit; and special benefits on salary account.

We are an equal opportunity employer and value diversity, inclusion and equity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
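As a hedged illustration of the Python-plus-Athena reporting automation described above, the snippet below uses the AWS SDK for pandas (awswrangler, assumed to be installed) to pull a small aggregate into a DataFrame; the database, table, and column names are hypothetical.

    import awswrangler as wr  # AWS SDK for pandas; assumed available in the environment

    # Hypothetical Athena database and table for daily visit counts.
    df = wr.athena.read_sql_query(
        sql="SELECT event_date, count(*) AS visits FROM location_events GROUP BY event_date",
        database="analytics_db",
    )

    # Quick trend check before the result is pushed into a dashboard or report.
    print(df.sort_values("event_date").tail(7))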

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, supported and inspired by a collaborative community of colleagues worldwide, and able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, inclusive world.

Your Role:
- Implementing and supporting Enterprise Planning & Budgeting Cloud Services (EPBCS) modules: Financials, Workforce, Capital, and Projects.
- Experience in Enterprise Data Management Consolidation (EDMCS) and Enterprise Profitability & Cost Management Cloud Services (EPCM).
- Proficiency in Oracle Integration Cloud (OIC) and Oracle EPM Cloud implementation.
- Creating forms, OIC integrations, and complex business rules.
- Understanding dependencies and interrelationships between various components of Oracle EPM Cloud.
- Keeping abreast of the Oracle EPM roadmap and key functionality to identify enhancement opportunities within the Financials ecosystem.
- Collaborating with internal clients in an agile manner, leveraging design thinking approaches.
- Collaborating with FP&A to facilitate Planning, Forecasting, and Reporting processes.
- Creating and maintaining system documentation, both functional and technical.

Your Profile:
- Proven ability to collaborate with internal clients in an agile manner.
- Experience with Python and AWS Cloud (Lambda, Step Functions, EventBridge, etc.) preferred.

What You Love About Capgemini:
- Shape your career with a range of career paths and internal opportunities.
- Receive personalized career guidance from leaders.
- Comprehensive wellness benefits including health checks, telemedicine, insurance, elder care, partner coverage, and new parent support via flexible work.
- Opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. With over 55 years of heritage, Capgemini is trusted by clients worldwide to unlock the value of technology across their business needs. The company's 340,000 team members in more than 50 countries offer end-to-end services and solutions, leveraging strengths from strategy and design to engineering, fueled by market-leading capabilities in AI, generative AI, cloud, and data, combined with deep industry expertise and a strong partner ecosystem.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

The ideal candidate for this position should have a Bachelor's or Master's degree in Computer Science, Computer Engineering, or an equivalent field, and 2-6 years of experience in server-side development using technologies such as Golang, Node.js, or Python. You should demonstrate proficiency with AWS services like Lambda, DynamoDB, Step Functions, and S3, and have hands-on experience deploying and managing serverless environments. Experience with Docker, containerization, and Kubernetes is also required for this role. A strong background in database technologies, including MongoDB and DynamoDB, is preferred, along with experience with CI/CD pipelines and automation processes. Any experience in video transcoding/streaming on the cloud would be considered a plus. Problem-solving skills are essential, as you may encounter a variety of challenges while working on projects.
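As a hedged sketch of the serverless data access described above, here is a small Python example that writes and reads a job record in DynamoDB via boto3; the table name and key schema are hypothetical assumptions.

    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("video-jobs")  # hypothetical table with partition key "job_id"

    def save_job(job_id: str, status: str) -> None:
        """Create or overwrite a transcoding job record."""
        table.put_item(Item={"job_id": job_id, "status": status})

    def get_job(job_id: str):
        """Fetch a job record, or None if it does not exist."""
        resp = table.get_item(Key={"job_id": job_id})
        return resp.get("Item")

    if __name__ == "__main__":
        save_job("abc-123", "TRANSCODING")
        print(get_job("abc-123"))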

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

As a Senior Data Scientist with 5+ years of experience, you will play a crucial role in our team based in Indore/Pune. Your responsibilities will involve designing and implementing models, extracting insights from data, and interpreting complex data structures to facilitate business decision-making. You should have a strong background in machine learning areas such as natural language processing, machine vision, and time series, along with expertise in model tuning, model validation, and supervised and unsupervised learning. Hands-on experience with model development, data preparation, and deployment of models for training and inference is essential. Proficiency in descriptive and inferential statistics, hypothesis testing, and data analysis and exploration are key skills for this role, and you should be adept at developing code that enables reproducible data analysis. Familiarity with AWS services like SageMaker, Lambda, Glue, Step Functions, and EC2 is expected, as is knowledge of data science development and deployment IDEs such as Databricks, the Anaconda distribution, and similar tools. You should also possess expertise in ML algorithms related to time series, natural language processing, optimization, object detection, topic modeling, clustering, and regression analysis. Your skills should include proficiency in Hive/Impala, Spark, Python, pandas, Keras, scikit-learn, statsmodels, TensorFlow, and PyTorch, along with at least 1 year of experience in end-to-end model deployment to production. Familiarity with model deployment on the Azure ML platform, Anaconda Enterprise, or AWS SageMaker is preferred. Basic knowledge of deep learning algorithms like MaskedCNN and YOLO, and of visualization and analytics/reporting tools such as Power BI, Tableau, and Alteryx, would be advantageous for this role.

Posted 1 week ago

Apply

6.0 - 11.0 years

20 - 30 Lacs

Bhopal, Hyderabad, Pune

Hybrid

Hello, greetings from NewVision Software!

We are hiring on an immediate basis for the role of Senior / Lead Python Developer + AWS | NewVision Software | Pune, Hyderabad & Bhopal locations | Full-time. Professionals who can join immediately or within 15 days are preferred.

Office locations:
- NewVision Software, Pune HQ office: 701 & 702, Pentagon Tower, P1, Magarpatta City, Hadapsar, Pune, Maharashtra - 411028, India
- NewVision Software: The Hive Corporate Capital, Financial District, Nanakaramguda, Telangana - 500032
- NewVision Software: IT Plaza, E-8, Bawadiya Kalan Main Rd, near Aura Mall, Gulmohar, Fortune Pride, Shahpura, Bhopal, Madhya Pradesh - 462039

Senior Python and AWS Developer

Role Overview: We are looking for a skilled senior Python developer with a strong background in AWS cloud services to join our team. The ideal candidate will be responsible for designing, developing, and maintaining robust backend systems, ensuring high performance and responsiveness to requests from the front end.

Responsibilities:
- Develop, test, and maintain scalable web applications using Python and Django.
- Design and manage relational databases with PostgreSQL, including schema design and optimization.
- Build RESTful APIs and integrate with third-party services as needed.
- Work with AWS services including EC2, EKS, ECR, S3, Glue, Step Functions, EventBridge Rules, Lambda, SQS, SNS, and RDS.
- Collaborate with front-end developers to deliver seamless end-to-end solutions.
- Write clean, efficient, and well-documented code following best practices.
- Implement security and data protection measures in applications.
- Optimize application performance and troubleshoot issues as they arise.
- Participate in code reviews, testing, and continuous integration processes.
- Stay current with the latest trends and advancements in Python, Django, and database technologies.
- Mentor junior Python developers.

Requirements:
- 6+ years of professional experience in Python development.
- Strong proficiency with the Django web framework.
- Experience working with PostgreSQL, including complex queries and performance tuning.
- Familiarity with RESTful API design and integration.
- Strong understanding of OOP, SOLID principles, and design patterns.
- Strong knowledge of Python multithreading and multiprocessing.
- Experience with AWS services: S3, Glue, Step Functions, EventBridge Rules, Lambda, SQS, SNS, IAM, Secrets Manager, KMS and RDS.
- Understanding of version control systems (Git).
- Knowledge of security best practices and application deployment.
- Basic understanding of microservices architecture.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.

Nice to have:
- Experience with Docker, Kubernetes, or other containerization tools.
- Front-end technologies (React).
- Experience with CI/CD pipelines and DevOps practices.
- Experience with infrastructure-as-code tools like Terraform.

Education: Bachelor's degree in computer science, engineering or a related field (or equivalent experience).

To apply, share your resume with imran.basha@newvision-software.com along with the following details: Total Experience; Relevant Experience (Python: Yrs, AWS: Yrs, PostgreSQL: Yrs, REST API: Yrs, Django: Yrs); Current CTC; Expected CTC; Notice / Serving (LWD); Any offer in hand (LPA); Current Location; Preferred Location; Education.

Posted 1 week ago

Apply

5.0 - 12.0 years

0 - 0 Lacs

Hyderabad, Telangana

On-site

As a Senior Software Engineer with 5-8 years of experience, you will be responsible for developing efficient and scalable software solutions. Your primary focus will be on utilizing Core Java (8 or above), Spring Boot, RESTful APIs, and a microservices architecture to deliver high-quality applications. Additionally, you will be expected to work with Maven and possess strong AWS skills, particularly in services like S3, Step Functions, Storage Gateway, ECS, EC2, DynamoDB, Aurora DB, Lambda functions, and Glue. In this role, it is essential to have expertise in code management using Git, setting up CI/CD pipelines with tools like Jenkins or GitHub, and working with Docker/Kubernetes for containerization. Your knowledge of SQL/NoSQL databases such as PostgreSQL and MongoDB will be beneficial. Experience with testing frameworks for unit testing (JUnit/Mockito), integration testing, mutation testing, and TDD is crucial, and proficiency in Kafka, GraphQL/Supergraph, and Splunk or Honeycomb dashboards will be advantageous. You will also be involved in interacting with APIs, ensuring security in AWS, and managing containers; AWS certifications are a plus. The ideal candidate should have strong communication skills, be a team player, and possess a proactive attitude towards problem-solving. This position is based in Hyderabad and offers a competitive salary based on your experience level. If you are passionate about software engineering and have a solid background in the mentioned technologies, we encourage you to apply and be part of our dynamic team.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Platform Engineer Lead at Barclays, your role is crucial in building and maintaining systems that collect, store, process, and analyze data, including data pipelines, data warehouses, and data lakes. Your responsibility includes ensuring the accuracy, accessibility, and security of all data.

To excel in this role, you should have hands-on coding experience in Java or Python and a strong understanding of AWS development, encompassing services such as Lambda, Glue, Step Functions, IAM roles, and more. Proficiency in building efficient data pipelines using Apache Spark and AWS services is essential. You are expected to possess strong technical acumen, troubleshoot complex systems, and apply sound engineering principles to problem-solving. Continuous learning and staying updated with new technologies are key attributes for success in this role. Design experience across diverse projects where you have led the technical development is advantageous, especially in the Big Data/Data Warehouse domain within financial services. Additional skills in developing enterprise-level software solutions, knowledge of different file formats like JSON, Iceberg, and Avro, and familiarity with streaming services such as Kafka, MSK, and Kinesis are highly valued. Effective communication, collaboration with cross-functional teams, documentation skills, and experience in mentoring team members are also important aspects of this role.

Your accountabilities will include the construction and maintenance of data architecture pipelines, designing and implementing data warehouses and data lakes, developing processing and analysis algorithms, and collaborating with data scientists to deploy machine learning models. You will also be expected to contribute to strategy, drive requirements for change, manage resources and policies, deliver continuous improvements, and demonstrate leadership behaviors if in a leadership role.

Ultimately, as a Data Platform Engineer Lead at Barclays in Pune, you will play a pivotal role in ensuring data accuracy, accessibility, and security while leveraging your technical expertise and collaborative skills to drive innovation and excellence in data management.

Posted 1 week ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & responsibilities:
- Strong experience in core Python for application development.
- AWS application development experience (backend services development).
- AWS services: API Gateway, Lambda functions, Step Functions, EKS/ECS/EC2, S3, SQS/SNS/EventBridge, RDS, CloudWatch, ELB.
- System design: LLD (low-level design documents) using OOP, SOLID, and other design principles.
- GenAI (nice to have): knowledge of LLMs, RAG, fine-tuning, LangChain, and prompt engineering.
- Experience working with stakeholders (business and technology) on requirement refinement to create acceptance criteria, HLD and LLD.
- Hands-on in designing (competent with design tools like draw.io, Lucidchart, or Visio) and developing end-to-end tasks.
- Hands-on with at least one RDBMS (PostgreSQL, MySQL, or Oracle DB) and knowledge of ORMs (SQLAlchemy). Good to have: MongoDB / DynamoDB.
- Excellent communication skills (written and oral).
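As a hedged illustration of the backend-services pattern named above (Lambda behind API Gateway), here is a minimal Python handler for an API Gateway proxy integration; the route, parameters, and response shape are hypothetical.

    import json

    def handler(event, context):
        """Minimal API Gateway (Lambda proxy integration) handler returning a JSON body."""
        # Query string parameters may be absent, so fall back to an empty dict.
        name = (event.get("queryStringParameters") or {}).get("name", "world")
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"message": f"hello, {name}"}),
        }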

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As an AWS Data Engineer at Sufalam Technologies, located in Ahmedabad, India, you will be responsible for designing and implementing data engineering solutions on AWS. Your role will involve developing data models, managing ETL processes, and ensuring the efficient operation of data warehousing solutions. Collaboration with the Finance, Data Science, and Product teams is crucial to understand reconciliation needs and ensure timely data delivery. Your expertise will contribute to data analytics activities supporting business decision-making and strategic goals.

Key responsibilities include designing and implementing scalable and secure ETL/ELT pipelines for processing financial data; collaborating closely with various teams to understand reconciliation needs and ensure timely data delivery; implementing monitoring and alerting for pipeline health and data quality; maintaining detailed documentation on data flows, models, and reconciliation logic; and ensuring compliance with financial data handling and audit standards.

To excel in this role, you should have 5-6 years of experience in data engineering with a strong focus on AWS data services. Hands-on experience with AWS Glue, Lambda, S3, Redshift, Athena, Step Functions, Lake Formation, and IAM is essential for secure data governance. A solid understanding of data reconciliation processes in the finance domain, strong SQL skills, experience with data warehousing and data lakes, and proficiency in Python or PySpark for data transformation are required. Knowledge of financial accounting principles or experience working with financial datasets (AR, AP, General Ledger, etc.) would be beneficial.
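As a hedged sketch of the financial reconciliation work described above, here is a small PySpark example that full-outer-joins a ledger extract to a bank extract and flags mismatches; the schemas and values are toy assumptions for illustration.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("recon-example").getOrCreate()

    # Hypothetical ledger and bank-statement extracts keyed by invoice id.
    ledger = spark.createDataFrame(
        [("INV-1", 100.0), ("INV-2", 250.0)], ["invoice_id", "ledger_amount"]
    )
    bank = spark.createDataFrame(
        [("INV-1", 100.0), ("INV-2", 240.0)], ["invoice_id", "bank_amount"]
    )

    recon = (
        ledger.join(bank, "invoice_id", "full_outer")
              .withColumn(
                  "difference",
                  F.coalesce(F.col("ledger_amount"), F.lit(0.0))
                  - F.coalesce(F.col("bank_amount"), F.lit(0.0)),
              )
              .withColumn("matched", F.col("difference") == 0)
    )
    recon.show()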

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Gurugram

Work from Office

Company Overview: Incedo is a US-based consulting, data science and technology services firm with over 3,000 people helping clients from our six offices across the US, Mexico and India. We help our clients achieve competitive advantage through end-to-end digital transformation. Our uniqueness lies in bringing together strong engineering, data science, and design capabilities coupled with deep domain understanding. We combine services and products to maximize business impact for our clients in the telecom, banking, wealth management, product engineering and life science & healthcare industries. Working at Incedo will provide you an opportunity to work with industry-leading client organizations, deep technology and domain experts, and global teams. Incedo University, our learning platform, provides ample learning opportunities, starting with a structured onboarding program and carrying on throughout various stages of your career. A variety of fun activities is also an integral part of our friendly work environment. Our flexible career paths allow you to grow into a program manager, a technical architect or a domain expert based on your skills and interests. Our mission is to enable our clients to maximize business impact from technology by harnessing the transformational impact of emerging technologies and bridging the gap between business and technology.

Role Description:
- Write and maintain build/deploy scripts.
- Work with the Sr. Systems Administrator to deploy and implement new cloud infrastructure and designs.
- Manage existing AWS deployments and infrastructure.
- Build scalable, secure, and cost-optimized AWS architecture, and ensure best practices are followed and implemented.
- Assist in deployment and operation of security tools and monitoring.
- Automate tasks where appropriate to enhance response times to issues and tickets.
- Collaborate with cross-functional teams: work closely with development, operations, and security teams to ensure a cohesive approach to infrastructure and application security; participate in regular security reviews and planning sessions.
- Incident response and recovery: participate in incident response planning and execution, including post-mortem analysis and preventive measures implementation.
- Continuous improvement: regularly review and update security practices and procedures to adapt to the evolving threat landscape.
- Analyze and remediate vulnerabilities, and advise developers of vulnerabilities requiring updates to code.
- Create and maintain documentation and diagrams for application/security and network configurations.
- Ensure systems are monitored using monitoring tools such as Datadog, and that issues are logged and reported to the required parties.

Technical Skills:
- Experience with system administration, provisioning and managing cloud infrastructure, and security monitoring.
- In-depth experience with infrastructure/security monitoring and operation of a product or service.
- Experience with containerization and orchestration such as Docker and Kubernetes/EKS.
- Hands-on experience creating system architectures and leading architecture discussions at a team or multi-team level.
- Understanding of how to model system infrastructure in the cloud with Amazon Web Services (AWS), AWS CloudFormation, or Terraform.
- Strong knowledge of cloud infrastructure (AWS preferred) services like Lambda, Cognito, SQS, KMS, S3, Step Functions, Glue/Spark, CloudWatch, Secrets Manager, Simple Email Service, and CloudFront.
- Familiarity with coding, scripting and testing tools (preferred).
- Strong interpersonal, coordination and multi-tasking skills.
- Ability to function both independently and collaboratively as part of a team to achieve desired results.
- Aptitude to pick up new concepts and technology rapidly; ability to explain them to both business and tech stakeholders.
- Ability to adapt and succeed in a fast-paced, dynamic startup environment.
- Experience with Nessus and other related infosec tooling.

Nice-to-have skills:
- Strong interpersonal, coordination and multi-tasking skills.
- Ability to work independently and follow through to achieve desired results.
- Quick learner, with the ability to work calmly under pressure and with tight deadlines.
- Ability to adapt and succeed in a fast-paced, dynamic startup environment.

Qualifications: BA/BS degree in Computer Science, Computer Engineering, or a related field; MS degree in Computer Science or Computer Engineering preferred.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As the Lead Data Engineer at Mastercard, you will be responsible for designing and building scalable, cloud-native data platforms using PySpark, Python, and modern data engineering practices. Your role will involve mentoring and guiding other engineers, fostering a culture of curiosity and continuous improvement, and creating robust ETL/ELT pipelines to serve business-critical use cases. You will lead by example by writing high-quality, testable code, participating in architecture and design discussions, and decomposing complex problems into scalable components aligned with platform and product goals. Championing best practices in data engineering, you will drive collaboration across teams, support data governance and quality efforts, and continuously learn and apply new technologies to improve team productivity and platform reliability.

To succeed in this role, you should have at least 5 years of hands-on experience in data engineering with strong PySpark and Python skills. You should also possess solid experience in designing and implementing data models, pipelines, and batch/stream processing systems. Additionally, you should be comfortable working with cloud platforms such as AWS, Azure, or GCP and have a strong foundation in data modeling, database design, and performance optimization. A bachelor's degree in computer science, engineering, or a related field is required, along with experience in Agile/Scrum development environments. Experience with CI/CD practices, version control, and automated testing is essential, as is the ability to mentor and uplift junior engineers. Familiarity with cloud-related services like S3, Glue, Data Factory, and Databricks is highly desirable. Furthermore, exposure to data governance tools and practices, orchestration tools, containerization, and infrastructure automation will be advantageous. A master's degree, relevant certifications, or contributions to open-source/data engineering communities will be considered a bonus. Exposure to machine learning data pipelines or MLOps is also a plus.

If you are a curious, adaptable, and driven individual who enjoys problem-solving and continuous improvement, and if you have a passion for building clean data pipelines and cloud-native designs, then this role is perfect for you. Join us at Mastercard and be part of a team that is dedicated to unlocking the potential of data assets and shaping the future of data engineering.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Who We Are: We are a digitally native company that helps organizations reinvent themselves and unleash their potential. We are the place where innovation, design and engineering meet scale. Globant is a 20-year-old, NYSE-listed public organization with more than 33,000 employees worldwide, working out of 35 countries globally. www.globant.com

Job location: Pune/Hyderabad/Bangalore
Work mode: Hybrid
Experience: 5 to 10 years

Must-have skills:
1) AWS (EC2, EMR & EKS)
2) Redshift
3) Lambda functions
4) Glue
5) Python
6) PySpark
7) SQL
8) CloudWatch
9) NoSQL database (DynamoDB, MongoDB, or any other)

We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have a strong background in designing, developing, and managing data pipelines, working with cloud technologies, and optimizing data workflows. You will play a key role in supporting our data-driven initiatives and ensuring the seamless integration and analysis of large datasets.

- Design scalable data models: develop and maintain conceptual, logical, and physical data models for structured and semi-structured data in AWS environments.
- Optimize data pipelines: work closely with data engineers to align data models with AWS-native data pipeline design and ETL best practices.
- AWS cloud data services: design and implement data solutions leveraging AWS Redshift, Athena, Glue, S3, Lake Formation, and AWS-native ETL workflows.
- Design, develop, and maintain scalable data pipelines and ETL processes using AWS services (Glue, Lambda, Redshift).
- Write efficient, reusable, and maintainable Python and PySpark scripts for data processing and transformation.
- Optimize SQL queries for performance and scalability; expertise in writing complex SQL queries and optimizing them is expected.
- Monitor, troubleshoot, and improve data pipelines for reliability and performance.
- Focusing on ETL automation using Python and PySpark, you will be responsible for designing, building, and maintaining efficient data pipelines, ensuring data quality and integrity for various applications.

Posted 2 weeks ago

Apply