
1578 Data Processing Jobs - Page 45

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 5 years
Educational qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements in Bhubaneswar. Your typical day will involve creating innovative solutions to address specific business needs and collaborating with cross-functional teams to ensure successful project delivery.

Roles & Responsibilities:
- Act as a subject matter expert (SME)
- Collaborate with and manage the team
- Be responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the development and implementation of new software applications
- Conduct code reviews and ensure adherence to coding standards
- Troubleshoot and resolve complex technical issues

Professional & Technical Skills:
- Must-have: proficiency in the Databricks Unified Data Analytics Platform
- Strong understanding of data analytics and data processing
- Experience in developing and deploying applications on the Databricks platform
- Knowledge of cloud computing and data storage solutions
- Hands-on experience with data modeling and database design

Additional Information:
- The candidate should have a minimum of 5 years of experience with the Databricks Unified Data Analytics Platform
- This position is based at our Bhubaneswar office
- 15 years of full-time education is required
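
The role above centers on building data-processing applications on Databricks. As a rough, purely illustrative sketch of the record-level cleaning logic such pipelines typically contain (all field names and rules here are hypothetical, not taken from the posting; in practice this would run inside a PySpark job):

```python
# Illustrative only: record-level cleaning of the kind an Application
# Developer might implement in a Databricks/PySpark pipeline.
# Field names and defaults are hypothetical.
from typing import Optional


def clean_record(raw: dict) -> Optional[dict]:
    """Validate and normalize one raw record; return None to drop it."""
    if not raw.get("id"):
        return None  # drop records with no primary key
    return {
        "id": str(raw["id"]).strip(),
        "amount": round(float(raw.get("amount", 0.0)), 2),
        "country": str(raw.get("country", "")).strip().upper() or "UNKNOWN",
    }


records = [
    {"id": " a1 ", "amount": "10.50", "country": "in"},
    {"id": None, "amount": "5"},  # dropped: missing id
    {"id": "b2"},                 # defaults applied
]
cleaned = [r for r in (clean_record(x) for x in records) if r is not None]
```

In a real Databricks job the same function could be applied per-partition or wrapped as a UDF; the point is that validation and normalization rules stay testable in isolation.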

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Gurugram

Work from Office


Entity: Accenture Strategy & Consulting
Job location: Gurgaon, Bangalore, Pune & Mumbai

About S&C - Global Network: The Data & AI practice helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities - from accessing and reporting on data to predictive modelling - to outperform the competition.

WHAT'S IN IT FOR YOU?
The Accenture CFO & EV team under the Data & AI team has a comprehensive suite of capabilities in Risk, Fraud, Financial Crime, and Finance. Within the risk realm, our focus revolves around model development, model validation, and auditing of models, extending to ongoing performance evaluation, vigilant monitoring, meticulous governance, and thorough documentation of models.
- Work with top financial clients globally
- Access resources enabling you to use cutting-edge technologies, fostering innovation with the world's most recognizable companies
- Accenture will continually invest in your learning and growth and will support you in expanding your knowledge
- Join a diverse and vibrant team of talented individuals from various backgrounds and disciplines, continually pushing the boundaries of business capabilities and fostering an environment of innovation

What you would do in this role:
- Help the team design, develop, validate, monitor, and deploy advanced Financial Crime models for different client problems
- Interface with clients and account teams to understand engineering/business problems and translate them into analytics problems that deliver insights for action and operational improvements
- Consume data from multiple sources and present relevant information in a crisp, digestible manner that delivers valuable insights
- Communicate and present effectively to convey complex data insights and recommendations to clients and stakeholders

Qualifications - Who we are looking for:
- Master's degree in mathematics, statistics, economics, or a related field, or an MBA from a top-tier university, with a strong record of achievement, solid analytical ability, and an entrepreneurial, hands-on approach to work
- 3-5 years of relevant Financial Crime Analytics experience at one or more Financial Services firms, or in Professional Services / Risk Advisory, with significant exposure to one or more of: Anti-Money Laundering, Financial Crime, Transaction Monitoring, Customer KYC/CDD, Customer Screening, etc.
- Advanced skills in development, validation, and monitoring of AML analytics models, strategies, and visualizations
- Understanding of evolving methodologies, tools, and technologies in the Financial Crime management space
- Expertise in one or more domains/industries, including regulations, frameworks, etc.
- Experience in building models using AI/ML methodologies
- Modeling: experience with one or more analytical tools such as SAS, R, Python, SQL, etc.
- Knowledge of data processes, ETL, and tools/vendor products such as Fenergo, NICE Actimize, SAS AML, Quantexa, Ripjar, etc.

Key Responsibilities:
- Engagement Execution: work independently or with minimal supervision on client engagements that may involve model development, validation, governance, strategy, transformation, implementation, and end-to-end delivery of financial crime analytics/management solutions for Accenture's clients
- Advise clients on a wide range of Financial Crime Management/Analytics initiatives; projects may involve Financial Crime Management advisory work for CXOs to achieve a variety of business and operational outcomes
- Develop and frame Proofs of Concept for key clients, where applicable
- Practice Enablement: guide junior team members; support development of the practice by driving innovations and initiatives; develop thought capital and disseminate information on current and emerging trends in Financial Crime Analytics and Management
- Travel: willingness to travel up to 40% of the time

Accenture is an equal opportunities employer and welcomes applications from all sections of society; it does not discriminate on grounds of race, religion or belief, ethnic or national origin, disability, age, citizenship, marital, domestic or civil partnership status, sexual orientation, gender identity, or any other basis protected by applicable law.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Gurugram

Work from Office


About The Role:
Entity: Accenture Strategy & Consulting
Title: Level 9 - Ind & Func AI Decision Science Consultant
Job location: Bengaluru, Gurugram, Mumbai

About S&C - Global Network: The Accenture Global Network Data & AI practice helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities - from accessing and reporting on data to predictive modelling - to outperform the competition.

WHAT'S IN IT FOR YOU?
The Accenture CFO & EV team under the Data & AI team has a comprehensive suite of capabilities in Risk, Fraud, Financial Crime, and Finance. Within the risk realm, our focus revolves around model development, model validation, and auditing of models, extending to ongoing performance evaluation, vigilant monitoring, meticulous governance, and thorough documentation of models.
- Work with top financial clients globally
- Access resources enabling you to use cutting-edge technologies, fostering innovation with the world's most recognizable companies
- Accenture will continually invest in your learning and growth and will support you in expanding your knowledge
- Join a diverse and vibrant team of talented individuals from various backgrounds and disciplines, continually pushing the boundaries of business capabilities and fostering an environment of innovation

What you would do in this role:
Engagement Execution:
- Work independently or with minimal supervision on client engagements that may involve model development, validation, governance, strategy, transformation, implementation, and end-to-end delivery of fraud analytics/management solutions for Accenture's clients
- Advise clients on a wide range of Fraud Management/Analytics initiatives; projects may involve Fraud Management advisory work for CXOs to achieve a variety of business and operational outcomes
- Develop and frame Proofs of Concept for key clients, where applicable
Practice Enablement:
- Guide junior team members
- Support development of the practice by driving innovations and initiatives
- Develop thought capital and disseminate information on current and emerging trends in Fraud Analytics and Management
- Support the sales team's efforts to identify and win potential opportunities by assisting with RFPs and RFIs; assist in designing POVs and GTM collateral
Travel: willingness to travel up to 40% of the time
Professional Development Skills: project dependent

Qualifications - Who we are looking for:
- 3-5 years of relevant Fraud Analytics experience at one or more Financial Services firms, or in Professional Services / Risk Advisory, with significant exposure to one or more of: Banking Fraud, Payment Fraud, Credit Card Fraud, Retail Fraud, Anti-Money Laundering, Financial Crime, Telecom Fraud, Energy Fraud, Insurance Claims Fraud, etc.
- Advanced skills in development and validation of fraud analytics models, strategies, and visualizations
- Understanding of new and evolving methodologies, tools, and technologies in the Fraud management space
- Expertise in one or more domains/industries, including regulations, frameworks, etc.
- Experience in building models using AI/ML methodologies
- Modeling: experience with one or more analytical tools such as SAS, R, Python, SQL, etc.
- Knowledge of data processes, ETL, and tools/vendor products such as VISA AA, FICO Falcon, EWS, RSA, IBM Trusteer, SAS AML, Quantexa, Ripjar, Actimize, etc.
- Strong conceptual knowledge and practical experience in the development, validation, and deployment of ML/AI models
- Hands-on programming experience with analytics and visualization tools (Python, R, PySpark, SAS, SQL, Power BI/Tableau)
- Knowledge of big data, MLOps, and cloud platforms (Azure/GCP/AWS)
- Strong written and oral communication skills
- Project management skills and the ability to manage multiple tasks concurrently
- Strong delivery experience on short- and long-term analytics projects

Posted 3 weeks ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Coimbatore

Work from Office


Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Able to apply GenAI models as part of the solution; the work may also include deep learning, neural networks, chatbots, and image processing.
Must-have skills: Google Cloud Machine Learning Services
Good-to-have skills: GCP Dataflow, Google Pub/Sub, Google Dataproc
Minimum experience: 2 years
Educational qualification: 15 years of full-time education

Summary: We are seeking a skilled GCP Data Engineer to join our dynamic team. The ideal candidate will design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform (GCP). This role requires expertise in cloud-based data engineering and hands-on experience with GCP tools and services, ensuring efficient data integration, transformation, and storage for various business use cases.

Roles & Responsibilities:
- Design, develop, and deploy data pipelines using GCP services such as Dataflow, BigQuery, Pub/Sub, and Cloud Storage
- Optimize and monitor data workflows for performance, scalability, and reliability
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and implement solutions
- Implement data security and governance measures, ensuring compliance with industry standards
- Automate data workflows and processes for operational efficiency
- Troubleshoot and resolve technical issues related to data pipelines and platforms
- Document technical designs, processes, and best practices to ensure maintainability and knowledge sharing

Professional & Technical Skills:
a) Must have:
- Proficiency in GCP tools such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage
- Expertise in SQL and experience with data modeling and query optimization
- Solid programming skills in Python for data processing and ETL development
- Experience with CI/CD pipelines and version control systems (e.g., Git)
- Knowledge of data warehousing concepts, ELT/ETL processes, and real-time streaming
- Strong understanding of data security, encryption, and IAM policies on GCP
b) Good to have:
- Experience with Dialogflow or CCAI tools
- Knowledge of machine learning pipelines and integration with AI/ML services on GCP
- Certifications such as Google Professional Data Engineer or Google Cloud Architect

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google Cloud Machine Learning Services, with 3-5 years of overall experience
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions

Qualifications: 15 years of full-time education
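The pipeline work described above typically starts with a parse/transform step that turns raw input into rows ready for a sink such as BigQuery. A minimal, hedged sketch of such a step (the schema and sample lines are invented for illustration; in a real Dataflow job this function would sit inside an Apache Beam Map or DoFn):

```python
# Illustrative transform step: parse raw CSV lines into row dicts ready
# for a BigQuery load. SCHEMA and the sample lines are hypothetical.
import csv
import io

SCHEMA = ["user_id", "event", "value"]


def parse_line(line: str) -> dict:
    """Parse one CSV line into a typed, sink-ready row dict."""
    fields = next(csv.reader(io.StringIO(line)))
    rec = dict(zip(SCHEMA, fields))
    rec["value"] = float(rec["value"])  # enforce the numeric column type
    return rec


rows = [parse_line(l) for l in ["u1,click,1.5", "u2,view,0"]]
```

Keeping the parsing logic in a plain function like this makes it unit-testable outside the pipeline runner, which is a common practice when developing Dataflow jobs.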

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: No Function Specialty
Minimum experience: 5 years
Educational qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for creating efficient and scalable solutions using Google BigQuery. Your typical day will involve collaborating with the team, analyzing business requirements, designing and implementing application features, and ensuring the applications meet quality standards and performance goals.

Roles & Responsibilities:
1. Design, create, code, and support a variety of data pipelines and models on GCP cloud technology.
2. Strong hands-on exposure to GCP services such as BigQuery, Composer, etc.
3. Partner with business/data analysts, architects, and other key project stakeholders to deliver data requirements.
4. Develop data integration and ETL (Extract, Transform, Load) processes.
5. Support existing data warehouses and related pipelines.
6. Ensure data quality, security, and compliance.
7. Optimize data processing and storage efficiency; troubleshoot issues in the data space.
8. Seek to learn new skills/tools used in the data space (e.g., dbt, Monte Carlo).
9. Excellent verbal and written communication skills; excellent analytical skills with an Agile mindset.
10. Strong attention to detail and delivery accuracy.
11. Self-motivated team player with the ability to overcome challenges and achieve desired results.
12. Work effectively in a globally distributed environment.

Professional & Technical Skills:
Skill proficiency expectations:
- Expert: Data Storage, BigQuery, SQL, Composer, Data Warehousing Concepts
- Intermediate: Python
- Basic/preferred: DB, Kafka, Pub/Sub
Must-have skills:
- Proficiency in Google BigQuery
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery
- This position is based at our Hyderabad office
- 15 years of full-time education is required

Qualifications: 15 years of full-time education
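The "data munging" skills this posting asks for include normalization, i.e. rescaling a numeric column into a common range before analysis or modeling. A minimal sketch of min-max normalization (sample values are made up for illustration):

```python
# Illustrative data-munging step: min-max normalization of a numeric
# column into [0, 1]. A constant column maps to all zeros to avoid
# division by zero.


def min_max_normalize(values):
    """Scale values into [0, 1] using the column's own min and max."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]


scaled = min_max_normalize([10, 20, 30])  # -> [0.0, 0.5, 1.0]
```

The same transformation is routinely expressed in SQL inside BigQuery using window functions (MIN/MAX OVER ()), but the arithmetic is identical.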

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 5 years
Educational qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements in Bhubaneswar. Your typical day will involve creating innovative solutions to address specific business needs and collaborating with cross-functional teams to ensure successful project delivery.

Roles & Responsibilities:
- Act as a subject matter expert (SME)
- Collaborate with and manage the team
- Be responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the development and implementation of new software applications
- Conduct code reviews and ensure adherence to coding standards
- Troubleshoot and resolve complex technical issues

Professional & Technical Skills:
- Must-have: proficiency in the Databricks Unified Data Analytics Platform
- Strong understanding of data analytics and data processing
- Experience in developing and deploying applications on the Databricks platform
- Knowledge of cloud computing and data storage solutions
- Hands-on experience with data modeling and database design

Additional Information:
- The candidate should have a minimum of 5 years of experience with the Databricks Unified Data Analytics Platform
- This position is based at our Bhubaneswar office
- 15 years of full-time education is required

Posted 3 weeks ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Mumbai

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements in Mumbai. You will play a crucial role in developing innovative solutions to drive business success.

Roles & Responsibilities:
- Act as a subject matter expert (SME)
- Collaborate with and manage the team
- Be responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the application development process
- Implement best practices for application design and development
- Conduct code reviews and ensure code quality

Professional & Technical Skills:
- Must-have: proficiency in the Databricks Unified Data Analytics Platform
- Strong understanding of data analytics and data processing
- Experience in building and configuring applications
- Knowledge of cloud computing platforms
- Hands-on experience with data integration and transformation

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform
- This position is based at our Mumbai office
- 15 years of full-time education is required

Posted 3 weeks ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Karnataka

Work from Office


Focus on designing, developing, and maintaining Snowflake data environments. Responsible for data modeling, ETL pipelines, and query optimization to ensure efficient and secure data processing.

Posted 3 weeks ago

Apply

5.0 years

7 Lacs

Hyderabad

Work from Office


Design, implement, and optimize Big Data solutions using Hadoop and Scala. You will manage data processing pipelines, ensure data integrity, and perform data analysis. Expertise in Hadoop ecosystem, Scala programming, and data modeling is essential for this role.

Posted 3 weeks ago

Apply

0.0 years

1 - 4 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


Enter information into database systems accurately and efficiently. Manage and maintain accurate records and reports. Meet productivity and quality standards. Basic computer skills and knowledge of MS Office required.

Posted 3 weeks ago

Apply

0.0 years

1 - 5 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


Enter information into database systems accurately and efficiently. Manage and maintain accurate records and reports. Meet productivity and quality standards. Basic computer skills and knowledge of MS Office required.

Posted 3 weeks ago

Apply

0.0 years

1 - 4 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


Enter information into database systems accurately and efficiently. Manage and maintain accurate records and reports. Meet productivity and quality standards. Basic computer skills and knowledge of MS Office required.

Posted 3 weeks ago

Apply

2.0 - 7.0 years

20 - 25 Lacs

Mumbai, Navi Mumbai

Work from Office


The Role - Senior Survey Scripter

You will be part of an operations team which provides operational support to research teams in Europe, the USA, the Middle East, and APAC, enabling them to offer quality consultancy on a wide variety of topics. Alongside other survey scriptwriters and data processing execs, you will be part of an Operations team which strives to produce accurate results every time and supports the rest of the company in delivering innovative and robust research solutions.

What will I be delivering?
- Provide expert scripting services for researchers to aid them in their client relationships
- Take complex questionnaires from researchers and script them using YouGov's bespoke scripting software in an accurate and timely manner
- Test survey logic to ensure that it is error-free
- Use experience to liaise with internal clients, advising on best practice and assisting with problem solving
- Manage your own workload to ensure that deadlines are met and standards are achieved
- Assist the senior scripters in implementing any new solutions for improved efficiency within the workflow
- Maintain excellent record administration so as to have an accurate log of work carried out as part of the service
- Ensure all reporting and management requests are accurate and delivered on time

What do I need to bring with me?
- Fluent English
- Familiarity with any programming language or web-design coding (e.g. HTML, CSS, JavaScript)
- Degree in an IT-based subject, or evidence of a similar level of computer skills
- Teamwork
- Strong logical problem-solving skills
- Excellent attention to detail
- Good communication skills, especially explaining technical points to non-technical people
- Ability to work independently and manage your own deadlines
- High level of proficiency with MS Office, especially Excel
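
"Testing survey logic" in this role means verifying that questionnaire routing never sends a respondent to the wrong question. A toy, hypothetical sketch of how such routing can be modeled and exercised (the question IDs and rules are invented; YouGov's actual scripting software is not represented here):

```python
# Illustrative survey-routing model: Q2 is a follow-up shown only to
# respondents who answered "Yes" at Q1. All IDs and rules are invented.


def next_question(current: str, answer: str) -> str:
    """Return the next question ID for a given answer; END terminates."""
    routing = {
        ("Q1", "Yes"): "Q2",  # follow-up only for Yes
        ("Q1", "No"): "Q3",   # skip the follow-up
        ("Q2", "Yes"): "Q3",
        ("Q2", "No"): "Q3",
    }
    return routing.get((current, answer), "END")


# Walk one respondent's path through the script.
path, q = [], "Q1"
answers = {"Q1": "No", "Q3": "Yes"}
while q != "END":
    path.append(q)
    q = next_question(q, answers.get(q, ""))
```

Walking every plausible answer combination through a routing table like this is essentially what scripted-survey logic testing automates.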

Posted 3 weeks ago

Apply

7.0 - 9.0 years

20 - 25 Lacs

Bengaluru

Work from Office


Visa is seeking a Senior Data Engineer in the Data Platform department to act as one of the key technology leaders building and managing Visa's technology assets in the Platform as a Service organization. As a Staff Data Engineer, you will work on enhancements to open-source platforms. You will have the opportunity to lead, participate, guide, and mentor other engineers on the team in design and development. This position is based in Bangalore, KA, reporting to the Director of Engineering. This is a hybrid position; the expected number of days in office will be confirmed by your hiring manager. 7 or more years of work experience with a Bachelor's degree or an advanced degree (e.g. Master's, MBA, JD, MD), or up to 3 years of relevant experience with a PhD. Bachelor's degree in Computer Science or a related technical discipline. With 4+ years

Posted 3 weeks ago

Apply

3.0 - 8.0 years

20 - 25 Lacs

Bengaluru

Work from Office


Specialism: Data, Analytics & AI | Management Level: Senior Associate

Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Experience: 3-8 years in Data Engineering. We are seeking skilled and dynamic Cloud Data Engineers specializing in AWS, Azure, Databricks, and GCP. The ideal candidate will have a strong background in data engineering, with a focus on data ingestion, transformation, and warehousing. They should also possess excellent knowledge of PySpark or Spark and a proven ability to optimize the performance of Spark job executions.

Key Responsibilities:
- Design, build, and maintain scalable data pipelines for a variety of cloud platforms, including AWS, Azure, Databricks, and GCP
- Implement data ingestion and transformation processes to facilitate efficient data warehousing
- Utilize cloud services to enhance data processing capabilities:
  - AWS: Glue, Athena, Lambda, Redshift, Step Functions, DynamoDB, SNS
  - Azure: Data Factory, Synapse Analytics, Functions, Cosmos DB, Event Grid, Logic Apps, Service Bus
  - GCP: Dataflow, BigQuery, Dataproc, Cloud Functions, Bigtable, Pub/Sub, Data Fusion
- Optimize Spark job performance to ensure high efficiency and reliability
- Stay proactive in learning and implementing new technologies to improve data processing frameworks
- Collaborate with cross-functional teams to deliver robust data solutions
- Work on Spark Streaming for real-time data processing as necessary

Qualifications:
- 3-8 years of experience in data engineering with a strong focus on cloud environments
- Proficiency in PySpark or Spark is mandatory
- Proven experience with data ingestion, transformation, and data warehousing
- In-depth knowledge and hands-on experience with cloud services (AWS/Azure/GCP)
- Demonstrated ability in performance optimization of Spark jobs
- Strong problem-solving skills and the ability to work independently as well as in a team
- Cloud certification (AWS, Azure, or GCP) is a plus
- Familiarity with Spark Streaming is a bonus

Mandatory skill sets: Python, PySpark, SQL with (AWS or Azure or GCP)
Preferred skill sets: Python, PySpark, SQL with (AWS or Azure or GCP)
Years of experience required: 3-8 years
Education qualification: BE/BTech, ME/MTech, MBA, MCA
Degrees/Field of Study required: Bachelor of Technology, Master of Engineering, Bachelor of Engineering, Master of Business Administration
Required Skills: Node.js
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 28 more}
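One concrete knob behind the "optimize Spark job performance" responsibility above is sizing the shuffle-partition count to the data volume. A hedged sketch of that sizing heuristic (the ~128 MB-per-partition target is a common rule of thumb, not a figure from the posting; the result would feed a setting such as spark.sql.shuffle.partitions):

```python
# Illustrative Spark tuning heuristic: pick a shuffle-partition count so
# each partition handles roughly 128 MB of input. Values are rules of
# thumb for illustration only.

TARGET_PARTITION_BYTES = 128 * 1024 * 1024  # ~128 MiB per partition


def suggest_shuffle_partitions(input_bytes: int, min_parts: int = 8) -> int:
    """Suggest a partition count for a given input size (ceiling division)."""
    return max(min_parts, -(-input_bytes // TARGET_PARTITION_BYTES))


n = suggest_shuffle_partitions(10 * 1024**3)  # 10 GiB input -> 80 partitions
```

Too few partitions under-utilize executors and risk out-of-memory failures on skewed keys; too many add scheduling overhead, which is why the count is derived from data size rather than fixed.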

Posted 3 weeks ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office


Skill matrix (experience required):
- Databricks: 5+ years
- PySpark: 5+ years
- Spark: 5+ years
- SQL: 5+ years
- AWS: 2+ years

ECMS Req #: 527070
Relevant and total years of experience: 5+ years relevant, 6+ years total

Detailed job description - skill set:
Strong Data Engineer with 5+ years' experience in Databricks, PySpark, and SQL.
- Design and develop data processing pipelines and analytics solutions using Databricks
- Architect scalable and efficient data models and storage solutions on the Databricks platform
- Collaborate with architects and other teams to migrate the current solution to Databricks
- Optimize performance and reliability of Databricks clusters and jobs to meet SLAs and business requirements
- Use best practices for data governance, security, and compliance on the Databricks platform
- Mentor junior engineers and provide technical guidance
- Stay current with emerging technologies and trends in data engineering and analytics to drive continuous improvement
- Strong understanding of distributed computing principles and experience with big data technologies such as Apache Spark
- Experience with cloud platforms such as AWS, Azure, or GCP, and their associated data services
- Proven track record of delivering scalable and reliable data solutions in a fast-paced environment
- Excellent problem-solving skills and attention to detail
- Strong communication and collaboration skills, with the ability to work effectively in cross-functional teams
- Good to have: experience with containerization technologies such as Docker and Kubernetes
- Knowledge of DevOps practices for automated deployment and monitoring of data pipelines

Mandatory skills: Strong Data Engineer with 5+ years' experience in Databricks, PySpark, and SQL, covering the pipeline design, architecture, migration, optimization, and governance items listed above.

Vendor billing range (local currency, per day): INR 8000/day
Work location: Bangalore, Hyderabad (preferred)
Notice period: 15 days
WFO/WFH/Hybrid: WFO / Hybrid
Background check before or after onboarding: post-onboarding

Posted 3 weeks ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Mumbai

Work from Office


We are M&G Global Services Private Limited (formerly known as 10FA India Private Limited, and prior to that Prudential Global Services Private Limited). We are a fully owned subsidiary of the M&G plc group of companies, operating as a Global Capability Centre providing a range of value-adding services to the Group since 2003. At M&G our purpose is to give everyone real confidence to put their money to work. As an international savings and investments business with roots stretching back more than 170 years, we offer a range of financial products and services through Asset Management, Life and Wealth. All three operating segments work together to deliver attractive financial outcomes for our clients, and superior shareholder returns.
We are seeking a skilled and detail-oriented Data Analyst / Power BI Analyst to join our dynamic team. The successful candidate will be responsible for analysing and presenting complex data sets, creating insightful reports and performance dashboards using Power BI, and providing data-driven recommendations to support business decisions. They will also be responsible for generating the performance team's management information on a monthly basis and implementing the use of AI and automation within the team.
Key Responsibilities:
Data Analysis: Collect, clean, and preprocess large datasets from various sources. Analyse data from across the Performance landscape to identify trends, patterns, and anomalies, so it can be used to provide valuable insights that drive better client outcomes. Perform statistical analysis to support business strategy and operations.
Power BI Reporting: Develop, maintain, and improve Power BI reports and dashboards within the Performance team. Use DAX (Data Analysis Expressions) to manipulate data within Power BI. Ensure data visualisations are both informative and visually appealing. Provide simple, concise, meaningful insights.
Business Intelligence: Collaborate with stakeholders across the business to understand their performance data needs and requirements. Translate business needs into technical specifications for reporting and analysis solutions. Provide actionable insights and recommendations based on analysis.
Database Management: Understand and aid the data platform teams with performance data storage and delivery solutions. Ensure data integrity and security within databases. Create and maintain documentation of data processes and models.
Continuous Improvement: Implement AI usage and drive automation where appropriate within the performance team, driving efficiency and reducing risk. Stay updated with the latest trends and technologies in data analysis and business intelligence. Identify opportunities for process improvements and implement best practices in data presentation.
Controls: Support the implementation of controls and procedures and document the same. Support the implementation of controls within M&G's central Investment Data Platform. Ensure data requirements are fully supported within all required systems.
Qualifications: Bachelor's degree in Data Science, Computer Science, Statistics, Business Analytics, or a related field; a Master's degree is a plus. Proven experience as a Data Analyst or Business Intelligence Analyst. Proficiency in Microsoft Power BI with demonstrable experience in creating complex dashboards and reports. Strong proficiency in SQL for querying databases. Experience with data processing and analysis tools such as Excel, Python, or R. Knowledge of data warehousing concepts and ETL processes. Excellent analytical and problem-solving skills. Strong communication skills to present findings and recommendations effectively.
Skills:
Technical Skills: Advanced knowledge of Power BI, DAX, and Power Query. Proficiency in SQL and data manipulation. Experience with data analysis tools (Excel, Python, R).
Analytical Skills: Ability to analyse and interpret complex data sets. Strong statistical analysis skills. Ability to draw meaningful insights and actionable recommendations from data.
Communication Skills: Excellent verbal and written communication skills. Ability to explain technical concepts to non-technical stakeholders.
Organisational Skills: Strong attention to detail and accuracy. Ability to manage multiple tasks and projects simultaneously. Effective time-management skills.
We have a diverse workforce and an inclusive culture at M&G Global Services. Regardless of gender, ethnicity, age, sexual orientation, nationality, disability or long-term condition, we are looking to attract, promote and retain exceptional people. We also welcome those who take part in military service and those returning from career breaks.
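The "collect, clean, and preprocess" and statistical-analysis duties above are tool-agnostic. As a rough sketch only (the role centres on Power BI, SQL, and DAX; the data and threshold below are invented for illustration), a preprocessing step in plain Python might look like:

```python
import statistics

# Hypothetical monthly performance figures with gaps and one outlier,
# standing in for the "collect, clean, and preprocess" step.
raw = [102.5, None, 98.7, 101.2, None, 97.9, 250.0, 99.4]

# Drop missing values first.
values = [v for v in raw if v is not None]

# Flag values more than 2 sample standard deviations from the mean
# (the 2-sigma cutoff is an illustrative choice, not a standard).
mean = statistics.mean(values)
stdev = statistics.stdev(values)
clean = [v for v in values if abs(v - mean) <= 2 * stdev]

# Summary statistics of the kind a dashboard or monthly MI pack might show.
summary = {
    "count": len(clean),
    "mean": round(statistics.mean(clean), 2),
    "median": statistics.median(clean),
}
print(summary)
```

In practice this shaping would more likely happen in Power Query or SQL before the data reaches Power BI; the point is only the clean-then-summarise flow.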

Posted 3 weeks ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Naukri logo

Who we are
About Stripe
About the team
The Batch Compute team at Stripe manages the infrastructure, tooling, and systems behind running batch processing systems at Stripe, which are currently powered by Hadoop and Spark. Batch processing systems power several core asynchronous workflows at Stripe and operate at significant scale. We're looking for a Software Engineer with experience designing, building, and maintaining high-scale distributed systems. You will work with a team that is in charge of the core infrastructure used by product teams to build and operate batch processing jobs. You will have an opportunity to play a hands-on role in significantly rearchitecting our current infrastructure to be much more efficient and resilient. This re-architecture will introduce disaggregation of Hadoop storage and compute with open source solutions.
Responsibilities
Scope and lead technical projects within the Batch Compute domain. Build and maintain the infrastructure which powers the core of Stripe. Directly contribute to core systems and write code. Work closely with the open source community to identify opportunities for adopting new open source features as well as contribute back to the OSS. Ensure operational excellence and enable a highly available, reliable, and secure Batch Compute platform.
Who you are
We're looking for someone who meets the minimum requirements to be considered for the role. If you meet these requirements, you are encouraged to apply. The preferred qualifications are a bonus, not a requirement.
Minimum requirements
8+ years of professional experience writing high-quality, production-level code or software programs. Experience with distributed data systems such as Spark, Flink, Trino, Kafka, etc. Experience developing, maintaining, and debugging distributed systems built with open source tools. Experience building infrastructure as a product centered around user needs. Experience optimizing the end-to-end performance of distributed systems.
Experience with scaling distributed systems in a rapidly moving environment. Preferred qualifications Experience as a user of batch processing systems (Hadoop, Spark) Track record of open source contributions to data processing or big data systems (Hadoop, Spark, Celeborn, Flink, etc) Office-assigned Stripes in most of our locations are currently expected to spend at least 50% of the time in a given month in their local office or with users. This expectation may vary depending on role, team and location. For example, Stripes in Stripe Delivery Center roles in Mexico City, Mexico and Bengaluru, India work 100% from the office. Also, some teams have greater in-office attendance requirements, to appropriately support our users and workflows, which the hiring manager will discuss. This approach helps strike a balance between bringing people together for in-person collaboration and learning from each other, while supporting flexibility when possible. Team Data Platform Job type Full time

Posted 3 weeks ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office


Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : Python (Programming Language) Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education
Summary: As a Software Engineer with Python expertise, you will develop data-driven applications on AWS. You will be responsible for creating scalable data pipelines and algorithms to process and deliver actionable vehicle data insights.
Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices
Professional & Technical Skills:
1. Python programming
2. Web framework expertise (Django, Flask, or FastAPI)
3. Data processing and analysis
4. Database technologies (SQL and NoSQL)
5. API development
6. Significant experience working with AWS Lambda
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
9. Test-Driven Development (TDD)
10. DevOps practices
11. Agile methodologies
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena)
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM)
Additional Information:
1. The candidate should have a minimum of 5 years of experience in Python programming.
2. This position is based at our Hyderabad office.
3. A 15 years full-time education is required (Bachelor of Computer Science or any related stream; a Master's degree is preferred).
Qualification 15 years full time education
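The posting above highlights AWS Lambda experience. As a minimal sketch only (the event shape with a `vehicle_ids` list is an assumption for illustration, not part of the job description), a Python Lambda entry point follows AWS's standard handler signature:

```python
import json

def lambda_handler(event, context):
    """Minimal AWS Lambda entry point for a hypothetical vehicle-data endpoint.

    A real implementation would query S3/RDS here; this sketch just echoes
    a count so the handler stays self-contained.
    """
    vehicle_ids = event.get("vehicle_ids", [])
    body = {"processed": len(vehicle_ids), "ids": vehicle_ids}
    return {
        "statusCode": 200,
        "body": json.dumps(body),
    }

# Lambda invokes lambda_handler for us; calling it directly is handy for
# local testing (context may be None when it is unused).
resp = lambda_handler({"vehicle_ids": ["v1", "v2"]}, None)
print(resp["statusCode"])
```

Returning a dict with `statusCode` and a JSON-encoded `body` is the shape API Gateway proxy integrations expect, which is why even a toy handler tends to follow it.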

Posted 3 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office


Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : Python (Programming Language) Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education
Summary: As a Software Engineer with Python expertise, you will develop data-driven applications on AWS. You will be responsible for creating scalable data pipelines and algorithms to process and deliver actionable vehicle data insights.
Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices
Professional & Technical Skills:
1. Python programming
2. Web framework expertise (Django, Flask, or FastAPI)
3. Data processing and analysis
4. Database technologies (SQL and NoSQL)
5. API development
6. Significant experience working with AWS Lambda
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
9. Test-Driven Development (TDD)
10. DevOps practices
11. Agile methodologies
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena)
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM)
Additional Information:
1. The candidate should have a minimum of 3 years of experience in Python programming.
2. This position is based at our Hyderabad office.
3. A 15 years full-time education is required (Bachelor of Computer Science or any related stream; a Master's degree is preferred).
Qualification 15 years full time education

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Ab Initio Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in the development and implementation of software solutions.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Develop high-quality software design and architecture.
- Identify, prioritize, and execute tasks in the software development life cycle.
- Conduct software analysis, programming, testing, and debugging.
- Create technical documentation for reference and reporting.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Ab Initio.
- Strong understanding of ETL concepts and data integration.
- Experience in developing and implementing data processing solutions.
- Knowledge of data warehousing concepts and methodologies.
- Hands-on experience with data modeling and database design.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Qualification 15 years full time education

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Chennai

Work from Office


Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Apache Spark Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application design and functionality.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Apache Spark.
- Strong understanding of distributed computing principles.
- Experience with data processing frameworks and tools.
- Familiarity with programming languages such as Java or Scala.
- Knowledge of cloud platforms and services for application deployment.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Chennai office.
- A 15 years full-time education is required.
Qualification 15 years full time education
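The "distributed computing principles" this posting asks for boil down to map and reduce stages over partitioned data. As a pure-Python stand-in (no cluster or pyspark package involved; the sample lines are invented for illustration), Spark's canonical word-count example can be sketched as:

```python
from collections import Counter
from functools import reduce

# Two "partitions" of input text, each processed independently,
# mirroring how Spark splits work across executors.
lines = [
    "spark runs on a cluster",
    "a cluster runs many tasks",
]

# Map stage: each partition produces its own partial word counts.
partials = [Counter(line.split()) for line in lines]

# Reduce stage: partial results are merged, much as Spark's
# reduceByKey combines per-partition counts into a final total.
totals = reduce(lambda a, b: a + b, partials)
print(totals["cluster"])  # → 2
```

The same shape, local partial results merged by an associative operation, is what lets Spark scale the computation across machines.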

Posted 3 weeks ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Pune

Work from Office


Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Talend Big Data Good to have skills : NA Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Talend Big Data.
- Strong understanding of data integration processes and ETL methodologies.
- Experience with big data technologies such as Hadoop and Spark.
- Familiarity with cloud platforms and services related to data processing.
- Ability to troubleshoot and optimize application performance.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Talend Big Data.
- This position is based at our Pune office.
- A 15 years full-time education is required.
Qualification 15 years full time education

Posted 3 weeks ago

Apply

0.0 years

1 - 3 Lacs

Ahmedabad

Work from Office


Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.
Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
Mega Virtual Drive for Customer Service roles - English + Hindi Language on 30th May 2025 (Friday) || Ahmedabad Location
Date: 30-May-2025
MS Teams meeting ID: 435 048 768 884 3
MS Teams Passcode: rm3mg3pU
Time: 12:00 PM - 1:00 PM
Job Location: Ahmedabad (Work from office)
Languages Known: Hindi + English
Shifts: Flexible with any shift
Responsibilities
• Respond to customer queries and customers' concerns
• Provide support for data collection to enable recovery of the account for the end user
• Maintain a deep understanding of client processes and policies
• Reproduce customer issues and escalate product bugs
• Provide excellent customer service to our customers
• Exhibit capacity for critical thinking and analysis
• Show a proven work ethic, with the ability to work well both independently and within the context of a larger collaborative environment
Qualifications we seek in you
Minimum qualifications
• Graduate (any discipline except Law)
• Only freshers are eligible
• Fluency in English and Hindi is mandatory
Preferred qualifications
• Effective probing and analytical/understanding skills
• Analytical skills with a customer-centric approach
• Excellent proficiency in written English with a neutral English accent
• Ability to work on a flexible schedule (including weekend shifts)
**Note: Please keep your E-Aadhar card handy while appearing for the interview.
Why join Genpact?
• Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation
• Make an impact: drive change for global enterprises and solve business challenges that matter
• Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities
• Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day
• Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress
Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.
Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

11 - 16 Lacs

Hyderabad

Work from Office


Software Engineering Lead Analyst - API Developer
Position Overview
Developer with hands-on design and development experience building robust APIs and services using Java and Spring Boot, coupled with hands-on experience in data processing. Has the knowledge and experience to design and implement scalable on-prem/cloud solutions that efficiently manage and leverage large datasets. Proficient in Java/Spring Boot, with a demonstrated ability to integrate with different databases and other APIs and services while ensuring security and best practices are followed throughout the development lifecycle.
Responsibilities
Design, develop, and maintain APIs using Java and Spring Boot, and ensure efficient data exchange between applications. Implement API security measures including authentication, authorization, and rate limiting. Document API specifications and maintain API documentation for internal and external users. Develop integrations with different data sources and other APIs/web services. Develop integrations with IBM MQ and Kafka. Develop and maintain CI/CD pipelines. Perform performance evaluation and application tuning. Monitor and troubleshoot applications for stability and performance.
Qualifications
Required Skills & Experience:
5 - 8 years of experience
Programming Languages: Proficiency in Java.
Web Development: Experience with SOAP and RESTful services.
Database Management: Strong knowledge of SQL (Oracle).
Version Control: Expertise in using version control systems like Git.
CI/CD: Familiarity with CI/CD tools such as GitLab CI and Jenkins.
Containerization & Orchestration: Experience with Docker and OpenShift.
Messaging Queues: Knowledge of IBM MQ and Apache Kafka.
Cloud Services: Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
Desired Skills
Analytical Thinking: Ability to break down complex problems and devise efficient solutions.
Debugging: Skilled in identifying and fixing bugs in code and systems.
Algorithm Design: Proficiency in designing and optimizing algorithms.
Leadership: Proven leadership skills with experience mentoring junior engineers.
Communication: Strong verbal and written communication skills.
Teamwork: Ability to collaborate effectively with cross-functional teams.
Time Management: Competence in managing time and meeting project deadlines.
Education
Bachelor's degree in Computer Science, Software Engineering, or a related field. A Master's degree is a plus.
Certifications: Relevant certifications in AWS are a plus.
Location & Hours of Work
Full-time position, working 40 hours per week. Expected overlap with US hours as appropriate. Primarily based in the Innovation Hub in Hyderabad, India, in a hybrid working model (3 days WFO and 2 days WAH).
About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
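The rate limiting named in the responsibilities above is commonly implemented as a token bucket. The posting's stack is Java/Spring Boot (where a library would typically handle this); the Python sketch below only illustrates the underlying idea, and all names in it are assumptions, not from the posting:

```python
import time

class TokenBucket:
    """Simple token-bucket rate limiter: requests spend tokens,
    and tokens refill at a steady rate up to a burst capacity."""

    def __init__(self, capacity, refill_per_sec):
        self.capacity = capacity
        self.refill_per_sec = refill_per_sec
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        elapsed = now - self.last
        self.tokens = min(self.capacity,
                          self.tokens + elapsed * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Allow bursts of up to 3 requests, refilling one token per second.
bucket = TokenBucket(capacity=3, refill_per_sec=1)
results = [bucket.allow() for _ in range(5)]
print(results)  # first 3 requests pass, the rest are throttled
```

The capacity/refill split is the design point: capacity bounds burst size, while the refill rate bounds sustained throughput.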

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies