
8325 PySpark Jobs - Page 2

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title: Senior Google Cloud Platform (GCP) Data Engineer
Location: Hybrid (Bengaluru, India)
Job Type: Full-Time
Experience Required: Minimum 6 years
Joining: Immediate or within 1 week

About the Company: Tech T7 Innovations is a global IT solutions provider known for delivering cutting-edge technology services to enterprises across various domains. With a team of seasoned professionals, we specialize in software development, cloud computing, data engineering, machine learning, and cybersecurity. Our focus is on leveraging the latest technologies and best practices to create scalable, reliable, and secure solutions for our clients.

Job Summary: We are seeking a highly skilled Senior GCP Data Engineer with over 6 years of experience in data engineering and extensive hands-on expertise in Google Cloud Platform (GCP). The ideal candidate must have a strong foundation in GCS, BigQuery, Apache Airflow/Composer, and Python, with a demonstrated ability to design and implement robust, scalable data pipelines in a cloud environment.

Roles and Responsibilities:
- Design, develop, and deploy scalable and secure data pipelines using Google Cloud Platform components, including GCS, BigQuery, and Airflow.
- Develop and manage robust ETL/ELT workflows using Python, integrated with orchestration tools such as Apache Airflow or Cloud Composer.
- Collaborate with data scientists, analysts, and business stakeholders to gather requirements and deliver reliable, efficient data solutions.
- Optimize BigQuery performance using best practices such as partitioning, clustering, schema design, and query tuning.
- Manage, monitor, and maintain data lake and data warehouse environments with high availability and integrity.
- Automate pipeline monitoring, error handling, and alerting mechanisms to ensure seamless and reliable data delivery.
- Contribute to architecture decisions involving data modeling, data flow, and integration strategies in a cloud-native environment.
- Ensure compliance with data governance, privacy, and security policies per enterprise and regulatory standards.
- Mentor junior engineers and drive best practices in cloud engineering and data operations.

Mandatory Skills:
- Google Cloud Platform (GCP): In-depth hands-on experience with GCS, BigQuery, IAM, and Cloud Functions.
- BigQuery (BQ): Expertise in large-scale analytics, schema optimization, and data modeling.
- Google Cloud Storage (GCS): Strong understanding of data lifecycle management, access controls, and best practices.
- Apache Airflow / Cloud Composer: Proficiency in writing and managing complex DAGs for data orchestration.
- Python Programming: Advanced skills in automation, API integration, and data processing using libraries like Pandas, PySpark, etc.

Preferred Qualifications:
- Experience with CI/CD pipelines for data infrastructure and workflows.
- Exposure to other GCP services such as Dataflow, Pub/Sub, and Cloud Functions.
- Familiarity with Infrastructure as Code (IaC) tools such as Terraform.
- Strong communication and analytical skills for problem-solving and stakeholder engagement.
- GCP certifications (e.g., Professional Data Engineer) will be a significant advantage.
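For a flavor of the orchestration work this posting describes, here is a minimal, hypothetical sketch (not from the posting; the DAG, bucket, dataset, and column names are invented) of an Airflow DAG that loads a daily GCS drop into a date-partitioned BigQuery table, assuming Airflow 2.x with the Google provider package installed:

```python
# Hypothetical sketch: daily GCS -> BigQuery load with date partitioning.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bq_daily",  # invented name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-landing-bucket",               # invented bucket
        source_objects=["events/{{ ds }}/*.parquet"],  # templated by run date
        destination_project_dataset_table="example-project.analytics.events",
        source_format="PARQUET",
        write_disposition="WRITE_APPEND",
        time_partitioning={"type": "DAY", "field": "event_ts"},
    )
```

Partitioning the target table on the event timestamp keeps BigQuery scans (and cost) bounded to the dates a query actually touches, which is the kind of tuning the responsibilities above call out.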

Posted 15 hours ago

Apply

7.0 years

0 Lacs

Gurgaon Rural, Haryana, India

On-site

Requirements:
- Minimum of 7 years of experience in the data analytics field.
- Proven experience with Azure/AWS Databricks in building and optimizing data pipelines, architectures, and datasets.
- Strong expertise in Scala or Python, PySpark, and SQL for data engineering tasks.
- Ability to troubleshoot and optimize complex queries on the Spark platform.
- Knowledge of structured and unstructured data design, modelling, access, and storage techniques.
- Experience designing and deploying data applications on cloud platforms such as Azure or AWS.
- Hands-on experience in performance tuning and optimizing code running in Databricks environments.
- Strong analytical and problem-solving skills, particularly within Big Data environments.
- Experience with Big Data management tools and technologies, including Cloudera, Python, Hive, Scala, Data Warehouse, Data Lake, AWS, Azure.

Technical and Professional Skills (Must Have):
- Excellent communication skills with the ability to interact directly with customers.
- Azure/AWS Databricks.
- Python / Scala / Spark / PySpark.
- Strong SQL and RDBMS expertise.
- HIVE / HBase / Impala / Parquet.
- Sqoop, Kafka, Flume.
- Airflow.
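As an illustration of the Spark query optimization this posting asks about, a minimal hypothetical sketch (paths and columns are invented) that avoids a shuffle-heavy join by broadcasting a small dimension table and filtering before the join:

```python
# Hypothetical sketch: cut shuffle cost on a fact-to-dimension join by
# broadcasting the small side and pruning data before the join.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("tuning-sketch").getOrCreate()

orders = spark.read.parquet("/mnt/lake/orders")        # invented paths
products = spark.read.parquet("/mnt/lake/dim_product")

revenue = (
    orders
    .where(F.col("order_date") >= "2024-01-01")   # filter before joining
    .join(F.broadcast(products), "product_id")    # broadcast the small table
    .groupBy("category")
    .agg(F.sum("amount").alias("revenue"))
)

revenue.explain()  # check the physical plan for a BroadcastHashJoin
```

Calling .explain() on the result is a quick way to confirm the planner actually chose the broadcast join rather than a sort-merge join.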

Posted 16 hours ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

About the Role: This position requires someone to work on complex technical projects and collaborate closely with peers in an innovative and fast-paced environment. For this role, we require someone with a strong product design sense who is specialized in Hadoop and Spark technologies.

Requirements: Minimum 6-8 years of experience in Big Data technologies.

The position:
- Grow our analytics capabilities with faster, more reliable tools, handling petabytes of data every day.
- Brainstorm and create new platforms that help make data available to cluster users in all shapes and forms, with low latency and horizontal scalability.
- Make changes and diagnose any problems across the entire technical stack.
- Design and develop a real-time events pipeline for data ingestion for real-time dashboarding (see the sketch after this listing).
- Develop complex and efficient functions to transform raw data sources into powerful, reliable components of our data lake.
- Design and implement new components and various emerging technologies in the Hadoop ecosystem, and successfully execute various projects.
- Be a brand ambassador for Paytm – Stay Hungry, Stay Humble, Stay Relevant!

Preferred Qualification: Bachelor's/Master's Degree in Computer Science or equivalent.

Skills that will help you succeed in this role:
- Strong hands-on experience with Hadoop, MapReduce, Hive, Spark, PySpark, etc.
- Excellent programming/debugging skills in Python/Java/Scala.
- Experience with any scripting language such as Python, Bash, etc.
- Good to have: experience working with NoSQL databases like HBase and Cassandra.
- Hands-on programming experience with multithreaded applications.
- Good to have: experience with databases, SQL, and messaging queues like Kafka.
- Good to have: experience developing streaming applications, e.g. Spark Streaming, Flink, Storm, etc.
- Good to have: experience with AWS and cloud technologies such as S3, and with caching architectures like Redis.

Why join us: Because you get an opportunity to make a difference, and have a great time doing that. You are challenged and encouraged here to do stuff that is meaningful for you and for those we serve. You should work with us if you think seriously about what technology can do for people. We are successful, and our successes are rooted in our people's collective energy and unwavering focus on the customer, and that's how it will always be. To know more about the exciting work we do: https://paytm.com/blog/engineering/

Compensation: If you are the right fit, we believe in creating wealth for you. With an enviable 500 mn+ registered users, 21 mn+ merchants and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers & merchants – and we are committed to it. India's largest digital lending story is brewing here. It's your opportunity to be a part of the story!
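To illustrate the real-time events pipeline mentioned above, a minimal hypothetical sketch (broker, topic, schema, and paths are all invented) of Spark Structured Streaming consuming events from Kafka and landing them for dashboarding:

```python
# Hypothetical sketch: consume JSON events from Kafka and append them to a
# data-lake path a dashboard can query. Assumes the Spark Kafka connector
# (spark-sql-kafka-0-10) is on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("events-pipeline").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # invented broker
    .option("subscribe", "events")                     # invented topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/lake/events")             # invented sink path
    .option("checkpointLocation", "/data/chk/events")
    .trigger(processingTime="1 minute")
    .start()
)
```

The checkpoint location is what gives the stream exactly-once file output across restarts, which matters for a dashboard that must not double-count events.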

Posted 17 hours ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Data Engineering – Technical Lead

About Us: Paytm is India's leading digital payments and financial services company, which is focused on driving consumers and merchants to its platform by offering them a variety of payment use cases. Paytm provides consumers with services like utility payments and money transfers, while empowering them to pay via Paytm Payment Instruments (PPI) like Paytm Wallet, Paytm UPI, Paytm Payments Bank Netbanking, Paytm FASTag and Paytm Postpaid - Buy Now, Pay Later. To merchants, Paytm offers acquiring devices like Soundbox, EDC, QR and Payment Gateway where payment aggregation is done through PPI and also other banks' financial instruments. To further enhance merchants' business, Paytm offers merchants commerce services through advertising and the Paytm Mini app store. Operating on this platform leverage, the company then offers credit services such as merchant loans, personal loans and BNPL, sourced by its financial partners.

About the Role: This position requires someone to work on complex technical projects and collaborate closely with peers in an innovative and fast-paced environment. For this role, we require someone with a strong product design sense who is specialized in Hadoop and Spark technologies.

Requirements: Minimum 6+ years of experience in Big Data technologies.

The position:
- Grow our analytics capabilities with faster, more reliable tools, handling petabytes of data every day.
- Brainstorm and create new platforms that help make data available to cluster users in all shapes and forms, with low latency and horizontal scalability.
- Make changes and diagnose any problems across the entire technical stack.
- Design and develop a real-time events pipeline for data ingestion for real-time dashboarding.
- Develop complex and efficient functions to transform raw data sources into powerful, reliable components of our data lake.
- Design and implement new components and various emerging technologies in the Hadoop ecosystem, and successfully execute various projects.
- Be a brand ambassador for Paytm – Stay Hungry, Stay Humble, Stay Relevant!

Skills that will help you succeed in this role:
- Strong hands-on experience with Hadoop, MapReduce, Hive, Spark, PySpark, etc.
- Excellent programming/debugging skills in Python/Scala.
- Experience with AWS services such as S3, EMR, Glue, Athena, etc. (see the sketch after this listing).
- Experience with Kafka and with SQL.
- Experience with Jira, Bitbucket, Jenkins.
- Experience with any scripting language such as Python, Bash, etc.
- Good to have: experience working with NoSQL databases like HBase and Cassandra.
- Good to have: hands-on programming experience with multithreaded applications.
- Good to have: experience developing streaming applications, e.g. Spark Streaming, Flink, Storm, etc.

Why join us: Because you get an opportunity to make a difference, and have a great time doing that. You are challenged and encouraged here to do stuff that is meaningful for you and for those we serve. You should work with us if you think seriously about what technology can do for people. We are successful, and our successes are rooted in our people's collective energy and unwavering focus on the customer, and that's how it will always be.

Compensation: If you are the right fit, we believe in creating wealth for you. With an enviable 500 mn+ registered users, 21 mn+ merchants and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers & merchants – and we are committed to it. India's largest digital lending story is brewing here. It's your opportunity to be a part of the story!
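As a small illustration of the AWS-side skills listed (S3, EMR, Athena), a hypothetical PySpark batch job (bucket names and columns are invented) that reads raw events from S3, aggregates them, and writes results back partitioned by date:

```python
# Hypothetical sketch of a batch job as it might run on EMR; assumes the
# cluster's IAM role grants access to the (invented) S3 buckets named here.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-aggregates").getOrCreate()

raw = spark.read.parquet("s3://example-raw/events/")  # invented bucket

daily = (
    raw.withColumn("dt", F.to_date("event_ts"))
    .groupBy("dt", "event_type")
    .agg(
        F.count("*").alias("events"),
        F.countDistinct("user_id").alias("users"),
    )
)

# Partitioning by date lets downstream Athena/Glue queries prune partitions.
daily.write.mode("overwrite").partitionBy("dt").parquet(
    "s3://example-curated/daily_event_counts/"
)
```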

Posted 17 hours ago

Apply

0 years

0 Lacs

India

Remote

Role: Support Specialist L3
Location: India

About the Operations Team: The team covers the activities, processes, and practices involved in managing and maintaining the operational aspects of an organization's IT infrastructure and systems. It focuses on ensuring the smooth and reliable operation of IT services, infrastructure components, and supporting systems in the Data & Analytics area.

Duties:
- Provide expert service support as the L3 specialist for the service.
- Identify, analyze, and develop solutions for complex incidents or problems raised by stakeholders and clients as needed.
- Analyze issues and develop tools and/or solutions that help enable business continuity and mitigate business impact.
- Proactively and promptly update assigned tasks, providing responses and solutions within the team's agreed timelines.
- Propose corrective action plans for problems.
- Deploy bug fixes in managed applications.
- Gather requirements, then analyze, design, and implement complex visualization solutions.
- Participate in internal knowledge sharing, collaboration activities, and service improvement initiatives.
- Tasks may include monitoring, incident/problem resolution, documentation, automation, assessment, and implementation/deployment of change requests.
- Provide technical feedback and mentoring to teammates.

Requirements:
- Willing to work either the ASIA, EMEA, or NALA shift.
- Strong problem-solving, analytical, and critical thinking skills.
- Strong communication skills – ability to translate technical details to business/non-technical stakeholders.
- Extensive experience with SQL, T-SQL, and PL/SQL – including but not limited to ETL, merge, partition exchange, exception and error handling, and performance tuning.
- Experience with Python/PySpark, mainly with Pandas, NumPy, Pathlib, and PySpark SQL functions.
- Experience with Azure fundamentals, particularly Azure Blob Storage (file systems and AzCopy).
- Experience with Azure data services – Databricks and Data Factory.
- Understands the operation of ETL processes, triggers, and schedulers; logging, dbutils, PySpark SQL functions, and handling different file formats, e.g. JSON (see the sketch after this listing).
- Experience with Git repository maintenance and DevOps concepts; familiarity with build, test, and deployment processes.

Nice to have:
- Experience with Control-M (if no experience, required to learn on the job).
- KNIME.
- Power BI.
- Willingness to be cross-trained on all of the technologies involved in the solution.

We offer:
- Stable employment. On the market since 2008, 1300+ talents currently on board in 7 global sites.
- "Office as an option" model. You can choose to work remotely or in the office.
- Flexibility regarding working hours and your preferred form of contract.
- Comprehensive online onboarding program with a "Buddy" from day 1.
- Cooperation with top-tier engineers and experts.
- Unlimited access to the Udemy learning platform from day 1.
- Certificate training programs. Lingarians earn 500+ technology certificates yearly.
- Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly.
- Grow as we grow as a company. 76% of our managers are internal promotions.
- A diverse, inclusive, and values-driven community.
- Autonomy to choose the way you work. We trust your ideas.
- Create our community together. Refer your friends to receive bonuses.
- Activities to support your well-being and health.
- Plenty of opportunities to donate to charities and support the environment.

If you are interested in this position, please apply via the link below.
Application Link
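For flavor, a minimal hypothetical sketch (paths, schema, and column names are invented) of the PySpark-with-logging pattern an L3 support engineer might use when reprocessing a JSON drop in Databricks, quarantining malformed rows instead of failing the run:

```python
# Hypothetical sketch: read JSON with an explicit schema, divert bad rows,
# and log counts for the support audit trail.
import logging

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("reprocess_json")

spark = SparkSession.builder.appName("reprocess-json").getOrCreate()

schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("_corrupt", StringType()),  # receives unparseable source lines
])

df = (
    spark.read.schema(schema)
    .option("mode", "PERMISSIVE")                    # keep bad rows, don't abort
    .option("columnNameOfCorruptRecord", "_corrupt")
    .json("/mnt/landing/orders/2024-01-01/")         # invented path
).cache()  # Spark requires caching before filtering on the corrupt column

bad = df.filter(F.col("_corrupt").isNotNull())
good = df.filter(F.col("_corrupt").isNull()).drop("_corrupt")

log.info("rows: %d good, %d quarantined", good.count(), bad.count())
bad.write.mode("append").json("/mnt/quarantine/orders/")   # for later analysis
good.write.mode("append").parquet("/mnt/curated/orders/")
```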

Posted 17 hours ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Key Responsibilities:

Azure Cloud & Databricks:
- Design and build efficient data pipelines using Azure Databricks (PySpark).
- Implement business logic for data transformation and enrichment at scale.
- Manage and optimize Delta Lake storage solutions.

API Development:
- Develop REST APIs using FastAPI to expose processed data (see the sketch after this listing).
- Deploy APIs on Azure Functions for scalable and serverless data access.

Data Orchestration & ETL:
- Develop and manage Airflow DAGs to orchestrate ETL processes.
- Ingest and process data from various internal and external sources on a scheduled basis.

Database Management:
- Handle data storage and access using PostgreSQL and MongoDB.
- Write optimized SQL queries to support downstream applications and analytics.

Collaboration:
- Work cross-functionally with teams to deliver reliable, high-performance data solutions.
- Follow best practices in code quality, version control, and documentation.

Required Skills & Experience:
- 5+ years of hands-on experience as a Data Engineer.
- Strong experience with Azure Cloud services.
- Proficient in Azure Databricks, PySpark, and Delta Lake.
- Solid experience with Python and FastAPI for API development.
- Experience with Azure Functions for serverless API deployments.
- Skilled in managing ETL pipelines using Apache Airflow.
- Hands-on experience with PostgreSQL and MongoDB.
- Strong SQL skills and experience handling large datasets.
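As a small illustration of the FastAPI work listed, a hypothetical, self-contained sketch (the service name, route, and stubbed data store are invented) of an endpoint exposing processed data:

```python
# Hypothetical sketch: a minimal FastAPI service exposing processed datasets.
# In the real service the stub below would be a PostgreSQL/MongoDB query.
from fastapi import FastAPI, HTTPException

app = FastAPI(title="Processed Data API")  # invented service name

_FAKE_STORE = {  # stand-in for a database, keeps the sketch runnable
    "daily_revenue": [{"date": "2024-01-01", "revenue": 1250.0}],
}

@app.get("/datasets/{name}")
def read_dataset(name: str):
    """Return the processed rows for a named dataset."""
    if name not in _FAKE_STORE:
        raise HTTPException(status_code=404, detail="dataset not found")
    return {"name": name, "rows": _FAKE_STORE[name]}
```

Run locally with `uvicorn app:app`; in the architecture this posting describes, the same ASGI app would be fronted by Azure Functions for serverless hosting.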

Posted 18 hours ago

Apply

4.0 years

0 Lacs

Haryana, India

On-site

What do we do?
The TTS Analytics team provides analytical insights to the Product, Pricing, Client Experience and Sales functions within the global Treasury & Trade Services business. The team works on business problems focused on driving acquisitions, cross-sell, revenue growth & improvements in client experience. The team extracts relevant insights, identifies business opportunities, converts business problems into analytical frameworks, uses big data tools and machine learning algorithms to build predictive models & other solutions, and designs go-to-market strategies for a huge variety of business problems.

Role Description:
- The role will be Business Analytics Analyst 2 (C10) in the TTS Analytics team, reporting to the AVP/VP leading the team.
- The role will involve working on multiple analyses through the year on business problems across the client life cycle – acquisition, engagement, client experience and retention – for the TTS business.
- This will involve leveraging multiple analytical approaches, tools and techniques, working on multiple data sources (client profile & engagement data, transactions & revenue data, digital data, unstructured data like call transcripts etc.) to provide data-driven insights to business and functional stakeholders.

Experience:
- Bachelor's Degree with 4+ years of experience in data analytics, or Master's Degree with 2+ years of experience in data analytics.
- Must have: Marketing analytics experience; experience across different analytical methods like hypothesis testing, segmentation, time series forecasting, test vs. control comparison etc.; predictive modeling using Machine Learning.
- Good to have: Experience in financial services; Digital Marketing and/or Digital Experience domain knowledge; experience with unstructured data analysis, e.g. call transcripts, using Natural Language Processing (NLP)/Text Mining.

Skills:
- Analytical Skills: Proficient in formulating analytical methodology and identifying trends and patterns with data; able to work hands-on to retrieve and manipulate data from big data environments.
- Tools and Platforms: Proficient in Python/R, SQL; experience in PySpark, Hive and Scala; proficient in MS Excel, PowerPoint. Good to have: experience with Tableau.
- Soft Skills: Strong analytical and problem-solving skills; excellent communication and interpersonal skills; organized, detail-oriented, and adaptive to a matrix work environment.

------------------------------------------------------
Job Family Group: Decision Management
Job Family: Business Analysis
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
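As a small illustration of the test-vs-control comparison this role lists, a hypothetical sketch (the revenue samples are invented) using a two-sample t-test:

```python
# Hypothetical sketch: did a campaign lift client revenue? Compare test vs.
# control with Welch's two-sample t-test (no equal-variance assumption).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
test_revenue = rng.normal(loc=105, scale=20, size=500)     # invented data
control_revenue = rng.normal(loc=100, scale=20, size=500)  # invented data

t_stat, p_value = stats.ttest_ind(test_revenue, control_revenue, equal_var=False)

lift = test_revenue.mean() - control_revenue.mean()
print(f"lift = {lift:.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value (e.g. < 0.05) suggests the observed lift is unlikely under
# the null hypothesis of no difference between the two groups.
```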

Posted 20 hours ago

Apply

5.0 - 8.0 years

0 Lacs

Haryana, India

On-site

What do we do?
The TTS Analytics team provides analytical insights to the Product, Pricing, Client Experience and Sales functions within the global Treasury & Trade Services business. The team works on business problems focused on driving acquisitions, cross-sell, revenue growth & improvements in client experience. The team extracts relevant insights, identifies business opportunities, converts business problems into analytical frameworks, uses big data tools and machine learning algorithms to build predictive models & other solutions, and designs go-to-market strategies for a huge variety of business problems.

Role Description:
- The role will be Business Analytics Analyst (C11) in the TTS Analytics team, reporting to the AVP/VP leading the team.
- The role will involve working on multiple analyses through the year on business problems across the client life cycle – acquisition, engagement, client experience and retention – for the TTS business.
- This will involve leveraging multiple analytical approaches, tools and techniques, working on multiple data sources (client profile & engagement data, transactions & revenue data, digital data, unstructured data like call transcripts etc.) to provide data-driven insights to business and functional stakeholders.

Qualifications

Experience:
- Bachelor's or Master's Degree with 5-8 years of experience in data analytics.
- Must have: Marketing analytics experience; experience on business problems around sales/marketing strategy optimization, pricing optimization, client experience, cross-sell and retention; experience across different analytical methods like hypothesis testing, segmentation, time series forecasting, test vs. control comparison etc.; predictive modeling using Machine Learning.
- Good to have: Experience in financial services; experience working with data from different sources and of different complexity; experience with unstructured data analysis, e.g. call transcripts, using Natural Language Processing (NLP)/Text Mining.

Skills:
- Analytical Skills: Strong logical reasoning and problem-solving ability; proficient in converting business problems into analytical tasks, and analytical findings into business insights; proficient in formulating analytical methodology and identifying trends and patterns with data; able to work hands-on to retrieve and manipulate data from big data environments.
- Tools and Platforms: Proficient in Python/R, SQL; experience in PySpark, Hive and Scala; proficient in MS Excel, PowerPoint. Good to have: experience with Tableau.
- Soft Skills: Ability to identify, clearly articulate and solve complex business problems and present them to management in a structured, simpler form; excellent communication and interpersonal skills; strong process/project management skills; ability to coach and mentor juniors; contribution to organizational initiatives in wide-ranging areas including competency development, training, organizational building activities etc.

------------------------------------------------------
Job Family Group: Decision Management
Job Family: Business Analysis
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
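To give a flavor of the segmentation work listed above, a hypothetical sketch (the client features are invented) of k-means segmentation with scikit-learn:

```python
# Hypothetical sketch: segment clients on engagement and revenue features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Invented feature matrix: [monthly_txn_count, avg_txn_value, digital_share]
X = rng.random((1000, 3)) * [50, 10_000, 1.0]

X_scaled = StandardScaler().fit_transform(X)  # scale so no feature dominates
segments = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_scaled)

for k in range(4):
    print(f"segment {k}: {np.mean(segments == k):.1%} of clients")
```

Profiling each resulting segment against the raw features is what turns the cluster labels into the behavioral insights the role description mentions.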

Posted 20 hours ago

Apply

4.0 years

0 Lacs

Haryana, India

On-site

What do we do?
The TTS Analytics team provides analytical insights to the Product, Pricing, Client Experience and Sales functions within the global Treasury & Trade Services business. The team works on business problems focused on driving acquisitions, cross-sell, revenue growth & improvements in client experience. The team extracts relevant insights, identifies business opportunities, converts business problems into analytical frameworks, uses big data tools and machine learning algorithms to build predictive models & other solutions, and designs go-to-market strategies for a huge variety of business problems.

Role Description:
- The role will be Data/Information Mgt Analyst 2 (C10) in the TTS Analytics team, reporting to the AVP/VP leading the team.
- The role will involve working on multiple analyses through the year on business problems across the client life cycle – acquisition, engagement, client experience and retention – for the TTS business.
- The work involves setting up and optimizing data pipelines using big data technologies such as PySpark, Scala, and Hive, and working with SQL and NoSQL databases (e.g., MongoDB) to manage and retrieve data effectively (see the sketch after this listing).
- The role requires designing and deploying interactive Tableau dashboards to visualize data insights and provide stakeholders with actionable information, using features such as Tableau Prep Flows, Level of Detail (LOD) Expressions, Table Calculations etc.
- This will involve leveraging multiple analytical approaches, tools and techniques, working on multiple data sources (client profile & engagement data, transactions & revenue data, digital data, unstructured data like call transcripts etc.) to enable data-driven insights for business and functional stakeholders.

Experience:
- Bachelor's Degree with 4+ years of experience in data analytics, or Master's Degree with 2+ years of experience in data analytics.
- Must have: Marketing analytics experience; proficiency in designing and deploying Tableau dashboards; strong experience in data engineering and building data pipelines; experience with big data technologies such as PySpark, Scala, and Hive; proficiency in SQL and experience with various database systems (e.g., MongoDB).
- Good to have: Experience in financial services; experience across different analytical methods like hypothesis testing, segmentation, time series forecasting, test vs. control comparison etc.

Skills:
- Analytical Skills: Strong analytical and problem-solving skills related to data manipulation and pipeline optimization; able to work hands-on to retrieve and manipulate data from big data environments; able to design efficient data models and schemas.
- Tools and Platforms: Proficient in Python/R, SQL; experience in PySpark, Hive, and Scala; strong knowledge of SQL and NoSQL databases such as MongoDB; proficiency with Tableau (designing and deploying advanced, interactive dashboards); proficient in MS Office tools such as Excel and PowerPoint.
- Soft Skills: Strong analytical and problem-solving skills; excellent communication and interpersonal skills; organized, detail-oriented, and adaptive to a matrix work environment.

------------------------------------------------------
Job Family Group: Decision Management
Job Family: Data/Information Management
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
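For flavor, a minimal hypothetical sketch (Hive table and column names are invented) of the pipeline pattern this role describes: a PySpark step that builds a curated reporting table for a Tableau dashboard to consume:

```python
# Hypothetical sketch: aggregate raw transactions into a monthly engagement
# mart stored in Hive; Tableau connects to the curated table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder.appName("client-engagement-mart")
    .enableHiveSupport()
    .getOrCreate()
)

txns = spark.table("raw.client_transactions")  # invented Hive table

engagement = (
    txns.withColumn("month", F.date_trunc("month", "txn_ts"))
    .groupBy("client_id", "month")
    .agg(
        F.count("*").alias("txn_count"),
        F.sum("amount").alias("revenue"),
    )
)

# Overwrite the curated table each run for a clean dashboard refresh.
engagement.write.mode("overwrite").saveAsTable("curated.client_engagement_monthly")
```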

Posted 20 hours ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana, India

On-site

A Career at HARMAN - Harman Tech Solutions (HTS)

You will be part of a global, multi-disciplinary team dedicated to harnessing the power of technology and shaping the future. At HARMAN HTS, your role involves solving challenges through creative and innovative solutions that combine physical and digital elements to address various needs.

Your responsibilities will include:
- Developing and executing test scripts to validate data pipelines, transformations, and integrations.
- Formulating and maintaining test strategies, including smoke, performance, functional, and regression testing, to ensure data processing and ETL jobs align with requirements.
- Collaborating with development teams to assess changes in data workflows, updating test cases as needed to maintain data integrity.
- Designing and implementing tests for data validation, storage, and retrieval using Azure services like Data Lake, Synapse, and Data Factory.
- Continuously enhancing automated tests to ensure timely delivery of new features per defined quality standards.
- Participating in data reconciliation and verifying Data Quality frameworks to uphold data accuracy, completeness, and consistency.
- Sharing knowledge and best practices by documenting testing processes and findings in collaboration with business analysts and technology teams.
- Communicating testing progress effectively with stakeholders, addressing issues or blockers, and ensuring alignment with business objectives.
- Maintaining a comprehensive understanding of the Azure Data Lake platform's data landscape to ensure thorough testing coverage.

Requirements:
- 3-6 years of QA experience with a strong focus on Big Data testing, particularly in Data Lake environments on Azure's cloud platform.
- Proficiency in Azure Data Factory, Azure Synapse Analytics, and Databricks for big data processing and scaled data quality checks.
- Strong SQL skills for writing and optimizing simple and complex queries for data validation and testing.
- Proficiency in PySpark, with experience in data manipulation, transformation, and executing test scripts for data processing and validation.
- Hands-on experience with functional and system integration testing in big data environments, ensuring seamless data flow and accuracy across systems.
- Knowledge and ability to design and execute test cases in a behavior-driven development environment.
- Fluency in Agile methodologies, active participation in Scrum ceremonies, and a strong understanding of Agile principles.
- Familiarity with tools like Jira, including experience with X-Ray or Jira Zephyr for defect management and test case management.
- Proven experience working on high-traffic and large-scale software products, ensuring data quality, reliability, and performance under demanding conditions.

What We Offer:
- Access to employee discounts on HARMAN/Samsung products.
- Professional development opportunities through HARMAN University.
- Flexible work schedule promoting work-life integration and collaboration in a global environment.
- Inclusive and diverse work environment fostering professional and personal development.
- Tuition reimbursement.
- "Be Brilliant" employee recognition and rewards program.

You Belong Here: HARMAN values every employee and encourages sharing ideas and perspectives within a supportive culture that celebrates uniqueness. Continuous learning and development opportunities are offered to empower you in shaping your career.

About HARMAN: HARMAN has a rich legacy of innovation, amplifying the sense of sound since the 1920s. We create integrated technology platforms that enhance safety, connectivity, and intelligence across automotive, lifestyle, and digital transformation solutions. By exceeding engineering and design standards, we offer extraordinary experiences under iconic brands like JBL, Mark Levinson, and Revel. Join us in innovating and making a lasting impact.

Important Notice: Beware of recruitment scams.
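For flavor, a hypothetical sketch (paths and the key column are invented) of the kind of PySpark data-validation check such a QA role automates, written as pytest-style tests reconciling a source with its Data Lake target:

```python
# Hypothetical sketch: pytest-style checks that a load preserved row counts
# and that a key column in the target contains no nulls.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

source = spark.read.parquet("/mnt/source/orders")  # invented paths
target = spark.read.parquet("/mnt/lake/orders")

def test_row_counts_match():
    assert source.count() == target.count(), "row counts diverge after load"

def test_key_column_not_null():
    nulls = target.filter(F.col("order_id").isNull()).count()
    assert nulls == 0, f"{nulls} rows with null order_id in target"
```

In practice checks like these run inside the pipeline itself (or via a scheduler), so a failed reconciliation blocks downstream consumption rather than surfacing later in a dashboard.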

Posted 20 hours ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Solution Designer (Cloud Data Integration) at Barclays within the Customer Digital and Data Business Area, you will play a vital role in supporting the successful delivery of location strategy projects. Your responsibilities will include ensuring projects are delivered according to plan, budget, quality standards, and governance protocols. By spearheading the evolution of the digital landscape, you will drive innovation and excellence, utilizing cutting-edge technology to enhance our digital offerings and deliver unparalleled customer experiences.

To excel in this role, you should possess hands-on experience working with large-scale data platforms and developing cloud solutions within the AWS data platform. Your track record should demonstrate a history of driving business success through your expertise in AWS, distributed computing paradigms, and designing data ingestion programs using technologies like Glue, Lambda, S3, Redshift, Snowflake, Apache Kafka, and Spark Streaming. Proficiency in Python, PySpark, SQL, and database management systems is essential, along with a strong understanding of data governance principles and tools.

Additionally, valued skills for this role may include experience in multi-cloud solution design, data modeling, data governance frameworks, agile methodologies, project management tools, business analysis, and product ownership within a data analytics context. A basic understanding of the banking domain, along with excellent analytical, communication, and interpersonal skills, will be crucial for success in this position.

Your main purpose as a Solution Designer will involve designing, developing, and implementing solutions to complex business problems by collaborating with stakeholders to understand their needs and requirements. You will be accountable for designing solutions that balance technology risks against business delivery, driving consistency and aligning with modern software engineering practices and automated delivery tooling. Furthermore, you will be expected to provide impact assessments, fault-finding support, and the architecture inputs required to comply with the bank's governance processes.

As an Assistant Vice President in this role, you will be responsible for advising on decision-making processes, contributing to policy development, and ensuring operational effectiveness. If the position involves leadership responsibilities, you will lead a team to deliver impactful work and set objectives for employees while demonstrating leadership behaviours focused on listening, inspiring, aligning, and developing others. Alternatively, as an individual contributor, you will lead collaborative assignments, guide team members, identify new directions for projects, consult on complex issues, and collaborate with other areas to support business activities.

All colleagues at Barclays are expected to embody the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset to Empower, Challenge, and Drive. By demonstrating these values and mindset, you will contribute to creating an environment where colleagues can thrive and deliver consistently excellent results.

Posted 20 hours ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

You will collaborate with business, platform, and technology stakeholders to understand the scope of projects. Your role will involve performing comprehensive exploratory data analysis at various levels of granularity to derive inferences for further solutioning, experimentation, and evaluation. You will design, develop, and deploy robust enterprise AI solutions using Generative AI, NLP, machine learning, and other relevant technologies. It will be essential to continuously focus on providing business value while ensuring technical sustainability. Additionally, you will promote and drive adoption of cutting-edge data science and AI practices within the team while staying up to date on relevant technologies to propel the team forward.

We are seeking a team player with 4-7 years of experience in the field of data science and AI. The ideal candidate will be proficient in programming/querying languages like Python, SQL, and PySpark, and familiar with Azure cloud platform tools such as Databricks, ADF, Synapse, and Web App, among others. You should have strong work experience in text analytics, NLP, and Generative AI, and a scientific, analytical mindset comfortable with brainstorming and ideation. A deep interest in driving business outcomes through AI/ML is crucial, alongside a bachelor's or master's degree in engineering or computer science, with or without a specialization in AI/ML. Strong business acumen and the desire to collaborate with business teams to solve problems are highly valued. Prior understanding of the business domain of shipping and logistics is considered advantageous.

Should you require any adjustments during the application and hiring process, we are happy to support you. For special assistance or accommodations to use our website, apply for a position, or perform a job, please contact us at accommodationrequests@maersk.com.
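As a toy illustration of the text-analytics skills listed (the example texts and labels are invented), a common scikit-learn baseline for classifying short shipping-related messages:

```python
# Hypothetical sketch: TF-IDF features + logistic regression, a standard
# text-classification baseline before reaching for heavier NLP/GenAI models.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [  # invented examples
    "container delayed at port due to congestion",
    "invoice dispute on freight charges",
    "shipment arrived on schedule",
    "billing error on customs fees",
]
labels = ["operations", "billing", "operations", "billing"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["late arrival because of port congestion"]))
```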

Posted 20 hours ago

Apply

6.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Data Engineer, you will be responsible for:
- Designing and developing scalable data pipelines and notebooks using Microsoft Fabric or Synapse Analytics.
- Bringing a research-oriented mindset to data projects: thinking outside the box and focusing on future needs.
- Building and managing Lakehouses and Data Warehouses using Fabric's OneLake architecture (see the sketch after this listing).
- Integrating data from diverse sources into Fabric.
- Collaborating with BI developers for seamless integration with Power BI and other reporting tools.
- Implementing data governance, security, and compliance within the Fabric ecosystem.
- Optimizing data storage and processing for performance and cost-efficiency.
- Monitoring and troubleshooting data workflows to ensure high data quality and reliability.
- Documenting architecture, data models, and processes.
- Automated functional testing alongside development (essential experience).

Key Skills: PySpark, data modelling, Spark SQL, and proficiency in Microsoft Fabric, including an understanding of Shortcuts, Mirroring, Dataflows, and related features. Familiarity with data ingestion design patterns is also desired.
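A minimal hypothetical sketch (the source path and table names are invented) of the PySpark-to-Delta pattern used in Fabric and Synapse notebooks, where Delta is the table format the lakehouse builds on:

```python
# Hypothetical sketch: land a deduplicated DataFrame as a Delta table in a
# lakehouse; Delta adds ACID writes and time travel on top of parquet files.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("lakehouse-load").getOrCreate()

orders = spark.read.parquet("Files/raw/orders")  # invented source path

cleaned = (
    orders.dropDuplicates(["order_id"])
    .withColumn("ingested_at", F.current_timestamp())
)

cleaned.write.format("delta").mode("overwrite").saveAsTable("sales.orders")
```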

Posted 21 hours ago

Apply

5.0 - 13.0 years

0 Lacs

Karnataka

On-site

Job Description:
- Essential Skills: Proficiency in Cloud-PaaS-GCP-Google Cloud Platform.
- Experience Required: 5-8 years.
- Position: Cloud Data Engineer.
- Work Location: Wipro, PAN India.
- Work Arrangement: Hybrid model with 3 days in a Wipro office.
- Additional Experience: 8-13 years.

You will be responsible for demonstrating strong expertise in SQL and proficiency in Python. You should possess excellent knowledge of at least one cloud technology, such as AWS, Azure, or GCP, with a preference for GCP. Familiarity with PySpark is preferred.

Join us at Wipro, where we are building a modern organization with bold ambitions in digital transformation. We are seeking individuals who are inspired by reinvention and are eager to evolve themselves, their careers, and their skills to drive the constant evolution of our business and industry. At Wipro, you will have the opportunity to design your own reinvention and realize your ambitions in a purpose-driven environment. We welcome applications from individuals with disabilities.

Posted 21 hours ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Machine Learning Engineer, you will play a key role in developing and enhancing a Telecom Artificial Intelligence Product. This role requires a strong background in machine learning and deep learning, along with extensive experience in implementing advanced algorithms and models to solve complex problems. You will be working on cutting-edge technologies to develop solutions for anomaly detection, forecasting, event correlation, and fraud detection.

Your responsibilities will include developing production-ready implementations of proposed solutions using various machine learning and deep learning algorithms. You will test these solutions on live customer data to ensure efficacy and robustness. Additionally, you will research and test novel machine learning approaches for large-scale distributed computing applications.

In this role, you will be responsible for implementing and managing the full machine learning operations lifecycle using tools such as Kubeflow, MLflow, AutoML, and KServe for model deployment. You will develop and deploy machine learning models using PyTorch and TensorFlow to ensure high performance and scalability. Furthermore, you will run and manage PySpark and Kafka on distributed systems with large-scale, non-linear network elements.

To excel in this position, you should be proficient in Python programming and experienced with machine learning libraries such as scikit-learn and NumPy. Experience in time series analysis, data mining, text mining, and creating data architectures will be beneficial. You should also be able to use batch processing and incremental approaches to manage and analyze large datasets.

As a Machine Learning Engineer, you will experiment with multiple algorithms, optimizing hyperparameters to identify the best-performing models. You will execute machine learning algorithms in cloud environments, leveraging cloud resources effectively. Continuous feedback gathering, model retraining, and updating will be essential to maintain and improve model performance.

Moreover, you should have expertise in network characteristics, transformer architectures, GAN AI techniques, and end-to-end machine learning projects. Experience with leading supervised and unsupervised machine learning methods, familiarity with Python packages like Pandas and NumPy, and DL frameworks like Keras, TensorFlow, and PyTorch are required. Knowledge of Big Data tools and environments, as well as MySQL/NoSQL databases, will be advantageous.

You will collaborate with cross-functional teams of data scientists, software engineers, and stakeholders to integrate implemented systems into the SaaS platform. Your innovative thinking and creative ideas will contribute to improving the overall platform. Additionally, you will create use cases specific to the domain to solve business problems effectively.

Ideally, you should have a Bachelor's degree in Science/IT/Computing or equivalent with at least 4 years of experience in a QA Engineering role. Strong quantitative and applied mathematical skills are essential, along with certification courses in Data Science/ML. In-depth knowledge of statistical techniques and machine learning techniques, and experience with Telecom product development, are preferred. Experience in MLOps is a plus for deploying developed models, and familiarity with scalable SaaS platforms is advantageous for this role.
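As a small illustration of the anomaly-detection work described (the KPI data below is invented), an unsupervised scikit-learn baseline well suited to unlabeled telemetry:

```python
# Hypothetical sketch: flag anomalous network KPI readings with IsolationForest.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)
# Invented KPI matrix: [latency_ms, packet_loss_pct, throughput_mbps]
normal = rng.normal([20, 0.1, 900], [5, 0.05, 50], size=(1000, 3))
spikes = rng.normal([120, 2.0, 200], [10, 0.5, 40], size=(10, 3))  # injected faults
X = np.vstack([normal, spikes])

model = IsolationForest(contamination=0.01, random_state=7).fit(X)
flags = model.predict(X)  # -1 = anomaly, 1 = normal

print(f"flagged {np.sum(flags == -1)} of {len(X)} readings as anomalous")
```

The contamination parameter encodes the expected anomaly rate; in production it would be tuned against labeled incidents or operator feedback rather than fixed up front.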

Posted 21 hours ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

The Business Analytics Int Analyst is a developing professional role with the autonomy to solve complex problems independently. You will utilize your in-depth specialty knowledge and industry understanding to integrate effectively with the team and other functions. Your analytical thinking and proficiency in data analysis tools will be crucial in making informed judgments and recommendations based on factual information.

Your role will involve dealing with variable issues that may have broader business impacts, requiring professional judgment in interpreting data and results. You will also need to communicate complex information in a systematic and understandable manner, demonstrating developed communication and diplomacy skills. The quality and timeliness of the service you provide will directly impact the effectiveness of your team and other closely related teams.

Responsibilities:
- Gather operational data from various cross-functional stakeholders to analyze past business performance.
- Identify data patterns and trends to provide insights that enhance business decision-making capabilities.
- Recommend actions for future developments, strategic business opportunities, and operational policy enhancements.
- Translate data into consumer or customer behavioral insights to drive targeting and segmentation strategies.
- Continuously improve processes and strategies by evaluating new data sources, tools, and capabilities.
- Collaborate with internal and external business partners to build, implement, track, and enhance decision strategies.
- Assess risks appropriately when making business decisions, ensuring compliance with laws, rules, and regulations.

Role Description:
- The role is Spec Analytics Intmd Analyst (C11) in the TTS Analytics team, reporting to the AVP or VP leading the team.
- You will work on multiple analyses throughout the year on business problems across the client life cycle for the TTS business.
- You will leverage various analytical approaches, tools, and techniques to provide data-driven insights to business partners, and contribute to ideation on analytical projects that tackle strategic business priorities.
- Embracing ambiguity and open-ended questions is a core part of the team's work.

Qualifications:

Experience:
- Bachelor's Degree with 6-8 years, or Master's Degree with 4-6 years of experience in data analytics, or a relevant PhD.
- Substantial experience in identifying and resolving business problems, utilizing text data, developing analytical tools, applying predictive modeling techniques, and working with diverse data sources.

Skills:
- Analytical Skills: Strong logical reasoning and problem-solving ability; proficiency in converting business problems into analytical tasks and deriving business insights from analytical findings; ability to work hands-on with data from big data environments.
- Tools and Platforms: Prior experience with graph databases like Neo4j and vector databases; proficiency in Python, PySpark, Hive, MS Excel, and PowerPoint. Experience with PySpark and Tableau is a plus.

Posted 22 hours ago

Apply

2.0 - 6.0 years

0 Lacs

Haryana, India

On-site

As a professional services firm affiliated with an international network, our client in India has been providing comprehensive services since September 1993. With a presence in major cities such as Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, and Vadodara, our professionals possess in-depth knowledge of local laws, regulations, markets, and competition. We cater to a diverse range of national and international clients across various sectors, offering rapid, performance-based, industry-focused, and technology-enabled services that draw on both global expertise and local insights.

Responsibilities:
- Develop and optimize data pipelines utilizing Python and PySpark.
- Conduct data analysis and produce insightful reports.
- Create and manage SQL queries for efficient data extraction.

Requirements:
- Proficient in Python, PySpark, and SQL.
- Hands-on experience in data analysis and pipeline development.
- Strong analytical and problem-solving skills.

Join our team and enjoy the following benefits:
- Opportunity to work with one of the Big 4 firms in India.
- A healthy work environment that fosters growth and collaboration.
- A good work-life balance while contributing to exciting projects.

If you are passionate about data, analytics, and technology, and are looking to be part of a dynamic and innovative team, we welcome you to explore this exciting opportunity with us.

Posted 22 hours ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Forward Deployed Engineer (FDE) - AI Enablement is a senior-level position where you will be responsible for achieving results through solutioning, developing, and implementing AI/ML capabilities as an individual contributor. As an FDE - AI Enablement, you are expected to stay updated on the latest developments at Citi and in your field, contributing to the directional strategy by identifying applications within your organization and the business.

Your key responsibilities will include enhancing the current toolset with AI/ML capabilities, solutioning for identified use cases, building quick proofs-of-concept to showcase solutions, collaborating with the larger AI/ML stream across the data organization, and utilizing your in-depth knowledge and skills across multiple applications development areas to provide technical oversight across systems and applications. Additionally, you will be involved in formulating strategies for applications development and other functional areas, gaining comprehensive knowledge of how different business areas integrate to achieve business objectives, and providing evaluative judgment based on the analysis of factual data in complex and unique situations.

To be successful in this role, you should have at least 10 years of experience in AI/ML, preferably within the financial industry. Hands-on experience in Python and ML libraries such as TensorFlow, scikit-learn, and PyTorch for Big Data is required. You should also have practical experience with LLMs and GenAI APIs (such as OpenAI GPT, Bard, etc.), applying machine learning or NLP models in real-time applications for large datasets, working with MLOps and model deployment pipelines, and using PySpark with an understanding of big data concepts. Knowledge of Azure OpenAI, Google Vertex, and Stellar is considered a plus. Stakeholder management experience, demonstrated leadership skills, and proven project management skills are also essential for this role.

A Bachelor's/University degree is required, with a Master's degree preferred. This position falls under the Technology job family group, specifically the Applications Support job family. It is a full-time role.

If you require a reasonable accommodation due to a disability to use our search tools or apply for a career opportunity, please review the Accessibility at Citi information. For more details on Citi's EEO Policy Statement and the Know Your Rights poster, please refer to the respective documents.

Posted 22 hours ago

Apply

12.0 - 16.0 years

0 Lacs

Pune, Maharashtra

On-site

The Engineering Lead Analyst is a strategic professional who stays abreast of developments within their own field and contributes to directional strategy by considering its application in their own job and the business. The role is a recognized technical authority for an area within the business.

This position is for the lead role in the Client Financials Improvements project. The selected candidate will be responsible for the development and execution of the project within the ISG Data Platform group, working closely with the global team to interface with the business and translate business requirements into technical requirements, and should have strong functional knowledge of banking and financial systems.

You will lead the definition and ongoing management of the target application architecture for Client Financials, leverage internal and external leading practices, and liaise with other Citi risk organizations to determine and maintain appropriate alignment, specifically with Citi Data Standards. You will also establish a governance process to oversee implementation activities and ensure ongoing alignment to the defined architecture. You are expected to appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

Qualifications: 12-16 years of experience in analyzing and defining risk management data structures.

Skills:
- Strong working experience in Python & PySpark.
- Prior working experience in writing APIs / microservices.
- Hands-on experience writing SQL queries in multiple database environments and operating systems; experience validating the end-to-end flow of data in an application.
- Hands-on experience working with SQL and NoSQL databases.
- Working experience with Airflow and other orchestrators.
- Experience in application design and architecture.
- Ability to assess the list of packaged applications and define the re-packaging approach.
- Understanding of capital markets (risk management process) and Loans / CRMS required.
- Knowledge of process automation and engineering will be a plus.
- Demonstrated influencing, facilitation, and partnering skills.
- Track record of interfacing with and presenting results to senior management.
- Experience with all phases of the Software Development Life Cycle.
- Strong stakeholder engagement skills; organize and attend workshops to understand the current state of Client Financials.
- Proven aptitude for organizing and prioritizing work effectively (must be able to meet deadlines).
- Propose a solution and deployment approach to achieve the goals.

Citi is an equal opportunity and affirmative action employer. Citigroup Inc. and its subsidiaries ("Citi") invite all qualified interested applicants to apply for career opportunities. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi.

Posted 23 hours ago

Apply

3.0 - 7.0 years

0 Lacs

coimbatore, tamil nadu

On-site

You will be responsible for developing and maintaining scalable data processing systems using Apache Spark and Azure Databricks. This includes implementing data integration from various sources such as RDBMS, ERP systems, and files. You will design and optimize SQL queries, stored procedures, and relational schemas. Additionally, you will build stream-processing systems using technologies like Apache Storm or Spark Streaming, and use messaging systems like Kafka or RabbitMQ for data ingestion. Performance tuning of Spark jobs for optimal efficiency will be a key focus area.

Collaboration with cross-functional teams to deliver high-quality data solutions is essential in this role. You will also lead and mentor a team of data engineers, fostering a culture of continuous improvement and Agile practices.

Key skills required for this position include proficiency in Apache Spark and Azure Databricks, strong experience with the Azure ecosystem and Python, and working knowledge of PySpark (nice to have). Experience in data integration from varied sources, expertise in SQL optimization and stream-processing systems, familiarity with Kafka or RabbitMQ, and the ability to lead and mentor engineering teams are also crucial, as is a strong understanding of distributed computing principles.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field.
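As an illustration of the stream-processing stack described above, here is a minimal PySpark Structured Streaming sketch that ingests from Kafka; it assumes the spark-sql-kafka connector package is available, and the broker address and topic are placeholders.

```python
# Minimal sketch: ingest a Kafka topic with Spark Structured Streaming and
# echo records to the console. Broker and topic names are placeholders, and
# the spark-sql-kafka connector package must be on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```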

Posted 23 hours ago

Apply

3.0 - 7.0 years

0 Lacs

jaipur, rajasthan

On-site

As a skilled Product Analyst, you will play a crucial role in analyzing the performance of our products, identifying trends, making recommendations, and generating reports. Your primary focus will involve close collaboration with various teams to gain insights into product usage, feature adoption, customer behavior, and overall business performance.

Your responsibilities will include collaborating across departments such as product, engineering, and marketing to shape product strategy and roadmap. You will leverage tools like Google Analytics, Metabase, Tableau, and Google Data Studio for data visualization and dashboard creation. Additionally, your expertise in SQL, Python, and Jupyter Notebook will be essential for data manipulation and analysis.

Your analytical prowess, backed by quantitative skills and the ability to use data and metrics for decision-making and business case development, will be instrumental in this role. Experience with Google BigQuery, GCP, or any cloud architecture will be beneficial. A Bachelor's degree in a related field, such as Statistics, Computer Science, or Business Analytics, is required.

Your role will also involve developing comprehensive reports on product performance, sharing insights with stakeholders, and analyzing product data to extract valuable trends and insights. Your attention to detail and problem-solving skills will be key to the success of our product analytics initiatives.
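As a small, hypothetical example of the SQL/Python analysis work this role involves, here is a pandas sketch of a week-over-week feature-adoption breakdown; the input file and column names are assumptions.

```python
# Hypothetical pandas sketch of a feature-adoption analysis; the CSV export
# and column names (timestamp, feature, user_id) are assumptions.
import pandas as pd

events = pd.read_csv("product_events.csv")  # hypothetical event export

# Count distinct users touching each feature, week over week.
events["week"] = pd.to_datetime(events["timestamp"]).dt.to_period("W")
adoption = (
    events.groupby(["week", "feature"])["user_id"]
    .nunique()
    .unstack(fill_value=0)
)
print(adoption.tail())
```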

Posted 23 hours ago

Apply

7.0 - 12.0 years

0 Lacs

hyderabad, telangana

On-site

As a Lead Data Engineer with 7-12 years of experience, you will be an integral part of our team, contributing significantly to the design, development, and maintenance of our data infrastructure. Your primary responsibilities will revolve around creating and managing robust data architectures, ETL processes, and data warehouses, and utilizing big data and cloud technologies to support our business intelligence and analytics needs.

You will lead the design and implementation of data architectures that facilitate data warehousing, integration, and analytics platforms. Developing and optimizing ETL pipelines will be a key aspect of your role, ensuring efficient processing of large datasets and implementing data transformation and cleansing processes to maintain data quality.

Your expertise will be crucial in building and maintaining scalable data warehouse solutions using technologies such as Snowflake, Databricks, or Redshift. Additionally, you will leverage AWS Glue and PySpark for large-scale data processing, manage data pipelines with Apache Airflow, and use cloud platforms like AWS, Azure, and GCP for data storage, processing, and analytics.

Establishing data governance and security best practices, ensuring data integrity, accuracy, and availability, and implementing monitoring and alerting systems are vital components of your responsibilities. Collaborating closely with stakeholders, mentoring junior engineers, and leading data-related projects will also be part of your role.

Your technical skills should include proficiency in ETL tools like Informatica PowerCenter, Python, PySpark, SQL, RDBMS platforms, and data warehousing concepts. Soft skills such as excellent communication, leadership, problem-solving, and the ability to manage multiple projects effectively will be essential for success in this role. Preferred qualifications include experience with machine learning workflows, certification in relevant data engineering technologies, and familiarity with Agile methodologies and DevOps practices.

Location: Hyderabad
Employment Type: Full-time
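To make the Apache Airflow orchestration mentioned above concrete, here is a minimal DAG sketch (assuming Airflow 2.4+ for the `schedule` argument); the DAG id, schedule, and task bodies are placeholders.

```python
# Minimal Airflow DAG sketch (Airflow 2.4+); the DAG id, schedule, and task
# bodies are placeholders for a real extract/transform pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from the source system")

def transform():
    print("clean and conform the data")

with DAG(
    dag_id="daily_warehouse_load",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_extract >> t_transform  # run transform only after extract succeeds
```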

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

hyderabad, telangana

On-site

You will join Salesforce, a company that aims to inspire the future of business through the integration of AI, data, and CRM. Upholding its core values, Salesforce empowers companies across various industries to innovate and engage with customers in unique ways. As a member of the team, you will have the opportunity to become a Trailblazer, enhancing your performance, advancing your career, and contributing to positive change in the world.

As an experienced Data Scientist, your primary responsibility will be to develop marketing attribution, causal inference, and uplift models to enhance the efficiency and effectiveness of marketing initiatives. This role will involve designing experiments and ensuring a consistent approach to experimentation and campaign measurement across marketing, customer engagement, and digital use cases. The ideal candidate will possess extensive experience in creating statistical models and AI/ML algorithms for marketing and digital optimization on large-scale datasets within a cloud environment. Rigorous testing and evaluation of algorithm performance will be essential, both during development and in production. Moreover, a deep understanding of statistical and machine learning techniques is necessary, along with a commitment to the ethical use of data in algorithm design.

Key Responsibilities:
- Develop statistical and machine learning models to enhance marketing effectiveness, including attribution models, causal inference models, and uplift models.
- Create optimization and simulation algorithms to optimize marketing spend across channels and improve ROI.
- Lead the entire model development lifecycle from ideation to deployment, monitoring, and tuning.
- Design experiments to support marketing, customer experience, and digital campaigns, and collaborate with peers to establish consistent experimentation and measurement approaches.
- Cultivate strong cross-functional relationships and collaborate with key partners throughout the organization.
- Stay updated on innovations in enterprise SaaS, AdTech, paid media, data science, customer data, and analytics fields.

Required Skills:
- 8+ years of experience in designing models for marketing optimization using statistical and machine learning techniques.
- Proficiency in developing advanced statistical techniques for experiment design and causal inference methods.
- Expertise in programming languages such as Python, R, PySpark, Java, and SQL.
- Experience with cloud platforms like GCP and AWS for model development and deployment is preferred.
- Strong quantitative reasoning skills and the ability to provide data-driven business insights.
- Excellent written and verbal communication skills with a collaborative mindset.
- Ability to simplify complex problems and a creative approach to finding solutions.
- B2B customer data experience and knowledge of Salesforce products are advantageous.
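For a concrete sense of the uplift modeling this posting mentions, here is a minimal two-model ("T-learner") sketch on synthetic data using scikit-learn; this is one common approach, not necessarily the team's actual method.

```python
# Two-model ("T-learner") uplift sketch on synthetic data: fit one response
# model per treatment arm, then score uplift as the difference in predicted
# conversion probability. Illustrative only; not Salesforce's actual method.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))            # synthetic user features
treated = rng.integers(0, 2, size=1000)   # 1 = exposed to the campaign
y = rng.integers(0, 2, size=1000)         # 1 = converted

model_t = GradientBoostingClassifier().fit(X[treated == 1], y[treated == 1])
model_c = GradientBoostingClassifier().fit(X[treated == 0], y[treated == 0])

uplift = model_t.predict_proba(X)[:, 1] - model_c.predict_proba(X)[:, 1]
print("mean estimated uplift:", uplift.mean())
```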

Posted 1 day ago

Apply

10.0 - 14.0 years

0 Lacs

haryana

On-site

As a Digital Product Engineering company, Nagarro is seeking a talented individual to join our dynamic and non-hierarchical work culture as a Data Engineer. With over 17,500 experts across 39 countries, we are scaling in a big way and are looking for someone with 10+ years of total experience to contribute to our team.

**Requirements:**
- Strong working experience in Data Engineering and Big Data platforms.
- Hands-on experience with Python and PySpark.
- Expertise with AWS Glue, including Crawlers and the Data Catalog (see the sketch after this posting).
- Experience with Snowflake and a strong understanding of AWS services such as S3, Lambda, Athena, SNS, and Secrets Manager.
- Familiarity with Infrastructure-as-Code (IaC) tools like CloudFormation and Terraform.
- Strong experience with CI/CD pipelines, preferably using GitHub Actions.
- Working knowledge of Agile methodologies, JIRA, and GitHub version control.
- Exposure to data quality frameworks, observability, and data governance tools and practices.
- Excellent communication skills and the ability to collaborate effectively with cross-functional teams.

**Responsibilities:**
- Writing and reviewing high-quality code to meet technical requirements.
- Understanding clients' business use cases and converting them into technical designs.
- Identifying and evaluating different solutions to meet clients' requirements.
- Defining guidelines and benchmarks for Non-Functional Requirements (NFRs) during project implementation.
- Developing design documents explaining the architecture, framework, and high-level design of applications.
- Reviewing architecture and design aspects such as extensibility, scalability, security, design patterns, user experience, and NFRs.
- Designing overall solutions for defined functional and non-functional requirements and defining technologies, patterns, and frameworks.
- Relating technology integration scenarios and applying learnings in projects.
- Resolving issues raised during code review through systematic analysis of the root cause.
- Conducting Proofs of Concept (POCs) to ensure suggested designs and technologies meet requirements.

**Qualifications:**
- Bachelor's or master's degree in Computer Science, Information Technology, or a related field.

If you are passionate about Data Engineering, experienced with Big Data platforms, proficient in Python and PySpark, and have a strong understanding of AWS services and Infrastructure-as-Code tools, we invite you to join Nagarro and be part of our innovative team.
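To illustrate the AWS Glue skills listed above, here is a minimal Glue PySpark job sketch that reads a Data Catalog table and writes Parquet to S3; the database, table, and output path are placeholders, and the script only runs inside a Glue job environment.

```python
# Minimal AWS Glue PySpark job sketch: read a Data Catalog table (populated by
# a Crawler) and write it to S3 as Parquet. Database, table, and path are
# placeholders; this runs only inside a Glue job environment.
from awsglue.context import GlueContext
from pyspark.context import SparkContext

glue_context = GlueContext(SparkContext.getOrCreate())

orders = glue_context.create_dynamic_frame.from_catalog(
    database="analytics",     # placeholder catalog database
    table_name="raw_orders",  # placeholder crawled table
)

glue_context.write_dynamic_frame.from_options(
    frame=orders,
    connection_type="s3",
    connection_options={"path": "s3://bucket/curated/orders/"},
    format="parquet",
)
```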

Posted 1 day ago

Apply