
5317 Pyspark Jobs - Page 47

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

7.0 - 12.0 years

15 - 22 Lacs

Bengaluru

Hybrid

Naukri

Job Summary:
We are seeking a talented Data Engineer with strong expertise in Databricks, specifically in Unity Catalog, PySpark, and SQL, to join our data team. You'll play a key role in building secure, scalable data pipelines and implementing robust data governance strategies using Unity Catalog.

Key Responsibilities:
Design and implement ETL/ELT pipelines using Databricks and PySpark. Work with Unity Catalog to manage data governance, access controls, lineage, and auditing across data assets. Develop high-performance SQL queries and optimize Spark jobs. Collaborate with data scientists, analysts, and business stakeholders to understand data needs. Ensure data quality and compliance across all stages of the data lifecycle. Implement best practices for data security and lineage within the Databricks ecosystem. Participate in CI/CD, version control, and testing practices for data pipelines.

Required Skills:
Proven experience with Databricks and Unity Catalog (data permissions, lineage, audits). Strong hands-on skills with PySpark and Spark SQL. Solid experience writing and optimizing complex SQL queries. Familiarity with Delta Lake, data lakehouse architecture, and data partitioning. Experience with cloud platforms like Azure or AWS. Understanding of data governance, RBAC, and data security standards.

Preferred Qualifications:
Databricks Certified Data Engineer Associate or Professional. Experience with tools like Airflow, Git, Azure Data Factory, or dbt. Exposure to streaming data and real-time processing. Knowledge of DevOps practices for data engineering.
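The pipeline work described above typically includes a cleaning step between raw and curated layers. A minimal sketch of that step in plain Python (the schema and field names are hypothetical; in Databricks this would be a PySpark DataFrame transformation written to a Unity Catalog table addressed as catalog.schema.table):

```python
# Hypothetical clean-and-load ("silver") step of an ETL pipeline.
# Plain Python stands in for PySpark so the sketch stays self-contained.

def clean_orders(raw_rows):
    """Drop malformed rows and normalize fields."""
    cleaned = []
    for row in raw_rows:
        if row.get("order_id") is None or row.get("amount") is None:
            continue  # a real pipeline would quarantine these records
        cleaned.append({
            "order_id": int(row["order_id"]),
            "amount": round(float(row["amount"]), 2),
            "country": (row.get("country") or "UNKNOWN").upper(),
        })
    return cleaned

raw = [
    {"order_id": "1", "amount": "19.992", "country": "in"},
    {"order_id": None, "amount": "5.00", "country": "us"},  # dropped
    {"order_id": "2", "amount": "7.5", "country": None},
]
silver = clean_orders(raw)
print(silver)
```

With Spark, the same logic would be a chain of `filter`/`withColumn` calls, and governance (permissions, lineage) would come from Unity Catalog rather than application code.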

Posted 1 week ago

Apply

3.0 years

0 Lacs

India

On-site

LinkedIn

Note: Please do not apply if your salary expectations are higher than the provided salary range or your experience is less than 3 years. If you have experience in the travel industry and have worked on hotel, car rental, or ferry booking before, then we can negotiate the package.

Company Description
Our company has been promoting Greece for the last 25 years through travel sites visited from all around the world, with 10 million visitors per year, such as www.greeka.com and www.ferriesingreece.com. Through the websites, we provide a range of travel services for a seamless holiday experience, such as online car rental reservations, ferry tickets, transfers, and tours.

Role Description
We are seeking a highly skilled Artificial Intelligence / Machine Learning Engineer to join our dynamic team. You will work closely with our development team and QAs to deliver cutting-edge solutions that improve our candidate screening and employee onboarding processes.

Major Responsibilities & Job Requirements include:
• Develop and implement NLP/LLM models.
• Minimum of 3-4 years of experience as an AI/ML developer or in a similar role, with demonstrable expertise in computer vision techniques.
• Develop and implement AI models using Python, TensorFlow, and PyTorch.
• Proven experience in computer vision, including fine-tuning OCR models (e.g., Tesseract, LayoutLMv3, EasyOCR, PaddleOCR, or custom-trained models).
• Strong understanding and hands-on experience with RAG (Retrieval-Augmented Generation) architectures and pipelines for building intelligent Q&A, document summarization, and search systems.
• Experience working with LangChain, LLM agents, and chaining tools to build modular and dynamic LLM workflows.
• Familiarity with agent-based frameworks and orchestration of multi-step reasoning with tools, APIs, and external data sources.
• Familiarity with cloud AI solutions such as IBM, Azure, Google, and AWS.
• Work on natural language processing (NLP) tasks and create language models (LLMs) for various applications.
• Design and maintain SQL databases for storing and retrieving data efficiently.
• Utilize machine learning and deep learning techniques to build predictive models.
• Collaborate with cross-functional teams to integrate AI solutions into existing systems.
• Stay updated with the latest advancements in AI technologies, including ChatGPT, Gemini, Claude, and Big Data solutions.
• Write clean, maintainable, and efficient code when required.
• Handle large datasets and perform big data analysis to extract valuable insights.
• Fine-tune pre-trained LLMs on specific types of data and ensure optimal performance.
• Proficiency in cloud services from Amazon AWS.
• Extract and parse text from CVs, application forms, and job descriptions using advanced NLP techniques such as Word2Vec, BERT, and GPT-NER.
• Develop similarity functions and matching algorithms to align candidate skills with job requirements.
• Experience with microservices, Flask, FastAPI, and Node.js.
• Expertise in Spark and PySpark for big data processing.
• Knowledge of advanced techniques such as SVD/PCA, LSTM, and NeuralProphet.
• Apply debiasing techniques to ensure fairness and accuracy in the ML pipeline.
• Experience coordinating with clients to understand their needs and delivering AI solutions that meet their requirements.

Qualifications:
• Bachelor's or Master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field.
• In-depth knowledge of NLP techniques and libraries, including Word2Vec, BERT, GPT, and others.
• Experience with database technologies and vector representations of data.
• Familiarity with similarity functions and distance metrics used in matching algorithms.
• Ability to design and implement custom ontologies and classification models.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills.
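The skill-matching responsibility above (aligning candidate skills with job requirements via similarity functions) can be sketched with cosine similarity over skill vectors. The vocabulary and bag-of-skills encoding here are hypothetical stand-ins; a real system would use learned embeddings from Word2Vec or BERT:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def skill_vector(skills, vocabulary):
    """Bag-of-skills encoding; real systems would use embeddings instead."""
    return [1.0 if term in skills else 0.0 for term in vocabulary]

vocab = ["python", "pyspark", "sql", "nlp", "docker"]  # hypothetical vocabulary
job = skill_vector({"python", "pyspark", "sql"}, vocab)
candidate = skill_vector({"python", "sql", "nlp"}, vocab)
score = cosine_similarity(job, candidate)
print(round(score, 3))  # → 0.667
```

Ranking candidates then reduces to sorting by this score against each job's requirement vector.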

Posted 1 week ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

LinkedIn

hackajob is collaborating with Comcast to connect them with exceptional tech professionals for this role. Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.

Job Summary
We are looking for an experienced and proactive ETL Lead to oversee and guide our ETL testing and data validation efforts. This role requires a deep understanding of ETL processes; strong technical expertise in tools such as SQL, Oracle, MongoDB, AWS, and Python/PySpark; and proven leadership capabilities. The ETL Lead will be responsible for ensuring the quality, accuracy, and performance of our data pipelines while mentoring a team of testers and collaborating with cross-functional stakeholders.

Key Responsibilities
Lead the planning, design, and execution of ETL testing strategies across multiple projects. Oversee the development and maintenance of test plans, test cases, and test data for ETL processes. Ensure data integrity, consistency, and accuracy across all data sources and destinations. Collaborate with data engineers, developers, business analysts, and project managers to define ETL requirements and testing scope. Mentor and guide a team of ETL testers, providing technical direction and support. Review and approve test deliverables and ensure adherence to best practices and quality standards. Identify and resolve complex data issues, bottlenecks, and performance challenges. Drive continuous improvement in ETL testing processes, tools, and methodologies. Provide regular status updates, test metrics, and risk assessments to stakeholders. Stay current with emerging trends and technologies in data engineering and ETL testing.

Requirements
6+ years of experience in ETL testing, with at least 2 years in a lead or senior role. Strong expertise in ETL concepts, data warehousing, and data validation techniques. Hands-on experience with Oracle, MongoDB, AWS services (e.g., S3, Redshift, Glue), and Python/PySpark scripting. Advanced proficiency in SQL and other query languages. Proven ability to lead and mentor a team of testers. Excellent problem-solving, analytical, and debugging skills. Strong communication and stakeholder management abilities. Experience with Agile/Scrum methodologies is a plus. Ability to manage multiple priorities and deliver high-quality results under tight deadlines.

Disclaimer
This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law. Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools, that are personalized to meet the needs of your reality - to help support you physically, financially and emotionally through the big milestones and in your everyday life.

Education
Bachelor's Degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.

Relevant Work Experience
7-10 Years
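The data-validation duties above (ensuring integrity and consistency between source and target) are often automated as reconciliation checks. A minimal sketch, with hypothetical in-memory rows standing in for real Oracle/Redshift query results:

```python
def reconcile(source_rows, target_rows, key):
    """Compare row counts and key sets between a source and an ETL target."""
    src_keys = {row[key] for row in source_rows}
    tgt_keys = {row[key] for row in target_rows}
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),
        "unexpected_in_target": sorted(tgt_keys - src_keys),
    }

# Hypothetical extracts from a source table and its warehouse copy.
source = [{"id": 1}, {"id": 2}, {"id": 3}]
target = [{"id": 1}, {"id": 3}]
report = reconcile(source, target, "id")
print(report)
```

In practice each side would be a `SELECT` against the respective system, and the report would feed the test metrics and risk assessments mentioned above.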

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru South, Karnataka, India

On-site

LinkedIn

You Lead the Way. We've Got Your Back. With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you'll learn and grow as we help you create a career journey that's unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you'll be recognized for your contributions, leadership, and impact - every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together.

As part of our diverse tech team, you can design, code and ship software that makes us an essential part of our customers' digital lives. Here, you can work alongside talented data engineers in an open, supportive, inclusive environment where your voice is valued, and you make your own decisions on what tech to use to solve challenging problems. American Express offers a range of opportunities to work with the latest technologies and encourages you to back the broader engineering community through open source. And because we understand the importance of keeping your skills fresh and relevant, we give you dedicated time to invest in your professional development. Find your place in technology on #TeamAmex.

About American Express Technology Business Enablement: The American Express Technology Business Enablement team enables us to transform product development practices through strategic frameworks, processes, tools and actionable insights.

Job Description: As a Data Engineer, you will be responsible for designing, developing, and maintaining robust and scalable frameworks, services, applications, and pipelines for processing huge volumes of data. You will work closely with cross-functional teams to deliver high-quality software solutions that meet our organizational needs.

Key Responsibilities: Design and develop solutions using big data tools and technologies like MapReduce, Hive, Spark etc. Extensive hands-on experience in object-oriented programming using Python, PySpark APIs etc. Experience in building data pipelines for huge volumes of data. Experience in designing, implementing, and managing various ETL job execution flows. Experience in implementing and maintaining data ingestion processes. Hands-on experience in writing basic to advanced optimized queries using HQL, SQL and Spark. Hands-on experience in designing, implementing, and maintaining data transformation jobs using the most efficient tools/technologies. Ensure the performance, quality, and responsiveness of solutions. Participate in code reviews to maintain code quality. Should be able to write shell scripts. Utilize Git for source version control. Set up and maintain CI/CD pipelines. Troubleshoot, debug, and upgrade existing applications and ETL job chains.

Required Skills and Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. Proven experience as a Data Engineer or in a similar role. Strong proficiency in object-oriented programming using Python. Experience with ETL job design principles. Solid understanding of HQL, SQL and data modeling. Knowledge of Unix/Linux and shell scripting principles. Familiarity with Git and version control systems. Experience with Jenkins and CI/CD pipelines. Knowledge of software development best practices and design patterns. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Hands-on experience with Google Cloud.

We back our colleagues and their loved ones with benefits and programs that support their holistic well-being. That means we prioritize their physical, financial, and mental health through each stage of life. Benefits include: Competitive base salaries. Bonus incentives. Support for financial well-being and retirement. Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location). Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need. Generous paid parental leave policies (depending on your location). Free access to global on-site wellness centers staffed with nurses and doctors (depending on location). Free and confidential counseling support through our Healthy Minds program. Career development and training opportunities.

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
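The "basic to advanced optimized queries" requirement above often means window functions for deduplication and latest-record lookups. SQLite stands in here for Hive/Spark SQL (the `ROW_NUMBER() OVER (PARTITION BY ...)` pattern is common to all three), and the table and columns are hypothetical:

```python
import sqlite3

# SQLite stands in for HQL/Spark SQL; the schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE txns (account TEXT, ts INTEGER, amount REAL);
    INSERT INTO txns VALUES
        ('A', 1, 10.0), ('A', 2, 30.0), ('B', 1, 5.0), ('B', 2, 7.0);
""")
# Latest transaction per account via a window function - a common
# "advanced query" pattern in both Hive and Spark SQL.
rows = conn.execute("""
    SELECT account, amount FROM (
        SELECT account, amount,
               ROW_NUMBER() OVER (PARTITION BY account ORDER BY ts DESC) AS rn
        FROM txns
    ) WHERE rn = 1
    ORDER BY account
""").fetchall()
print(rows)  # → [('A', 30.0), ('B', 7.0)]
```

On Spark the same statement would run via `spark.sql(...)`, with partitioning and file layout doing the optimization work that indexes do in a relational database.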

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

LinkedIn

Greetings from TCS! TCS is hiring for PySpark Developer.

Desired Experience Range: 4 to 8 Years
Job Location: Chennai / Mumbai / Pune

Key Responsibilities: Develop, optimize, and maintain big data pipelines using PySpark on distributed computing platforms. Design and implement ETL workflows for ingesting, processing, and transforming large datasets in Hive. Work with structured and unstructured data sources to ensure efficient data storage and retrieval. Optimize Hive queries and Spark jobs for performance, scalability, and cost efficiency. Implement best practices for data engineering, including data governance, security, and compliance. Monitor, troubleshoot, and enhance data workflows to ensure high availability and fault tolerance. Work with cloud platforms (Azure) and big data technologies to scale data solutions.

Required Skills & Qualifications: Strong experience with PySpark for distributed data processing. Hands-on experience with Apache Hive and SQL-based data querying. Proficiency in Python and experience working with large datasets. Familiarity with HDFS, Apache Hadoop, and distributed computing concepts. Good to have: knowledge of cloud-based data platforms like Azure Synapse and Databricks. Understanding of performance tuning for Hive and Spark. Strong problem-solving and analytical skills.

Thanks,
Anshika
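Much of the Hive/Spark performance tuning mentioned above comes down to partition pruning: when data is laid out by a partition key (Hive's `dt=YYYY-MM-DD` directories), a filter on that key lets the engine skip whole partitions. A pure-Python simulation of the idea, with a hypothetical partition layout:

```python
# Simulated partitioned table: keys mimic Hive's dt=... directory layout.
partitions = {
    "dt=2024-01-01": [{"user": "u1"}, {"user": "u2"}],
    "dt=2024-01-02": [{"user": "u3"}],
    "dt=2024-01-03": [{"user": "u4"}, {"user": "u5"}],
}

def scan(partitions, wanted_dt=None):
    """Return (rows_read, partitions_touched) for a simulated query."""
    rows, touched = [], 0
    for name, data in partitions.items():
        if wanted_dt and name != f"dt={wanted_dt}":
            continue  # pruned: this partition is never read
        touched += 1
        rows.extend(data)
    return rows, touched

full, touched_full = scan(partitions)
pruned, touched_pruned = scan(partitions, "2024-01-02")
print(touched_full, touched_pruned)  # → 3 1
```

The same effect in PySpark comes from filtering on the partition column before any wide transformation, so the predicate is pushed down to the file listing.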

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

LinkedIn

Skill: PySpark Developer
Job Locations: Chennai, Pune
Notice Period: Any
Experience: 3-8 years

Job Description: PySpark Developer
Mandatory Skills: Apache Spark, Big Data Hadoop Ecosystem, SparkSQL, Python

Good professional experience in Big Data, PySpark, Hive, Hadoop, and PL/SQL. Good knowledge of AWS and Snowflake. Good understanding of CI/CD and system design. Candidates with prior experience working on fund-transfer technologies or with AML knowledge will have an added advantage. Excellent written and oral communication skills. Self-starter with quick learning abilities. Able to multitask and work under stringent deadlines. Ability to understand and work on various internal systems. Ability to work with multiple stakeholders.

Interested candidates, kindly share your updated resume with preethi.r@ltimindtree.com

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

LinkedIn

Skill: PySpark Developer
Job Locations: Chennai, Pune
Notice Period: Any
Experience: 3-8 years

Job Description: PySpark Developer
Mandatory Skills: Apache Spark, Big Data Hadoop Ecosystem, SparkSQL, Python

Good professional experience in Big Data, PySpark, Hive, Hadoop, and PL/SQL. Good knowledge of AWS and Snowflake. Good understanding of CI/CD and system design. Candidates with prior experience working on fund-transfer technologies or with AML knowledge will have an added advantage. Excellent written and oral communication skills. Self-starter with quick learning abilities. Able to multitask and work under stringent deadlines. Ability to understand and work on various internal systems. Ability to work with multiple stakeholders.

Interested candidates, kindly share your updated resume with preethi.r@ltimindtree.com

Posted 1 week ago

Apply

3.0 years

0 Lacs

India

Remote

LinkedIn

Title: Azure Data Engineer
Location: Remote
Employment type: Full-time with BayOne

We're looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

What You'll Do
Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory. Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable. Work on modern data lakehouse architectures and contribute to data governance and quality frameworks.

Tech Stack
Azure | Databricks | PySpark | SQL

What We're Looking For
3+ years of experience in data engineering or analytics engineering. Hands-on experience with cloud data platforms and large-scale data processing. Strong problem-solving mindset and a passion for clean, efficient data design.

Job Description: Minimum 3 years of experience in modern data engineering/data warehousing/data lakes technologies on cloud platforms like Azure, AWS, GCP, Databricks etc. Azure experience is preferred over other cloud platforms. 5 years of proven experience with SQL, schema design and dimensional data modelling. Solid knowledge of data warehouse best practices, development standards and methodologies. Experience with ETL/ELT tools like ADF, Informatica, Talend etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery etc. Strong experience with big data tools (Databricks, Spark etc.) and programming skills in PySpark and Spark SQL. Be an independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced and dynamic environment. Excellent communication and teamwork abilities.

Nice-to-Have Skills: Knowledge of Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB. SAP ECC/S/4 and HANA knowledge. Intermediate knowledge of Power BI. Azure DevOps and CI/CD deployments; cloud migration methodologies and processes.

BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, status as a veteran, and basis of disability or any federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.
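The dimensional-modelling requirement above usually means star schemas: a fact table of measures and foreign keys joined to dimension tables. A toy sketch with hypothetical tables (in a warehouse this would be a SQL join; plain Python keeps the example self-contained):

```python
# Toy star schema: a sales fact table keyed to a product dimension.
dim_product = {
    1: {"name": "ferry ticket", "category": "travel"},
    2: {"name": "car rental", "category": "travel"},
}
fact_sales = [
    {"product_key": 1, "amount": 25.0},
    {"product_key": 2, "amount": 40.0},
    {"product_key": 1, "amount": 30.0},
]

# Aggregate revenue by product name - the classic fact-to-dimension join.
revenue = {}
for row in fact_sales:
    name = dim_product[row["product_key"]]["name"]
    revenue[name] = revenue.get(name, 0.0) + row["amount"]
print(revenue)  # → {'ferry ticket': 55.0, 'car rental': 40.0}
```

Keeping descriptive attributes in the dimension and only keys and measures in the fact is what keeps such warehouse queries cheap at scale.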

Posted 1 week ago

Apply

10.0 - 20.0 years

10 - 20 Lacs

Hyderabad

Work from Office

Naukri

Maddisoft has the following immediate opportunity; let us know if you or someone you know would be interested. Send in your resume ASAP, along with your LinkedIn profile, without which applications will not be considered. Call us now!

Job Title: Solution Architect
Job Location: Hyderabad, India

Responsibilities
Interprets and delivers impactful strategic plans improving data integration, data quality, and data delivery in support of business initiatives and roadmaps. Designs the structure and layout of data systems, including databases, warehouses, and lakes. Selects and implements database management systems that meet the organization's needs by defining data schemas, optimizing data storage, and establishing data access controls and security measures. Defines and implements the long-term technology strategy and innovation roadmaps across analytics, data engineering, and data platforms. Designs and implements processes for the ETL of data from various sources into the organization's data systems. Translates high-level business requirements into data models and appropriate metadata, test data, and data quality standards. Manages senior business stakeholders to secure strong engagement and ensures that the delivery of the project aligns with longer-term strategic roadmaps. Simplifies the existing solution architecture, delivering reusable services and cost-saving opportunities in line with the policies and standards of the company. Leads and participates in the peer review and quality assurance of project architectural artifacts across the EA group through governance forums. Defines and manages standards, guidelines, and processes to ensure data quality. Works with IT teams, business analysts, and data analytics teams to understand data consumers' needs and develop solutions. Evaluates and recommends emerging technologies for data management, storage, and analytics.

Job Requirements
Bachelor's degree in Computer Science, Information Sciences or a related discipline and 5-8 years of relevant experience (e.g., IT solutions architecture, enterprise architecture, and systems and application design), or 12-15 years of related experience. Broad technical expertise in at least one area, such as application development, enterprise applications or IT systems engineering. Excellent communication skills - able to effectively communicate highly technical information in non-technical terminology (written and verbal). Expert in change management principles associated with new technology implementations. Deep understanding of project management principles.

Preferred Qualifications
Strong understanding of Azure cloud services. Develop and maintain strong relationships with various business areas and IT teams to understand their needs and challenges. Proactively identify opportunities for collaboration and engagement across IT teams. At least five years of relevant experience in design and implementation of data models for enterprise data warehouse initiatives. Experience leading projects involving data warehousing, data modeling, and data analysis. Design experience in Azure Databricks, PySpark, and Power BI/Tableau. Strong ability in programming languages such as Java, Python, and C/C++. Ability in data science languages/tools such as SQL, R, SAS, or Excel. Proficiency in the design and implementation of modern solution architectures and concepts such as cloud services (AWS, Azure, GCP), real-time data distribution (Kafka, Dataflow), and modern data warehouse tools (Snowflake, Databricks). Experience with database technologies such as SQL, NoSQL, Oracle, Hadoop, or Teradata. Understanding of entity-relationship modeling, metadata systems, and data quality tools and techniques. Ability to think strategically and relate architectural decisions and recommendations to business needs and client culture. Ability to assess traditional and modern solution architecture components based on business needs. Experience with business intelligence tools and technologies such as ETL, Power BI, and Tableau. Ability to regularly learn and adopt new technology, especially in the ML/AI realm. Strong analytical and problem-solving skills. Ability to synthesize and clearly communicate large volumes of complex information to senior management of various technical understandings. Ability to collaborate and excel in complex, cross-functional teams involving data scientists, business analysts, and stakeholders. Ability to guide solution design and architecture to meet business needs.

Posted 1 week ago

Apply

5.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

LinkedIn

Job Description
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

In this role, you will: Manage the end-to-end RTB process and relevant activities across all Data Analytics systems per the prescribed HSBC and ITIL standards. Own and manage incidents, problem records and change management, per HSBC ITIL standards. Manage major (P1) issues and take them to closure. Handle and improve batch monitoring, issue resolution and service requests. Manage the implementation of disaster recovery and role-swap of various production systems. Manage production releases and deployment. Monitor and report on all aspects of production processing (incidents, problems, service level indicators etc.), using automated dashboards. Work closely with the engineering/product team to ensure automation of production processes. Promote Site Reliability Engineering culture. Adopt and follow Agile and DevOps principles. Communicate effectively with various business and IT teams. Collaborate effectively with global members to achieve desired outcomes. Work during India office hours (9 AM to 6 PM) but provide 24x7 on-call support out of office hours on a rotation/need basis. Ensure adherence to different SLAs. Understand monitoring tools and evolve the tooling to improve the Operations team's ability to proactively deal with production issues. Comply with audit, data protection and IT security procedures to ensure the integrity of the system. Be open to learning new technologies based on project requirements.

Requirements
To be successful in this role, you should meet the following requirements: Awareness of master data management patterns and principles. Knowledge of Hadoop applications, IBM's Master Data Management platform, WebSphere, document management etc. Experience with Big Data (Hadoop) principles is preferred. Knowledge and understanding of the banking domain and IT infrastructure is preferred. Sound knowledge of PL/SQL, Oracle, Unix, shell scripting, and scheduling tools like Control-M. Proven experience of working in RTB/production support teams. Hands-on experience and sound understanding of HSBC and ITIL processes and standards. Good communication skills; a team player in a global team. Ability to prioritise and work independently within a diverse team environment. Open to working in shifts/non-regular hours as required. Knowledge of technologies running Java-based applications in a Linux/Unix environment. 5-6 years of overall IT experience with Java, PL/SQL, Unix, Big Data, QlikView, BI tools (Qlik and Power BI), Hadoop etc. Knowledge of data analysis. Python and PySpark are an added advantage. Experience with Agile methodology and exposure to project management tools like Jira, Confluence, and GitHub is a plus.

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSDI
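Batch monitoring and issue resolution as described above often begins with automatic retry of transient failures before paging on-call staff. A minimal sketch; the job, its failure mode, and the escalation point are hypothetical (a scheduler like Control-M would handle this declaratively):

```python
def run_with_retry(job, max_attempts=3):
    """Run a batch job, retrying on failure; returns (result, attempts).
    After the final attempt the exception propagates, which is where a
    production scheduler would raise an incident for the on-call engineer."""
    for attempt in range(1, max_attempts + 1):
        try:
            return job(), attempt
        except RuntimeError:
            if attempt == max_attempts:
                raise  # escalate: retries exhausted

# Hypothetical flaky job that succeeds on its third run.
calls = {"n": 0}
def flaky_job():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "loaded 1,000 rows"

result, attempts = run_with_retry(flaky_job)
print(result, attempts)  # → loaded 1,000 rows 3
```

Separating "retryable" from "fatal" error types is the key design choice: only transient failures (timeouts, lock contention) should consume retry attempts.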

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

LinkedIn

Job Description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist In this role, you will: Ability to convert business problem to an analytical problem and then finding pertinent solutions Providing high-quality analysis and recommendations to business problems. Efficient project management and delivery Ability to conceptualize data driven solutions for the business problem at hand for multiple businesses/region to facilitate efficient decision making Focus on driving efficiency gains and enhancement of processes. Use of data to improve customer outcomes through the provision of insight and challenge Also drive business benefit through self-initiated projects Requirements To be successful in this role, you should meet the following requirements: Sound knowledge of Python, PySpark, SQL, Excel, Hadoop and Hive, Big data understanding is added advantage. 
Experience with common project management tools such as Jira and Confluence Experience of coaching/leading more junior team members An understanding of big data and how data pipelines are established and maintained An understanding of data assets and experience with at least one common data visualization tool Experience working with colleagues in other regions such as APAC and Europe Strong risk assessment skills to identify upcoming risks and issues to project delivery Awareness of data management such as data sharing across borders and data quality guidelines A general awareness of banking products for both retail and commercial clients Display leadership and act as a role model for all colleagues Effective and clear communication to stakeholders Proven track record of working under pressure and juggling multiple priorities You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSDI

Posted 1 week ago

Apply

2.0 years

0 Lacs

Karnataka, India

On-site


Who You’ll Work With This role is part of Nike’s Content Technology team within the Consumer Product and Innovation (CP&I) organization, working very closely with the globally distributed Engineering and Product teams. This role will roll up to the Director of Software Engineering based out of Nike India Tech Centre. Who We Are Looking For We are looking for an experienced, technology-focused, hands-on Lead Engineer to join our team in Bengaluru, India. As a Senior Data Engineer, you will play a key role in ensuring that our data products are robust and capable of supporting our Data Engineering and Business Intelligence initiatives. A data engineer with 2+ years of experience in data engineering. Proficient in SQL, Python, PySpark, and Apache Airflow (or similar workflow management tools). Hands-on experience with Databricks, Snowflake, and cloud platforms (AWS/GCP/Azure). Good understanding of Spark, Delta Lake, Medallion architecture, and ETL/ELT processes. Solid data modeling and data profiling skills. Familiarity with Agile methodologies (Scrum/Kanban). Awareness of DevOps practices in data engineering (automated testing, security administration, workflow orchestration) Exposure to Kafka or real-time data processing Strong communication and collaboration skills. Preferred: familiarity with Tableau or similar BI tools; exposure to GenAI/ML pipelines. Nice to have: Databricks certifications for data engineer, developer, or Apache Spark. What You’ll Work On Build and maintain ETL/ELT pipelines and reusable data components. Collaborate with peers and stakeholders to gather data requirements. Participate in code reviews and contribute to quality improvements. Monitor and troubleshoot data pipelines for performance and reliability. Support CI/CD practices in data engineering workflows.
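The Medallion (bronze/silver/gold) layering this posting mentions can be sketched in miniature. This is a hedged, stdlib-only illustration: real pipelines would use Spark and Delta tables, and every field name and record here is invented for the example.

```python
# Illustrative Medallion sketch: bronze = raw, silver = cleaned, gold = aggregated.
from collections import defaultdict

bronze = [  # raw ingested events, duplicates and bad rows included
    {"order_id": "1", "sku": "AF1", "qty": "2"},
    {"order_id": "1", "sku": "AF1", "qty": "2"},    # exact duplicate
    {"order_id": "2", "sku": "AM90", "qty": "bad"}, # unparseable qty
    {"order_id": "3", "sku": "AF1", "qty": "1"},
]

def to_silver(rows):
    """Deduplicate on a business key and enforce types; drop rows that fail."""
    seen, out = set(), []
    for r in rows:
        key = (r["order_id"], r["sku"])
        if key in seen:
            continue
        try:
            qty = int(r["qty"])
        except ValueError:
            continue  # a real pipeline would quarantine the row instead
        seen.add(key)
        out.append({"order_id": r["order_id"], "sku": r["sku"], "qty": qty})
    return out

def to_gold(rows):
    """Aggregate units per SKU into a BI-ready table."""
    agg = defaultdict(int)
    for r in rows:
        agg[r["sku"]] += r["qty"]
    return dict(agg)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'AF1': 3}
```

The same shape maps directly onto Spark: `to_silver` becomes `dropDuplicates` plus casts, `to_gold` becomes a `groupBy().agg()` written to a gold Delta table.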

Posted 1 week ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics - D and A – Azure Databricks - Senior We’re looking for candidates with strong technology and data understanding in the big data engineering space, with proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team. Your Key Responsibilities Develop & deploy Azure Databricks solutions in a cloud environment using Azure Cloud services ETL design, development, and deployment to cloud services Interact with onshore teams, understand their business goals, and contribute to the delivery of the workstreams Design and optimize model code for faster execution Skills And Attributes For Success 3 to 7 years of experience in developing data ingestion, data processing and analytical pipelines for big data, relational databases, NoSQL, and data warehouse solutions Extensive hands-on experience implementing data migration and data processing using Azure services: Databricks, ADLS, Azure Data Factory, Azure Functions, Synapse/DW, Azure SQL DB, etc. 
Hands-on programming experience in Python/PySpark Good knowledge of DWH concepts and implementation knowledge of Snowflake Well versed in DevOps and CI/CD deployments Must have hands-on experience in SQL and procedural SQL languages Strong analytical skills and enjoys solving complex technical problems To qualify for the role, you must Be a computer science graduate or equivalent with 3 to 7 years of industry experience Have working experience in an Agile-based delivery methodology (preferable) Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution Strong analytical skills and enjoys solving complex technical problems Proficiency in software development best practices Excellent debugging and optimization skills Experience in enterprise-grade solution implementations & in converting business problems/challenges to technical solutions considering security, performance, scalability, etc. Excellent communicator (written and verbal, formal and informal) Participate in all aspects of the solution delivery life cycle including analysis, design, development, testing, production deployment, and support Client management skills Ideally, you’ll also have Experience in the Banking and Capital Markets domains Skills And Attributes For Success Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates Strong communication, presentation and team building skills and experience in producing high-quality reports, papers, and presentations Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint What We Look For A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment An opportunity to be a part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. 
Opportunities to work with EY Advisory practices globally with leading businesses across a range of industries What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
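The ETL design work this role describes very often centers on incremental (watermark-based) loads. Below is a hedged sketch of that pattern; `sqlite3` stands in for the real source and sink so the example is self-contained, and the table and column names are assumptions for illustration only.

```python
# Watermark-based incremental load: copy only rows newer than the last run.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src(id INTEGER, amount REAL, updated_at TEXT);
CREATE TABLE sink(id INTEGER, amount REAL, updated_at TEXT);
INSERT INTO src VALUES (1, 10.0, '2024-01-01'), (2, 20.0, '2024-01-03');
""")

def incremental_load(con, watermark):
    """Copy rows changed since the watermark; return the new watermark."""
    rows = con.execute(
        "SELECT id, amount, updated_at FROM src WHERE updated_at > ?",
        (watermark,),
    ).fetchall()
    con.executemany("INSERT INTO sink VALUES (?, ?, ?)", rows)
    # New watermark = max timestamp seen this run (unchanged if no rows)
    return max((r[2] for r in rows), default=watermark)

wm = incremental_load(con, "2024-01-02")
print(wm, con.execute("SELECT COUNT(*) FROM sink").fetchone()[0])  # 2024-01-03 1
```

In Azure Data Factory the same idea appears as a lookup of the stored watermark, a filtered copy activity, and a watermark update at the end of the run.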

Posted 1 week ago

Apply

3.0 years

0 Lacs

Kanayannur, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics - D and A – Azure Databricks - Senior We’re looking for candidates with strong technology and data understanding in the big data engineering space, with proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team. Your Key Responsibilities Develop & deploy Azure Databricks solutions in a cloud environment using Azure Cloud services ETL design, development, and deployment to cloud services Interact with onshore teams, understand their business goals, and contribute to the delivery of the workstreams Design and optimize model code for faster execution Skills And Attributes For Success 3 to 7 years of experience in developing data ingestion, data processing and analytical pipelines for big data, relational databases, NoSQL, and data warehouse solutions Extensive hands-on experience implementing data migration and data processing using Azure services: Databricks, ADLS, Azure Data Factory, Azure Functions, Synapse/DW, Azure SQL DB, etc. 
Hands-on programming experience in Python/PySpark Good knowledge of DWH concepts and implementation knowledge of Snowflake Well versed in DevOps and CI/CD deployments Must have hands-on experience in SQL and procedural SQL languages Strong analytical skills and enjoys solving complex technical problems To qualify for the role, you must Be a computer science graduate or equivalent with 3 to 7 years of industry experience Have working experience in an Agile-based delivery methodology (preferable) Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution Strong analytical skills and enjoys solving complex technical problems Proficiency in software development best practices Excellent debugging and optimization skills Experience in enterprise-grade solution implementations & in converting business problems/challenges to technical solutions considering security, performance, scalability, etc. Excellent communicator (written and verbal, formal and informal) Participate in all aspects of the solution delivery life cycle including analysis, design, development, testing, production deployment, and support Client management skills Ideally, you’ll also have Experience in the Banking and Capital Markets domains Skills And Attributes For Success Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates Strong communication, presentation and team building skills and experience in producing high-quality reports, papers, and presentations Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint What We Look For A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment An opportunity to be a part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. 
Opportunities to work with EY Advisory practices globally with leading businesses across a range of industries What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics - D and A – Azure Databricks - Senior We’re looking for candidates with strong technology and data understanding in the big data engineering space, with proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team. Your Key Responsibilities Develop & deploy Azure Databricks solutions in a cloud environment using Azure Cloud services ETL design, development, and deployment to cloud services Interact with onshore teams, understand their business goals, and contribute to the delivery of the workstreams Design and optimize model code for faster execution Skills And Attributes For Success 3 to 7 years of experience in developing data ingestion, data processing and analytical pipelines for big data, relational databases, NoSQL, and data warehouse solutions Extensive hands-on experience implementing data migration and data processing using Azure services: Databricks, ADLS, Azure Data Factory, Azure Functions, Synapse/DW, Azure SQL DB, etc. 
Hands-on programming experience in Python/PySpark Good knowledge of DWH concepts and implementation knowledge of Snowflake Well versed in DevOps and CI/CD deployments Must have hands-on experience in SQL and procedural SQL languages Strong analytical skills and enjoys solving complex technical problems To qualify for the role, you must Be a computer science graduate or equivalent with 3 to 7 years of industry experience Have working experience in an Agile-based delivery methodology (preferable) Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution Strong analytical skills and enjoys solving complex technical problems Proficiency in software development best practices Excellent debugging and optimization skills Experience in enterprise-grade solution implementations & in converting business problems/challenges to technical solutions considering security, performance, scalability, etc. Excellent communicator (written and verbal, formal and informal) Participate in all aspects of the solution delivery life cycle including analysis, design, development, testing, production deployment, and support Client management skills Ideally, you’ll also have Experience in the Banking and Capital Markets domains Skills And Attributes For Success Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates Strong communication, presentation and team building skills and experience in producing high-quality reports, papers, and presentations Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint What We Look For A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment An opportunity to be a part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. 
Opportunities to work with EY Advisory practices globally with leading businesses across a range of industries What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

5.0 - 10.0 years

16 - 25 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Hybrid


Greetings from Accion Labs!!! We are looking for a Sr Data Engineer Location: Bangalore, Mumbai, Pune, Hyderabad, Noida Experience: 5+ years Notice Period: Immediate joiners / 15 days Any references would be appreciated!!! Job Description / Skill set: Python/Spark/PySpark/Pandas SQL AWS EMR/Glue/S3/RDS/Redshift/Lambda/SQS/AWS Step Functions/EventBridge

Posted 1 week ago

Apply

5.0 - 10.0 years

16 - 31 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Hybrid


Greetings from Accion Labs!!! We are looking for a Sr Data Engineer Location: Bangalore, Mumbai, Pune, Hyderabad, Noida Experience: 5+ years Notice Period: Immediate joiners / 15 days Any references would be appreciated!!! Job Description / Skill set: Python/Spark/PySpark/Pandas SQL AWS EMR/Glue/S3/RDS/Redshift/Lambda/SQS/AWS Step Functions/EventBridge Real-time analytics

Posted 1 week ago

Apply

6.0 - 10.0 years

20 - 30 Lacs

Pune, Ahmedabad, Mumbai (All Areas)

Hybrid


Must be an expert in SQL, Data Lake, Azure Data Factory, Azure Synapse, ETL, Databricks Must be an expert in data modeling and writing complex queries in SQL, stored procedures / PySpark notebooks for handling complex data transformations Required Candidate profile - Ability to convert SQL code to PySpark - Strong programming skills in languages such as SQL and Python - Experience with data modelling, data warehousing & dimensional modelling concepts
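The "convert SQL code to PySpark" requirement above is mostly a matter of mapping clauses to DataFrame calls. A hedged illustration follows: the SQL runs via the stdlib `sqlite3` module so the snippet is self-contained, and the equivalent PySpark chain (with assumed column names) is shown in comments.

```python
# A grouped aggregation in SQL, with the matching PySpark translation in comments.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sales(region TEXT, amount REAL);
INSERT INTO sales VALUES ('EU', 100), ('EU', 50), ('US', 70);
""")

sql = """
SELECT region, SUM(amount) AS total
FROM sales
GROUP BY region
HAVING SUM(amount) > 60
ORDER BY total DESC
"""
# Equivalent PySpark (illustrative; assumes df holds the sales table):
#   (df.groupBy("region")                      # GROUP BY
#      .agg(F.sum("amount").alias("total"))    # SELECT SUM(...) AS total
#      .filter(F.col("total") > 60)            # HAVING
#      .orderBy(F.col("total").desc()))        # ORDER BY ... DESC
print(con.execute(sql).fetchall())  # [('EU', 150.0), ('US', 70.0)]
```

The mapping is mechanical for this class of query; correlated subqueries and procedural T-SQL usually need restructuring into joins, window functions, or separate DataFrame steps.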

Posted 1 week ago

Apply

8.0 - 12.0 years

25 - 30 Lacs

Chennai

Work from Office


Job description Job Title: Manager Data Engineer - Azure Location: Chennai (On-site) Experience: 8 - 12 years Employment Type: Full-Time About the Role We are seeking a highly skilled Senior Azure Data Solutions Architect to design and implement scalable, secure, and efficient data solutions supporting enterprise-wide analytics and business intelligence initiatives. You will lead the architecture of modern data platforms, drive cloud migration, and collaborate with cross-functional teams to deliver robust Azure-based solutions. Key Responsibilities Architect and implement end-to-end data solutions using Azure services (Data Factory, Databricks, Data Lake, Synapse, Cosmos DB). Design robust and scalable data models, including relational, dimensional, and NoSQL schemas. Develop and optimize ETL/ELT pipelines and data lakes using Azure Data Factory, Databricks, and open formats such as Delta and Iceberg. Integrate data governance, quality, and security best practices into all architecture designs. Support analytics and machine learning initiatives through structured data pipelines and platforms. Collaborate with data engineers, analysts, data scientists, and business stakeholders to align solutions with business needs. Drive CI/CD integration with Databricks using Azure DevOps and tools like DBT. Monitor system performance, troubleshoot issues, and optimize data infrastructure for efficiency and reliability. Stay current with Azure platform advancements and recommend improvements. Required Skills & Experience Extensive hands-on experience with Azure services: Data Factory, Databricks, Data Lake, Azure SQL, Cosmos DB, Synapse. • Expertise in data modeling and design (relational, dimensional, NoSQL). • Proven experience with ETL/ELT processes, data lakes, and modern lakehouse architectures. • Proficiency in Python, SQL, Scala, and/or Java. • Strong knowledge of data governance, security, and compliance frameworks. 
• Experience with CI/CD, Azure DevOps, and infrastructure as code (Terraform or ARM templates). • Familiarity with BI and analytics tools such as Power BI or Tableau. • Excellent communication, collaboration, and stakeholder management skills. • Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field. Preferred Qualifications • Experience in regulated industries (finance, healthcare, etc.). • Familiarity with data cataloging, metadata management, and machine learning integration. • Leadership experience guiding teams and presenting architectural strategies to leadership. Why Join Us? • Work on cutting-edge cloud data platforms in a collaborative, innovative environment. • Lead strategic data initiatives that impact enterprise-wide decision-making. • Competitive compensation and opportunities for professional growth.
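The ELT pipelines this architecture role describes usually hinge on an idempotent upsert (MERGE) step so that reruns update rather than duplicate. A hedged sketch: SQLite's `ON CONFLICT` stands in for Delta Lake's `MERGE INTO`, and the table and names are invented for illustration.

```python
# Rerun-safe upsert into a dimension table: the MERGE pattern in miniature.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE dim_customer(id INTEGER PRIMARY KEY, name TEXT)")

def upsert(con, rows):
    """Insert new keys, update existing ones; safe to re-run on the same batch."""
    con.executemany(
        "INSERT INTO dim_customer VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
        rows,
    )

upsert(con, [(1, "Asha"), (2, "Ravi")])
upsert(con, [(2, "Ravi K"), (3, "Mei")])  # id 2 updated, id 3 inserted, no dupes
print(con.execute("SELECT * FROM dim_customer ORDER BY id").fetchall())
# [(1, 'Asha'), (2, 'Ravi K'), (3, 'Mei')]
```

On Databricks the equivalent is `MERGE INTO dim_customer USING updates ON ... WHEN MATCHED THEN UPDATE ... WHEN NOT MATCHED THEN INSERT ...` against a Delta table.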

Posted 1 week ago

Apply

5.0 - 8.0 years

14 - 18 Lacs

Chennai

Work from Office


Job description Job Title: Lead Data Engineer - Azure | GeakMinds | Chennai Location: Chennai (On-site) Experience: 5 - 8 years Employment Type: Full-Time About the Role We are seeking a highly skilled Senior Azure Data Solutions Architect to design and implement scalable, secure, and efficient data solutions supporting enterprise-wide analytics and business intelligence initiatives. You will lead the architecture of modern data platforms, drive cloud migration, and collaborate with cross-functional teams to deliver robust Azure-based solutions. Key Responsibilities Architect and implement end-to-end data solutions using Azure services (Data Factory, Databricks, Data Lake, Synapse, Cosmos DB). Design robust and scalable data models, including relational, dimensional, and NoSQL schemas. Develop and optimize ETL/ELT pipelines and data lakes using Azure Data Factory, Databricks, and open formats such as Delta and Iceberg. Integrate data governance, quality, and security best practices into all architecture designs. Support analytics and machine learning initiatives through structured data pipelines and platforms. Collaborate with data engineers, analysts, data scientists, and business stakeholders to align solutions with business needs. Drive CI/CD integration with Databricks using Azure DevOps and tools like DBT. Monitor system performance, troubleshoot issues, and optimize data infrastructure for efficiency and reliability. Stay current with Azure platform advancements and recommend improvements. Required Skills & Experience Extensive hands-on experience with Azure services: Data Factory, Databricks, Data Lake, Azure SQL, Cosmos DB, Synapse. • Expertise in data modeling and design (relational, dimensional, NoSQL). • Proven experience with ETL/ELT processes, data lakes, and modern lakehouse architectures. • Proficiency in Python, SQL, Scala, and/or Java. • Strong knowledge of data governance, security, and compliance frameworks. 
• Experience with CI/CD, Azure DevOps, and infrastructure as code (Terraform or ARM templates). • Familiarity with BI and analytics tools such as Power BI or Tableau. • Excellent communication, collaboration, and stakeholder management skills. • Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field. Preferred Qualifications • Experience in regulated industries (finance, healthcare, etc.). • Familiarity with data cataloging, metadata management, and machine learning integration. • Leadership experience guiding teams and presenting architectural strategies to leadership. Why Join Us? • Work on cutting-edge cloud data platforms in a collaborative, innovative environment. • Lead strategic data initiatives that impact enterprise-wide decision-making. • Competitive compensation and opportunities for professional growth.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Company Description Blend is a premier AI services provider, committed to co-creating meaningful impact for its clients through the power of data science, AI, technology, and people. With a mission to fuel bold visions, Blend tackles significant challenges by seamlessly aligning human expertise with artificial intelligence. The company is dedicated to unlocking value and fostering innovation for its clients by harnessing world-class people and data-driven strategy. We believe that the power of people and AI can have a meaningful impact on your world, creating more fulfilling work and projects for our people and clients. For more information, visit www.blend360.com Job Description We are looking for an experienced Senior Data Engineer with a strong foundation in Python, SQL, and Spark, and hands-on expertise in AWS and Databricks. In this role, you will build and maintain scalable data pipelines and architecture to support analytics, data science, and business intelligence initiatives. You’ll work closely with cross-functional teams to drive data reliability, quality, and performance. Responsibilities Design, develop, and optimize scalable data pipelines using Databricks in AWS, with services such as Glue, S3, Lambda, and EMR, and Databricks notebooks, workflows, and jobs. Building a data lake in AWS Databricks. Build and maintain robust ETL/ELT workflows using Python and SQL to handle structured and semi-structured data. Develop distributed data processing solutions using Apache Spark or PySpark. Partner with data scientists and analysts to provide high-quality, accessible, and well-structured data. Ensure data quality, governance, security, and compliance across pipelines and data stores. Monitor, troubleshoot, and improve the performance of data systems and pipelines. Participate in code reviews and help establish engineering best practices. Mentor junior data engineers and support their technical development. 
Qualifications Requirements Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 5+ years of hands-on experience in data engineering, with at least 2 years working with AWS Databricks. Strong programming skills in Python for data processing and automation. Advanced proficiency in SQL for querying and transforming large datasets. Deep experience with Apache Spark/PySpark in a distributed computing environment. Solid understanding of data modelling, warehousing, and performance optimization techniques. Proficiency with AWS services such as Glue, S3, Lambda and EMR. Experience with version control (Git or AWS CodeCommit). Experience with workflow orchestration tools like Airflow or AWS Step Functions is a plus.
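The "ensure data quality ... across pipelines" responsibility above is commonly implemented as a quality gate that runs before data is loaded downstream. This is a hedged, stdlib-only sketch; the fields, threshold, and sample rows are assumptions for illustration (tools like Great Expectations or Delta Live Tables expectations cover the same ground in production).

```python
# Minimal pre-load data-quality gate: per-field null ratios plus a pass/fail flag.
def quality_report(rows, required, max_null_ratio=0.1):
    """Return {field: null_ratio} for required fields and an overall verdict."""
    n = len(rows)
    ratios = {
        f: sum(1 for r in rows if r.get(f) is None) / n
        for f in required
    }
    return ratios, all(v <= max_null_ratio for v in ratios.values())

rows = [
    {"id": 1, "amount": 9.5},
    {"id": 2, "amount": None},  # one missing value out of four
    {"id": 3, "amount": 4.0},
    {"id": 4, "amount": 2.0},
]
ratios, ok = quality_report(rows, ["id", "amount"], max_null_ratio=0.3)
print(ratios, ok)  # {'id': 0.0, 'amount': 0.25} True
```

In a real pipeline the gate would fail the run (or route rows to quarantine) when `ok` is false, and the ratios would be logged as pipeline metrics.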

Posted 1 week ago

Apply

9.0 - 14.0 years

0 Lacs

Pune, Maharashtra, India

On-site


The Applications Development Senior Programmer Analyst (data engineering senior programmer analyst) is an intermediate-level position responsible for participation in the establishment and implementation of new or revised data platform ecosystems and programs in coordination with the Technology team. The overall objective of this role is to contribute to the data engineering scrum team to implement the business requirements: Responsibilities: Build and maintain batch or real-time data pipelines in the data platform. Maintain and optimize the data infrastructure required for accurate extraction, transformation, and loading of data from a wide variety of data sources. Develop ETL (extract, transform, load) processes to help extract and manipulate data from multiple sources. Monitor and control all phases of the development process and analysis, design, construction, testing, and implementation, as well as provide user and operational support on applications to business users. Automate data workflows such as data ingestion, aggregation, and ETL processing. Prepare raw data in Data Warehouses into a consumable dataset for both technical and non-technical stakeholders. Build, maintain, and deploy data products for analytics and data science teams on the data platform. Ensure data accuracy, integrity, privacy, security, and compliance through quality control procedures. Monitor data systems performance and implement optimization solutions. Has the ability to operate with a limited level of direct supervision. Can exercise independence of judgement and autonomy. Acts as SME to senior stakeholders and /or other team members. 
Serve as advisor or coach to new or lower-level analysts Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. Qualifications: 9 to 14 years of relevant experience in a data engineering role Advanced SQL/RDBMS skills and experience with relational databases and database design. Strong proficiency in object-oriented languages: Python and PySpark are a must Experience working with Big Data - Hive/Impala/S3/HDFS Experience working with data ingestion tools such as Talend or Ab Initio. Nice to have: experience working with data lakehouse architectures such as AWS Cloud/Airflow/Starburst/Iceberg Strong proficiency in scripting languages like Bash and UNIX shell scripting Strong proficiency in data pipeline and workflow management tools Strong project management and organizational skills. Excellent problem-solving, communication, and organizational skills. Proven ability to work independently and with a team. Experience in managing and implementing successful projects Ability to adjust priorities quickly as circumstances dictate Consistently demonstrates clear and concise written and verbal communication Education: Bachelor’s degree/University degree or equivalent experience ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. 
------------------------------------------------------
Other Relevant Skills
For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
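The ETL responsibilities this role describes (extract from multiple sources, transform into a consumable dataset, load into a target) can be illustrated with a minimal sketch. It is written in plain Python rather than PySpark so it stays self-contained; the source records, field names, and the positive-amount quality rule are all invented for illustration.

```python
# Minimal ETL sketch: extract records from two hypothetical sources,
# normalize them to one schema, apply a quality gate, and "load" the result.

def extract():
    # Stand-ins for two upstream systems with differing schemas.
    source_a = [{"id": 1, "amt": "10.50"}, {"id": 2, "amt": "3.25"}]
    source_b = [{"user_id": 3, "amount_cents": 700}]
    return source_a, source_b

def transform(source_a, source_b):
    # Normalize both feeds to a common schema: {"id": int, "amount": float}.
    rows = [{"id": r["id"], "amount": float(r["amt"])} for r in source_a]
    rows += [{"id": r["user_id"], "amount": r["amount_cents"] / 100} for r in source_b]
    # Basic quality gate: drop rows with non-positive amounts.
    return [r for r in rows if r["amount"] > 0]

def load(rows, target):
    # A real pipeline would write to a warehouse table; here we append to a list.
    target.extend(rows)
    return len(rows)

warehouse = []
a, b = extract()
loaded = load(transform(a, b), warehouse)
```

In a real PySpark pipeline the same three stages would typically map to DataFrame reads, transformations, and writes, with the quality gate expressed as a filter.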

Posted 1 week ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site


What impact will you make? Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration and high performance. As the undisputed leader in professional services, Deloitte is where you will find unrivaled opportunities to succeed and realize your full potential.

The Team
Deloitte’s practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning. Learn more about Analytics and Information Management Practice.

Job Title: Consultant Data Scientist
Reports To: Head of Data Science (Marketing)
Division: Digital Data & Analytics

Role Purpose
The Data Science (Marketing) team is responsible for developing cutting-edge analytics and data science solutions that enable the design and execution of personalized customer communications across a range of channels. This includes both owned and paid media and covers existing customers and prospects. As a Lead Data Scientist, you will collaborate with business stakeholders, data scientists, data engineers, platform engineering, and architecture teams to drive business outcomes and advance Marketing's data science capabilities. A key focus will be building predictive models and event/trigger-based audiences by leveraging customer transaction data, behavioural signals, and demographics across online and offline sources.
Key Responsibilities
Lead the application of advanced analytics and machine learning across Marketing to drive strategic, data-informed decisions.
Design, build, and deploy predictive models and trigger-based audiences for use in Marketing communications across multiple media channels.
Enhance and manage feature engineering processes and feature store development for Marketing use cases.
Integrate digital analytics (e.g., Adobe Analytics) into data science models and use cases.
Deliver outcomes aligned with the AI/ML roadmap and architecture capabilities.
Continuously identify, promote, and embed new advanced analytics techniques within the team and across the Marketing function.
Present complex analytical findings in a clear and actionable manner using visualisation tools and insight presentations tailored to senior stakeholders.
Develop innovative analytic approaches to uncover unmet customer needs and improve financial wellbeing.
Champion standardisation and reusability of code, features, and solutions to improve time to insight.

Key Performance Indicators (KPIs)
Timely and high-quality delivery of data science solutions aligned to business goals.
Recognition as a key contributor to innovation and the development of Marketing analytics capabilities.
Measurable improvements in insight delivery timelines through standardized and reusable frameworks.
Strong stakeholder relationships; known as a trusted advisor within the business.

Essential Capabilities
Passion for analytics and data science, with awareness of the latest industry trends.
Proven ability to solve business problems using data and analytical techniques.
Strong strategic and customer-centric mindset.
Ability to communicate complex concepts clearly to non-technical stakeholders.
Strong problem-solving, documentation, and analytical thinking skills.
Self-management and accountability for personal tasks and project deliverables.
Collaboration and influencing skills across cross-functional teams.
Essential Experience
10+ years of experience in data science, analytics, or related fields.
Expertise in building, deploying, and operationalising machine learning models.
Proficiency in SQL, Python, PySpark, and Spark ML.
Hands-on experience with cloud platforms such as AWS (preferred), Azure, or GCP.
Skilled in using GitHub for version control and collaboration.
Strong experience in working with both structured and unstructured datasets.
Familiarity with AutoML platforms such as DataRobot or H2O.ai.
Deep understanding of customer and digital data, and their use in marketing and communications.
Experience with agile delivery methodologies and collaboration tools.

Desirable Experience
Leadership in the use of AI/ML and analytics to guide Marketing strategy.
Input into development of AI roadmaps and alignment with architecture/platform capabilities.
Ownership of continuous improvement in analytics tools, techniques, and adoption.
Skilled in visualising data using tools such as Tableau, Power BI, or similar.
Experience in developing innovative solutions that address both met and unmet customer needs.

Qualification Requirements
Bachelor’s or postgraduate degree in Statistics, Mathematics, Computer Science, Economics, Engineering, or related disciplines.

Key Stakeholder Groups
Principal Data Scientist
Product Owners across the Marketing domain
Data Engineering (Marketing)

Our purpose
Deloitte is led by a purpose: to make an impact that matters. Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work, always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world.
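The role's core task of building event/trigger-based audiences from transaction and behavioural data can be sketched very simply. The customer records, the spend threshold, and the 30-day lapse window below are all hypothetical, chosen only to show the shape of such a rule.

```python
# Sketch of a trigger-based audience: select customers whose recent behaviour
# crosses a (hypothetical) threshold, the kind of rule that might seed a
# Marketing communication before a predictive model refines it.

from datetime import date, timedelta

customers = [
    {"id": "c1", "last_purchase": date(2024, 1, 2), "spend_90d": 480.0},
    {"id": "c2", "last_purchase": date(2024, 3, 10), "spend_90d": 120.0},
    {"id": "c3", "last_purchase": date(2024, 3, 28), "spend_90d": 610.0},
]

def lapsed_high_value(customers, as_of, min_spend=400.0, lapse_days=30):
    """Customers with high trailing spend who have gone quiet recently."""
    cutoff = as_of - timedelta(days=lapse_days)
    return [c["id"] for c in customers
            if c["spend_90d"] >= min_spend and c["last_purchase"] < cutoff]

audience = lapsed_high_value(customers, as_of=date(2024, 4, 15))
```

At production scale the same logic would normally run as a PySpark job over the feature store rather than over in-memory records.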

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Gurugram, Haryana, India

On-site


What impact will you make? Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration and high performance. As the undisputed leader in professional services, Deloitte is where you will find unrivaled opportunities to succeed and realize your full potential.

The Team
Deloitte’s practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning. Learn more about Analytics and Information Management Practice.

As a Data Engineer, you will bring extensive expertise in data handling and curation capabilities to the team. You’ll be responsible for building intelligent domains using market-leading tools, ultimately improving the way we work in Marketing.

Experience
It is expected that the role holder will most likely have the following qualifications and experience:
5-9 years in data engineering and software development, including ELT/ETL, data extraction, and manipulation in Data Lake/Data Warehouse environments.
Expert-level hands-on experience with the following:
Python, SQL
PySpark
dbt and Apache Airflow
Postgres/other RDBMS
DevOps, Jenkins, CI/CD
Data Governance and Data Quality frameworks
Data Lakes, Data Warehouses
AWS services including S3, SNS, SQS, Lambda, EMR, Glue, Athena, EC2, VPC, etc.
Source code control: GitHub, VSTS, etc.
Key Tasks, Accountabilities and Challenges of this role:
Design, develop, test, deploy, maintain, and improve software.
Prepare and maintain systems and program documentation.
Assist in the analysis and development of application programs and databases.
Modify and troubleshoot application programs.
Coach, mentor, and guide junior developer engineers.
Provide key support on fail-and-fix for assigned applications.
Undertake complex testing activities in relation to software solutions.

Our purpose
Deloitte is led by a purpose: to make an impact that matters. Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work, always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world.
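This posting lists Data Quality frameworks among its required skills; the kind of pre-load check such a framework runs can be sketched briefly. The column names, required fields, and rules below are illustrative only, not any specific framework's API.

```python
# Sketch of a data-quality gate a pipeline might run before loading:
# required-field checks and a uniqueness check on the key column.

def run_quality_checks(rows, key="id", required=("id", "email")):
    failures = []
    seen = set()
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) in (None, ""):
                failures.append((i, f"missing {col}"))
        k = row.get(key)
        if k in seen:
            failures.append((i, f"duplicate {key}={k}"))
        seen.add(k)
    return failures

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 1, "email": "b@example.com"},   # duplicate key
    {"id": 2, "email": ""},                # missing email
]
failures = run_quality_checks(rows)
```

In practice such checks would be expressed as dbt tests or as assertions in an Airflow task, failing the run before bad rows reach the warehouse.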

Posted 1 week ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Consultant, Performance Analytics, Advisors & Consulting Services

Advisors & Consulting Services
Services within Mastercard is responsible for acquiring, engaging, and retaining customers by managing fraud and risk, enhancing cybersecurity, and improving the digital payments experience. We provide value-added services and leverage expertise, data-driven insights, and execution. Our Advisors & Consulting Services team combines traditional management consulting with Mastercard’s rich data assets, proprietary platforms, and technologies to provide clients with powerful strategic insights and recommendations. Our teams work with a diverse global customer base across industries, from banking and payments to retail and restaurants. The Advisors & Consulting Services group has five specializations: Strategy & Transformation, Performance Analytics, Business Experimentation, Marketing, and Program Management. Our Performance Analytics consultants translate data into insights by leveraging Mastercard and customer data to design, implement, and scale analytical solutions for customers. They use qualitative and quantitative analytical techniques and enterprise applications to synthesize analyses into clear recommendations and impactful narratives. Positions for different specializations and levels are available in separate job postings.
Please review our consulting specializations to learn more about all opportunities and apply for the position that is best suited to your background and experience: https://careers.mastercard.com/us/en/consulting-specializations-at-mastercard

Roles and Responsibilities

Client Impact
Provide creative input on projects across a range of industries and problem statements.
Contribute to the development of analytics strategies and programs for regional and global clients by leveraging data and technology solutions to unlock client value.
Collaborate with the Mastercard team to understand clients’ needs, agenda, and risks.
Develop working relationships with client analysts/managers, and act as a trusted and reliable partner.

Team Collaboration & Culture
Collaborate with senior project delivery consultants to identify key findings, prepare effective presentations, and deliver recommendations to clients.
Independently identify trends, patterns, issues, and anomalies in a defined area of analysis, and structure and synthesize own analysis to highlight relevant findings.
Lead internal and client meetings, and contribute to project management.
Contribute to the firm's intellectual capital.
Receive mentorship from performance analytics leaders for professional growth and development.

Qualifications

Basic qualifications
Undergraduate degree with data and analytics experience in business intelligence and/or descriptive, predictive, or prescriptive analytics.
Experience managing clients or internal stakeholders.
Ability to analyze large datasets and synthesize key findings.
Proficiency using data analytics software (e.g., Python, R, SQL, SAS).
Advanced Word, Excel, and PowerPoint skills.
Ability to perform multiple tasks with multiple clients in a fast-paced, deadline-driven environment.
Ability to communicate effectively in English and the local office language (if applicable).
Eligibility to work in the country where you are applying, as well as to apply for travel visas as required by travel needs.
Preferred Qualifications
Additional data and analytics experience in building, managing, and maintaining database structures, working with data visualization tools (e.g., Tableau, Power BI), or working with the Hadoop framework and coding using Impala, Hive, or PySpark.
Ability to analyze large datasets and synthesize key findings to provide recommendations via descriptive analytics and business intelligence.
Experience managing tasks or workstreams in a collaborative team environment.
Ability to identify problems, brainstorm and analyze answers, and implement the best solutions.
Relevant industry expertise.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization, and it is therefore expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.

R-249734
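The descriptive-analytics work the role describes, analyzing large datasets and synthesizing key findings, can be sketched in a few lines. The transaction records and category names below are invented for illustration.

```python
# Sketch of descriptive analytics: aggregate hypothetical transaction records
# by merchant category and rank categories by total spend, as a report would.

from collections import defaultdict

transactions = [
    {"category": "grocery", "amount": 42.0},
    {"category": "travel", "amount": 310.0},
    {"category": "grocery", "amount": 18.5},
    {"category": "dining", "amount": 27.0},
]

def spend_by_category(transactions):
    totals = defaultdict(float)
    for t in transactions:
        totals[t["category"]] += t["amount"]
    # Highest-spend categories first, ready for a chart or summary table.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

ranking = spend_by_category(transactions)
```

The same group-and-rank pattern is what a SQL `GROUP BY ... ORDER BY SUM(...) DESC` or a PySpark `groupBy().agg()` would express over a full dataset.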

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies