Jobs
Interviews

97 Data Lakes Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 11.0 years

0 Lacs

Pune, Maharashtra

On-site

Embark upon a transformative journey as a Solutions Architect. At Barclays, you don't just embrace change, you drive it. As a Solutions Architect, you will design, develop, and implement solutions to complex business problems. You will collaborate with stakeholders to understand their needs and requirements, and design and implement solutions that meet those needs while balancing technology risks against business delivery and driving consistency.

To be a successful Solutions Architect, you should have experience designing and building highly scalable, highly resilient global-scale financial systems in a heavily regulated environment. You should have a proven track record of delivering solutions and roadmaps for small, medium, and large complex business and technical projects of strategic significance. Experience owning end-to-end technical and application architecture (current and target states) and working with the relevant business and technical component teams is essential. You will also need experience with the DevOps operating model and tools; technical expertise in Java or other programming languages, data platforms, BI visualization, modern architecture patterns, and cloud capabilities; and hands-on experience architecting cloud solutions. Exposure to service-oriented architecture design principles and to integration and implementation issues, plus knowledge of the technologies used by financial service providers and banking organizations, is important. You should be able to multi-task, handle solutions for multiple projects and stakeholders simultaneously, and manage competing priorities against demanding timelines. Experience working with senior stakeholders and relevant certifications such as TOGAF or BCS accreditation are desired. Additional skills in banking applications and infrastructure, along with an understanding of project lifecycles, major phases, and different project methodologies, are highly valued. The role is based in Pune.

**Purpose of the role:** To design, develop, and implement solutions to complex business problems, collaborating with stakeholders to understand their needs and requirements while balancing technology risks against business delivery and driving consistency.

**Accountabilities:**
- Design and develop solutions as products that can evolve to meet business requirements, aligned with modern software engineering practices.
- Implement technologies and platforms for targeted design activities that maximize the benefit of cloud capabilities.
- Incorporate security principles in best-practice designs to meet the Bank's resiliency expectations.
- Balance risks and controls to deliver agreed business and technology value.
- Adopt standardized solutions, or contribute to their evolution where appropriate.
- Provide support for fault finding and performance issues to operational support teams.
- Assess solution design impact in terms of risk, capacity, and cost.

**Vice President Expectations:**
- Advise key stakeholders and demonstrate leadership in managing risk and strengthening controls.
- Collaborate with other areas of work and contribute to achieving the goals of the business.
- Create solutions based on sophisticated analytical thought and innovative problem-solving.
- Build and maintain trusting relationships with internal and external stakeholders to achieve key business objectives.

All colleagues are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset of Empower, Challenge, and Drive.

Posted 1 day ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Quality Engineer, your primary responsibility will be to analyze business and technical requirements to design, develop, and execute comprehensive test plans for ETL pipelines and data transformations. You will perform data validation, reconciliation, and integrity checks across various data sources and target systems. You will also build and automate data quality checks using SQL and/or Python scripting, and identify, document, and track data quality issues, anomalies, and defects.

Collaboration is key in this role: you will work closely with data engineers, developers, QA, and business stakeholders to understand data requirements and ensure that data quality standards are met. You will define data quality KPIs, implement continuous monitoring frameworks, and participate in data model reviews, providing input on data quality considerations. When data discrepancies arise, you will perform root cause analysis and work with teams to drive resolution, while ensuring alignment with data governance policies, standards, and best practices.

To qualify for this position, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field, and have 4 to 7 years of experience as a Data Quality Engineer, ETL Tester, or in a similar role. A strong understanding of ETL concepts, data warehousing principles, and relational database design is essential, as is proficiency in SQL for complex querying, data profiling, and validation tasks. Familiarity with data quality tools, testing methodologies, and modern cloud data ecosystems (AWS, Snowflake, Apache Spark, Redshift) will be advantageous. Advanced knowledge of SQL, data pipeline tools such as Airflow, DBT, or Informatica, and experience integrating data validation into CI/CD pipelines using tools like GitHub Actions or Jenkins are desired. An understanding of big data platforms, data lakes, non-relational databases, data lineage, master data management (MDM) concepts, and Agile/Scrum development methodologies will help you excel. Excellent analytical and problem-solving skills and strong attention to detail round out the profile.
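By way of illustration, here is a minimal sketch of the kind of automated quality check this posting describes, assuming a pandas DataFrame; the table shape and column names are hypothetical, not taken from the posting:

```python
# Minimal data-quality check sketch (illustrative only).
# Assumes a pandas DataFrame; column names are hypothetical.
import pandas as pd

def run_quality_checks(df: pd.DataFrame, key_col: str, required_cols: list) -> dict:
    """Return a dict of check name -> pass/fail with basic metrics."""
    results = {}
    # Completeness: no nulls allowed in required columns.
    null_counts = df[required_cols].isna().sum()
    results["completeness"] = {
        "passed": bool((null_counts == 0).all()),
        "null_counts": null_counts.to_dict(),
    }
    # Uniqueness: the primary key must not repeat.
    dup_count = int(df[key_col].duplicated().sum())
    results["uniqueness"] = {"passed": dup_count == 0, "duplicates": dup_count}
    return results

if __name__ == "__main__":
    sample = pd.DataFrame({"order_id": [1, 2, 2], "amount": [10.0, None, 5.0]})
    print(run_quality_checks(sample, key_col="order_id", required_cols=["amount"]))
```

In practice such checks would run inside the pipeline (or a CI/CD step) and feed the monitoring and KPI frameworks the role mentions.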

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Data Engineer at GlobalLogic, you will be responsible for architecting, building, and maintaining complex ETL/ELT pipelines for batch and real-time data processing using a variety of tools and programming languages. Your key duties will include optimizing existing data pipelines for performance, cost-effectiveness, and reliability, and implementing data quality checks, monitoring, and alerting mechanisms to ensure data integrity. You will also play a crucial role in ensuring data security, privacy, and compliance with relevant regulations such as GDPR and local data laws.

To excel in this role, you should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field. Excellent analytical, problem-solving, and critical-thinking skills with meticulous attention to detail are essential, as are strong written and verbal communication and interpersonal skills and the ability to collaborate effectively with cross-functional teams. Experience with Agile/Scrum development methodologies is considered a plus.

You will provide technical leadership and architecture by designing and implementing robust, scalable, and efficient data architectures that align with organizational strategy and future growth. You will define and enforce data engineering best practices, evaluate and recommend new technologies, and oversee the end-to-end data development lifecycle. As a leader, you will mentor and guide a team of data engineers, conduct code reviews, provide feedback, and promote a culture of engineering excellence. You will collaborate closely with data scientists, data analysts, software engineers, and business stakeholders to understand data requirements and translate them into technical solutions, communicating complex technical concepts and data strategies effectively to both technical and non-technical audiences.

At GlobalLogic, we offer a culture of caring, continuous learning and development opportunities, interesting and meaningful work, balance and flexibility, and a high-trust environment. By joining our team, you will have the chance to work on impactful projects, engage your curiosity and problem-solving skills, and contribute to shaping cutting-edge solutions that redefine industries. With a commitment to integrity and trust, GlobalLogic provides a safe, reliable, and ethical global environment where you can thrive both personally and professionally.

Posted 2 days ago

Apply

5.0 - 10.0 years

0 Lacs

Karnataka

On-site

As an experienced information technology professional with 5 to 10 years in the field, you will be responsible for creating data models for corporate analytics in compliance with standards, ensuring usability and conformance across the enterprise. Your role will involve developing data strategies, ensuring vocabulary consistency, and managing data transformations through intricate analytical relationships and access paths, including data mappings at the data-field level.

Collaborating with Product Management and business stakeholders, you will identify and evaluate the data sources necessary to achieve project and business objectives. Working closely with Tech Leads and Product Architects, you will gain insight into end-to-end data implications, data integration, and the functioning of business systems. You will also collaborate with DQ Leads to address data integrity improvements and quality resolutions at the source. This role requires domain knowledge in supply chain, retail, or inventory management.

The critical skills for this position include a strong understanding of various software platforms and development technologies; proficiency in SQL, RDBMS, data lakes, and warehouses; and knowledge of the Hadoop ecosystem, Azure, ADLS, Kafka, Apache Delta, and Databricks/Spark. Experience with data modeling tools such as ER/Studio or Erwin would be advantageous. Effective collaboration with Product Managers, technology teams, and business partners, along with familiarity with Agile and DevOps techniques, is essential, as are excellent written and verbal communication skills.

Preferred qualifications include a bachelor's degree in business information technology, computer science, or a related discipline. This is a full-time position located in Bangalore (Bengaluru), Delhi, Kolkata, or Navi Mumbai. If you meet these requirements and are interested in this opportunity, please apply online. The digitalxnode evaluation team will review your resume and, if your profile is selected, reach out to you about next steps. We will retain your information in our database for future job openings.

Posted 2 days ago

Apply

15.0 - 19.0 years

0 Lacs

Hyderabad, Telangana

On-site

We are looking for a highly skilled and experienced Data Architect to join our team. With at least 15 years of experience in data engineering and analytics, the ideal candidate will have a proven track record of designing and implementing complex data solutions. As a Senior Principal Data Architect, you will play a key role in designing, creating, deploying, and managing Blackbaud's data architecture. This position holds significant technical influence within the Data Platform and Data Engineering teams and the Data Intelligence Center of Excellence at Blackbaud. You will act as an evangelist for proper data strategy across various teams within Blackbaud and provide technical guidance, particularly in the area of data, for other projects.

Responsibilities:
- Develop and direct the strategy for all aspects of Blackbaud's Data and Analytics platforms, products, and services.
- Set, communicate, and facilitate technical direction for the AI Center of Excellence and beyond, collaboratively.
- Design and develop innovative products, services, or technological advancements in the Data Intelligence space to drive business expansion.
- Collaborate with product management to create technical solutions that address customer business challenges.
- Take ownership of technical data governance practices to ensure data sovereignty, privacy, security, and regulatory compliance.
- Challenge existing practices and drive innovation in the data space.
- Create a data access strategy to securely democratize data and support research, modeling, machine learning, and artificial intelligence initiatives.
- Contribute to defining tools and pipeline patterns used by engineers and data engineers for data transformation and analytics support.
- Work within a cross-functional team to translate business requirements into data architecture solutions.
- Ensure that data solutions prioritize performance, scalability, and reliability.
- Mentor junior data architects and team members.
- Stay updated on technology trends such as distributed computing, big data concepts, and architecture.
- Advocate internally for the transformative power of data at Blackbaud.

Required Qualifications:
- 15+ years of experience in data and advanced analytics.
- Minimum of 8 years of experience with data technologies in Azure/AWS.
- Proficiency in SQL and Python.
- Expertise in SQL Server, Azure Data Services, and other Microsoft data technologies.
- Familiarity with Databricks and Microsoft Fabric.
- Strong grasp of data modeling, data warehousing, data lakes, data mesh, and data products.
- Experience with machine learning.
- Excellent communication and leadership abilities.

Preferred Qualifications:
- Experience with .NET/Java and microservice architecture.

Posted 2 days ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

The ideal candidate for this position has 8-12 years of experience and a strong understanding of, and hands-on experience with, Microsoft Fabric. You will be responsible for designing and implementing end-to-end data solutions on Microsoft Azure, including data lakes, data warehouses, and ETL/ELT processes, and for developing scalable, efficient data architectures that support large-scale data processing and analytics workloads. Ensuring high performance, security, and compliance within Azure data solutions will be a key aspect of the role. You should understand techniques such as the lakehouse and the warehouse and have experience implementing them. You will also evaluate and select appropriate Azure services, such as Azure SQL Database, Azure Synapse Analytics, Azure Data Lake Storage, Azure Databricks, Unity Catalog, and Azure Data Factory; deep knowledge of and hands-on experience with these Azure data services are essential.

You will collaborate closely with business and technical teams to understand data needs and translate them into robust, scalable data architecture solutions, and you should have experience with data governance, data privacy, and compliance requirements. Excellent communication and interpersonal skills are necessary for effective collaboration with cross-functional teams. In this role, you will provide expertise and leadership to the development team implementing data engineering solutions, work with data scientists, analysts, and other stakeholders to ensure data architectures align with business goals and data analysis requirements, and optimize cloud-based data infrastructure for performance, cost-effectiveness, and scalability.

Experience with programming languages such as SQL, Python, and Scala is required, and hands-on experience with MS SQL Server, Oracle, or similar RDBMS platforms is preferred. Familiarity with Azure DevOps and CI/CD pipeline development is beneficial. An in-depth understanding of database design principles and of distributed processing for big data batch or streaming pipelines is essential, along with knowledge of data visualization tools such as Power BI and Tableau, data modeling, and strong analytics skills. The candidate should be able to convert OLTP data structures into a star schema, and ideally has DBT and data modeling experience. A problem-solving attitude, self-motivation, attention to detail, and effective task prioritization are essential qualities for this role.

At Hitachi, attitude and aptitude are highly valued because collaboration is key. While not all of the following skills are required, experience with Azure SQL Data Warehouse, Azure Data Factory, Azure Data Lake, Azure Analysis Services, Databricks/Spark, Python or Scala, data modeling, Power BI, and database migration is desirable, and designing conceptual, logical, and physical data models using tools like ER Studio and Erwin is a plus.
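To make the OLTP-to-star-schema conversion mentioned above concrete, here is a minimal, hypothetical sketch in Python; the flat extract, table names, and surrogate-key scheme are illustrative assumptions, not anything specified in the posting:

```python
# Hypothetical illustration: splitting a flat OLTP extract into a star
# schema (one fact table plus a customer dimension) with pandas.
import pandas as pd

oltp = pd.DataFrame({
    "order_id": [1, 2, 3],
    "customer_name": ["Asha", "Ravi", "Asha"],
    "customer_city": ["Pune", "Mumbai", "Pune"],
    "amount": [120.0, 80.0, 45.0],
})

# Dimension: one row per unique customer, with a surrogate key.
dim_customer = (
    oltp[["customer_name", "customer_city"]]
    .drop_duplicates()
    .reset_index(drop=True)
)
dim_customer["customer_key"] = dim_customer.index + 1

# Fact: measures plus a foreign key into the dimension.
fact_orders = oltp.merge(dim_customer, on=["customer_name", "customer_city"])[
    ["order_id", "customer_key", "amount"]
]
print(dim_customer)
print(fact_orders)
```

At warehouse scale the same split would typically be expressed in SQL or DBT models rather than pandas, but the dimension/fact separation is the same.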

Posted 2 days ago

Apply

5.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job description:
- 5+ years of proven work experience in data modelling-related projects as a Data Modeler.
- Understand and translate business needs into data models supporting long-term solutions.
- Ability to understand data relationships and design data models that reflect those relationships and facilitate efficient ingestion, processing, and consumption of data.
- Responsibility for developing conceptual, logical, and physical data models, and for implementing RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL).
- Experience working with databases, including OLAP/OLTP-based data modeling.
- Perform reverse engineering of physical data models from databases and SQL scripts.
- Analyze data-related system integration challenges and propose appropriate solutions.
- Experience with market-leading cloud platforms such as Google Cloud Platform (GCP) and Amazon Web Services (AWS).
- Experience working with third normal form (3NF) models.
- Excellent problem-solving and communication skills; experience interacting with technical and non-technical stakeholders at all levels.
- Bachelor's degree in computer science, information technology, or equivalent work experience.
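As a rough illustration of the reverse-engineering bullet, here is a minimal sketch using SQLAlchemy's inspector; the in-memory SQLite database and its tables are hypothetical stand-ins for a production RDBMS:

```python
# Reverse-engineering a physical data model with SQLAlchemy's inspector.
from sqlalchemy import create_engine, inspect, text

engine = create_engine("sqlite://")  # stand-in for a real RDBMS connection
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)"))
    conn.execute(text(
        "CREATE TABLE orders (id INTEGER PRIMARY KEY, "
        "customer_id INTEGER REFERENCES customer(id), amount REAL)"
    ))

insp = inspect(engine)
for table in insp.get_table_names():
    print(f"Table: {table}")
    for col in insp.get_columns(table):
        print(f"  column {col['name']}: {col['type']}")
    for fk in insp.get_foreign_keys(table):
        print(f"  FK {fk['constrained_columns']} -> "
              f"{fk['referred_table']}{fk['referred_columns']}")
```

The printed structure is the raw material from which conceptual and logical models can then be drawn up in a modeling tool.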

Posted 2 days ago

Apply

10.0 - 12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At CoffeeBeans Consulting, we are a dynamic software consulting firm helping clients solve complex problems through innovative technology solutions. We're expanding our Data & AI practice and looking for a Lead Consultant (Data & AI) to help us build and deliver high-impact data and AI projects for our clients.

Role Overview:
We're looking for a hands-on technical leader who combines deep expertise in AI/ML (traditional ML and GenAI) with strong skills in modern data engineering and architecture. This role is:
- Hands-on engineering and architecture, building and implementing solutions.
- Client-facing consulting, understanding business problems and translating them into technical solutions.
- Mentoring and guiding junior team members, fostering a high-performance team culture.
You will work closely with clients to design, build, and deliver impactful Data & AI solutions while establishing best practices and maintaining a high bar for technical quality.

Key Responsibilities:

Solution Design & Delivery
- Architect and implement end-to-end AI/ML solutions, including: data pipelines and scalable data architectures; traditional ML models and workflows; and GenAI and agentic AI systems, including retrieval-augmented generation (RAG) and LLM-based applications.
- Ensure delivery of high-quality, scalable, and maintainable solutions aligned with client needs.
- Establish and advocate best practices for MLOps and data engineering workflows.

Consulting & Client Engagement
- Act as a trusted technical advisor for clients, shaping their Data & AI strategy and roadmaps.
- Translate business problems into technical solutions with clear articulation of value.
- Facilitate technical discussions and workshops with stakeholders to gather requirements and guide solutions.

Technical Leadership
- Lead by example, contributing hands-on to critical parts of projects.
- Set code and architectural standards for AI and data projects.
- Stay current with industry trends and advancements in AI/ML, GenAI, and data engineering.

Team Development
- Mentor and upskill junior engineers and data scientists within the team.
- Foster a collaborative, inclusive, and supportive team environment.
- Support the hiring and onboarding of new team members as we grow our Data & AI capability.

Key Requirements:

Experience:
- 10-12 years in the software/technology industry.
- Minimum 5+ years of experience designing and building AI/ML solutions in production environments.
- Strong experience with data engineering and architecture in production systems.
- Experience working in consulting or client-facing delivery environments.

Technical Skills:
- Deep knowledge of traditional ML (classification, regression, NLP, computer vision) and GenAI (LLMs, embeddings, RAG, agentic AI workflows).
- Hands-on experience with AI/ML frameworks (TensorFlow, PyTorch, Scikit-learn, LangChain/LlamaIndex).
- Proficiency in Python (and/or R, Scala) for data and AI workloads.
- Experience with data engineering tools and orchestration frameworks (Spark, Databricks, Kafka, Airflow).
- Strong familiarity with cloud platforms (AWS, GCP, Azure) for deploying AI and data solutions.
- Understanding of MLOps practices (CI/CD for ML, monitoring, model retraining pipelines).
- Experience with data modeling, data lakes, and data pipeline architecture.

Leadership & Mindset:
- Ability to lead and mentor technical teams in delivery environments.
- A consulting mindset with the ability to communicate effectively with technical and non-technical stakeholders.
- Empathetic leadership style, fostering trust and team growth.
- Comfortable in fast-paced, dynamic, client-facing environments.

Nice to Have:
- Experience with LLM fine-tuning and optimization.
- Strong hands-on experience with Databricks for scalable data and AI workloads.
- Exposure to agentic AI frameworks and advanced orchestration for LLM-powered workflows.
- Certifications in cloud or AI/ML specializations.
- Experience growing Data & AI teams within a consulting environment.

Why Join Us?
- Opportunity to shape and expand our Data & AI practice from the ground up.
- Work with diverse clients to solve meaningful and challenging problems.
- Be part of a collaborative, people-first culture with a focus on growth and learning.
- Competitive compensation and career advancement opportunities.
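For illustration of the RAG pattern named in the requirements, here is a minimal, framework-free sketch; the embed() and generate() functions are hypothetical stand-ins for a real embedding model and LLM call, which the posting does not specify:

```python
# Minimal retrieval-augmented generation (RAG) skeleton, illustrative only.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Hypothetical: replace with a real embedding model.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.random(128)
    return v / np.linalg.norm(v)

def retrieve(query: str, docs: list, k: int = 2) -> list:
    # Rank documents by cosine similarity to the query embedding.
    q = embed(query)
    scores = [(float(np.dot(q, embed(d))), d) for d in docs]
    return [d for _, d in sorted(scores, reverse=True)[:k]]

def generate(prompt: str) -> str:
    # Hypothetical: replace with an actual LLM call.
    return f"[LLM answer grounded in a prompt of {len(prompt)} chars]"

docs = ["Policy A covers refunds within 30 days.", "Policy B covers shipping."]
context = "\n".join(retrieve("What is the refund window?", docs))
print(generate(f"Context:\n{context}\n\nQuestion: What is the refund window?"))
```

Production systems would swap in a vector store and a hosted or local LLM, but the retrieve-then-generate flow is the core of the pattern.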

Posted 2 days ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

You should have at least 2 years of professional experience implementing data pipelines using Databricks and a data lake, and a minimum of 3 years of hands-on programming experience in Python within a cloud environment (preferably AWS). Two years of professional experience with real-time streaming systems such as Event Grid and event topics would be highly advantageous. You must possess expert-level knowledge of SQL to write complex, highly optimized queries for processing large volumes of data. Experience developing conceptual, logical, and/or physical database designs using tools such as ErWin, Visio, or Enterprise Architect is expected, along with a minimum of 2 years of hands-on experience with databases such as Snowflake, Redshift, Synapse, Oracle, SQL Server, Teradata, Netezza, Hadoop, MongoDB, or Cassandra. Knowledge of, or experience with, architectural best practices for building data lakes is a must. Strong problem-solving and troubleshooting skills are necessary, along with the ability to make sound judgments independently. You should be capable of working independently and providing guidance to junior data engineers.

If you meet the above requirements and are ready to take on this challenging role, we look forward to your application.

Warm regards,
Rinka Bose
Talent Acquisition Executive
Nivasoft India Pvt. Ltd.
Mobile: +91-9632249758 (India) | 732-334-3491 (U.S.A.)
Email: rinka.bose@nivasoft.com | Web: https://nivasoft.com/
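As a small example of the kind of optimized analytical SQL the role calls for, here is a window-function query that keeps only the latest event per user; SQLite is used only so the sketch runs anywhere, and the table and columns are hypothetical:

```python
# Window-function dedup over an event table, illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (user_id INT, event_ts TEXT, payload TEXT);
INSERT INTO events VALUES
  (1, '2024-01-01', 'a'), (1, '2024-01-03', 'b'), (2, '2024-01-02', 'c');
""")

latest_per_user = """
SELECT user_id, event_ts, payload
FROM (
  SELECT *,
         ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY event_ts DESC) AS rn
  FROM events
) t
WHERE rn = 1;
"""
for row in conn.execute(latest_per_user):
    print(row)  # -> one latest row per user
```

On a warehouse engine the same pattern scales to billions of rows, and partitioning and clustering keys become the main optimization levers.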

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an AWS Data Engineer, you should have at least 3 years of experience in AWS data engineering. Your main responsibilities will include designing and building ETL pipelines and data lakes to automate the ingestion of both structured and unstructured data. You will need to be proficient with AWS big data technologies such as Redshift, S3, AWS Glue, Kinesis, Athena, DMS, EMR, and Lambda for serverless ETL processes. Knowledge of SQL and NoSQL query languages is essential, along with experience building batch and real-time pipelines. The role requires excellent programming and debugging skills in Scala or Python, as well as expertise in Spark. You should have a good understanding of data lake formation, Apache Spark, and Python, and hands-on experience deploying models. Experience with production migration processes is a must, and familiarity with Power BI visualization tools and connectivity would be advantageous.

In this position, you will design, build, and operationalize large-scale enterprise data solutions and applications. You will also analyze, re-architect, and re-platform on-premise data warehouses onto data platforms in the AWS cloud, and create production data pipelines from ingestion to consumption using Python or Scala within the AWS big data architecture. Additionally, you will conduct detailed assessments of current-state data platforms and develop suitable transition paths to the AWS cloud. If you possess strong data engineering skills and are looking for a challenging role in AWS data engineering, this opportunity may be the right fit for you.
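As a sketch of the ingestion-to-consumption pipeline work described above, here is a minimal PySpark batch job; the bucket names and schema are hypothetical, and in practice this would run on EMR or Glue with appropriate IAM credentials:

```python
# Illustrative PySpark batch-ETL sketch: raw S3 JSON -> curated parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Ingest raw, semi-structured events from a (hypothetical) landing bucket.
raw = spark.read.json("s3://example-raw-bucket/orders/")

# Basic cleansing: dedup on the business key, derive a partition column,
# and drop obviously invalid rows.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Write the curated, partitioned consumption layer.
(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-curated-bucket/orders/"))
```

The curated parquet layer is then queryable from Athena or Redshift Spectrum, matching the serverless consumption side of the stack the posting lists.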

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

As a Snowflake Data Engineer at our organization, you will play a vital role in designing, developing, and maintaining our data infrastructure. Your responsibilities will include ingesting, transforming, and distributing data using Snowflake and AWS technologies, and collaborating with various stakeholders to ensure efficient data pipelines and secure data operations.

You will design and implement data pipelines using Snowflake and AWS technologies, leveraging tools such as SnowSQL, Snowpipe, NiFi, Matillion, and DBT to ingest, transform, and automate data integration processes. Implementing role-based access controls and managing AWS resources will be crucial for ensuring data security and supporting Snowflake operations. You will also optimize Snowflake queries and data models for performance and scalability.

To excel in this role, you should have strong proficiency in SQL and Python, along with hands-on experience with Snowflake and AWS services. An understanding of ETL/ELT tools, data warehousing concepts, and data quality techniques is essential. Your analytical skills, problem-solving abilities, and excellent communication skills will enable you to collaborate effectively with data analysts, data scientists, and other team members. Preferred skills include experience with data virtualization, machine learning and AI concepts, data governance, and data security best practices. Staying current with the latest advancements in Snowflake and AWS technologies will also be essential.

If you are a passionate and experienced Snowflake Data Engineer with 5 to 7 years of experience, we invite you to apply and be a part of our team. This is a full-time position based in Gurgaon, with a hybrid work mode accommodating India, UK, and US work shifts.
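To illustrate the ingestion side of this stack, here is a minimal sketch using the snowflake-connector-python package; the account details, stage, and table names are hypothetical, and a real pipeline would more likely rely on Snowpipe auto-ingest than an ad hoc COPY:

```python
# Minimal Snowflake ingestion sketch (illustrative; credentials and
# object names are placeholders).
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="...",
    warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()
try:
    cur.execute("CREATE TABLE IF NOT EXISTS raw_orders (payload VARIANT)")
    # COPY INTO pulls staged files, e.g. from an external S3 stage.
    cur.execute("""
        COPY INTO raw_orders
        FROM @orders_stage
        FILE_FORMAT = (TYPE = 'JSON')
    """)
finally:
    cur.close()
    conn.close()
```

Downstream, DBT models would transform the raw VARIANT payloads into conformed tables, which is where the query and model optimization duties of the role come in.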

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

You will collaborate with stakeholders, including domain leads in Operations, IT, and Data, to understand business needs and shape the vision and roadmap for data-driven initiatives aligned with strategic priorities. You will contribute to the development of the program vision and communicate the product and portfolio vision to your team. Working closely with data scientists, engineers, and designers, you will ensure products are built efficiently, meet user needs, and provide actionable insights.

As a Data Product Owner, you will analyze data sources, data technologies, and vendors providing data services to leverage in developing the data product roadmap. You will create the necessary ER diagrams, data models, and PRDs/BRDs to convey requirements, and be accountable for developing and achieving product-level KPIs. Managing data products with a moderate degree of strategy, scope, and complexity, you will ensure data accuracy, consistency, and security by establishing data governance frameworks and implementing data management best practices.

In this role, you will collaborate with technology and business leadership to align system/application integrations with business goals and priorities. You will own and maintain the product backlog, prioritize its contents, and ensure clear, actionable user stories. Additionally, you will set priorities, actively participate in squad/team quarterly planning, and work closely with the agile working group to clarify business requirements, remove roadblocks, and support alignment around product strategy.

Monitoring and maintaining product health to support long-term product viability and efficiency, you will balance long- and short-term costs with desired outcomes. You will analyze and report on the feasibility, cost-of-delay ramifications, economics, and other aspects of planned or potential changes to the product. Understanding regulatory, compliance, and industry constraints on the product, you will negotiate with internal and external teams to ensure priorities are aligned across squads/teams both within and outside the portfolio.

To qualify for this position, you should hold a Bachelor's degree in Computer Science, Business Administration, or a related field; a Master's degree is preferred. You must have a good understanding of data technologies such as databases, data warehouses, and data lakes, along with 5+ years of proven experience as a Data Product Owner, Data Product Manager, or in a similar role in data or software development. A strong understanding of Agile methodologies, including Scrum and Kanban, and proficiency in programming languages such as Python, R, SQL, or SAS and in cloud technologies like AWS and Azure are essential. Excellent analytical, problem-solving, decision-making, communication, negotiation, and interpersonal skills are required, along with proficiency in product management tools (such as JIRA, Trello, or Asana) and the Microsoft Office Suite. Familiarity with UX/UI design principles, the software development lifecycle (SDLC), and software engineering concepts is a plus, as is experience in insurance, particularly Commercial & Specialty Insurance products. Agile practitioner capabilities and experience working with or in Agile teams are highly valued. Strong teamwork, coordination, organization, and planning skills are necessary, and the ability to capture complex requirements in a prioritized backlog and manage stakeholders' requirements is vital for success in this role.

Posted 3 days ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

We empower our people to stay resilient and relevant in a constantly changing world. We are looking for individuals who are always seeking creative ways to grow and learn, and who aspire to make a real impact, both now and in the future. If this resonates with you, you would be a valuable addition to our dynamic international team.

We are currently seeking a Senior Software Engineer - Data Engineer (AI Solutions). In this role, you will have the opportunity to:
- Design, build, and maintain data pipelines to serve the requirements of various stakeholders, including software developers, data scientists, analysts, and business teams.
- Ensure that data pipelines are modular, resilient, and optimized for performance and low maintenance.
- Collaborate with AI/ML teams to support training, inference, and monitoring needs through structured data delivery.
- Implement ETL/ELT workflows for structured, semi-structured, and unstructured data using cloud-native tools.
- Work with large-scale data lakes, streaming platforms, and batch processing systems to ingest and transform data.
- Establish robust data validation, logging, and monitoring strategies to uphold data quality and lineage.
- Optimize data infrastructure for scalability, cost-efficiency, and observability in cloud-based environments.
- Ensure adherence to governance policies and data access controls across projects.

To excel in this role, you should possess the following qualifications and skills:
- A Bachelor's degree in Computer Science, Information Systems, or a related field.
- Minimum of 4 years of experience designing and deploying scalable data pipelines in cloud environments.
- Proficiency in Python, SQL, and data manipulation tools and frameworks such as Apache Airflow, Spark, dbt, and Pandas.
- Practical experience with data lakes, data warehouses (e.g., Redshift, Snowflake, BigQuery), and streaming platforms (e.g., Kafka, Kinesis).
- Strong understanding of data modeling, schema design, and data transformation patterns.
- Experience with AWS (Glue, S3, Redshift, SageMaker) or Azure (Data Factory, Azure ML Studio, Azure Storage).
- Familiarity with CI/CD for data pipelines and infrastructure-as-code (e.g., Terraform, CloudFormation).
- Exposure to building data solutions that support AI/ML pipelines, including feature stores and real-time data ingestion.
- Understanding of observability, data versioning, and pipeline testing tools.
- Previous engagement with diverse stakeholders, data requirement gathering, and support for iterative development cycles.
- Background or familiarity with the power, energy, or electrification sector is advantageous.
- Knowledge of security best practices and data compliance policies for enterprise-grade systems.

This position is based in Bangalore, offering you the opportunity to collaborate with teams that impact entire cities and countries and shape the future. Siemens is a global organization of over 312,000 people across more than 200 countries. We are committed to equality and encourage applications from diverse backgrounds that mirror the communities we serve. Employment decisions at Siemens are made based on qualifications, merit, and business requirements. Join us with your curiosity and creativity to help shape a better tomorrow.

Learn more about Siemens careers at: www.siemens.com/careers
Discover the digital world of Siemens here: www.siemens.com/careers/digitalminds
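To picture the orchestration work this posting lists, here is a minimal Airflow DAG sketch; the task names and bodies are hypothetical placeholders for real extract/validate/load logic, and the `schedule` argument assumes Airflow 2.4+ (older releases use `schedule_interval`):

```python
# Sketch of a modular daily pipeline as an Airflow DAG, illustrative only.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    ...  # pull source data (placeholder)

def validate():
    ...  # run data-quality checks (placeholder)

def load():
    ...  # write to the warehouse/lake (placeholder)

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="validate", python_callable=validate)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # linear dependency chain
```

Keeping each task small and idempotent is what makes such pipelines modular and low-maintenance, which is exactly the property the role emphasizes.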

Posted 3 days ago

Apply

2.0 - 4.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Technology
Job Family Group: IT&S Group

Job Description:

You will work with:
You will be part of a collaborative team of engineers and product managers, working alongside technology and business partners to support data initiatives that contribute to bp's digital transformation and platform capabilities.

Let me tell you about the role:
As a Data Visualization Platform Engineer, you will support the development, integration, and security of data platforms that power enterprise applications. You will work closely with engineers and architects to help maintain performance, resilience, and compliance across bp's cloud and data ecosystems. This role is a great opportunity to grow your platform engineering skills while contributing to real-world solutions.

What you will deliver:
- Assist in platform engineering activities, including configuration, integration, and maintenance of enterprise data systems.
- Support CI/CD implementation and Infrastructure-as-Code adoption to improve consistency and efficiency.
- Help monitor and improve platform performance, availability, and reliability.
- Collaborate on basic security operations, including monitoring, identity access controls, and remediation activities.
- Participate in the delivery of data pipelines and platform features across cloud environments.
- Contribute to documentation, testing, and process improvements across platform workflows.
- Work with teammates to ensure data systems meet compliance, governance, and security expectations.

What you will need to be successful (experience and qualifications):

Technical skills we need from you:
- Bachelor's degree in technology, engineering, or a related field, or equivalent hands-on experience.
- 2-4 years of experience in IT or platform/data engineering roles.
- Familiarity with CI/CD tools and Infrastructure-as-Code (e.g., Terraform, Azure Bicep, or AWS CDK).
- Basic experience with Python, Java, or Scala for scripting or automation.
- Exposure to data pipeline frameworks (e.g., Airflow, Spark, Kafka) and cloud platforms (AWS, Azure, or GCP).
- Understanding of data modeling, data lakes, SQL/NoSQL databases, and cloud-native tools.
- Ability to work collaboratively with cross-functional teams and follow structured engineering practices.

Essential skills:
- Proven technical expertise in Microsoft Azure, AWS, Databricks, and Palantir.
- Understanding of data pipelines, ingestion, and transformation workflows.
- Awareness of platform security fundamentals and data governance principles.
- Familiarity with data visualization concepts and tools (e.g., Power BI, Tableau, or similar).
- Exposure to distributed systems and to real-time or batch data processing frameworks.
- Willingness to learn and adapt to evolving technologies in data engineering and platform operations.

Skills that set you apart:
- Proven success navigating global, highly regulated environments, ensuring compliance, security, and enterprise-wide risk management.
- AI/ML-driven data engineering expertise, applying intelligent automation to optimize workflows.

About bp:
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. Even though the job is advertised as full time, please contact the hiring manager or the recruiter, as flexible working arrangements may be considered.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSql data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management + 4 more

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position, and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 3 days ago

Apply

2.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Technology
Job Family Group: IT&S Group

Job Description:

You will work with:
You will work as a member of a high-energy, top-performing team of engineers, working alongside technology leaders to shape the vision and drive the execution of ground-breaking compute and data platforms that make a real impact.

Let me tell you about the role:
As an Azure Platform Operations Engineer, you will be responsible for the monitoring, maintenance, and support of cloud solutions using various cloud services and tools. This role is part of a highly focused squad that uses several agile methodologies and techniques to ensure performance, reliability, and operational excellence across multiple facets of the cloud simultaneously.

What you will deliver:
- Maintain and develop scripts and code to automate infrastructure provisioning, monitoring, and configuration using Infrastructure-as-Code (IaC) principles and best practices.
- Monitor and optimize the capacity, performance, and cost of cloud resources based on business needs and budget constraints.
- Ingest and manage persistent data for logging and audit purposes while ensuring data security and compliance.
- Support the maintenance and evolution of cloud solutions: resolving issues, reusing code, improving efficiency, and adopting modern technologies.
- Configure and manage network connectivity, control planes, and internal resource communication across cloud and hybrid environments.
- Support operational excellence by applying engineering best practices, tooling, testing frameworks, and effective written and verbal communication.
- Implement operational cloud security controls, including Zero Trust, IAM, encryption, firewalls, and thorough code reviews, especially for AI-generated code or configurations.

What you will need to be successful (experience and qualifications):
- A bachelor's degree in computer science, engineering, or a related field, or equivalent work experience.
- 2 to 5 years of experience in IT, including up to 2 years as a Cloud Operations Engineer or in a similar role.
- Proficiency in scripting and coding languages such as PowerShell, Python, or C#.
- Strong knowledge of core cloud services, including virtual machines, containers, PaaS offerings, monitoring, storage, and networking.
- Experience with CI/CD tools such as Azure DevOps (ADO) or similar platforms for continuous integration and delivery.
- Familiarity with data platforms including SQL Server, data lakes, and PaaS-based databases.
- Ability to work both independently and collaboratively within cross-functional teams.

About bp:
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. Even though the job is advertised as full time, please contact the hiring manager or the recruiter, as flexible working arrangements may be considered.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSql data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management + 4 more

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position, and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 3 days ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a GCP Developer, you will be responsible for maintaining the stability of production platforms, delivering new features, and minimizing technical debt across various technologies. You should have a minimum of 4 years of experience in the field, a strong commitment to maintaining high standards, and a genuine passion for quality work. Proficiency in GCP, Python, Hadoop, Spark, cloud technologies, Scala, streaming (Pub/Sub), Kafka, SQL, Dataproc, and Dataflow is essential for this role, along with familiarity with data warehouses, distributed data platforms, and data lakes. You should possess knowledge of database definition, schema design, and Looker Views and Models, and a solid understanding of data structures and algorithms is crucial for success in this position. Experience with CI/CD practices would be advantageous. This position involves working in a dynamic environment across multiple locations, including Chennai, Hyderabad, and Bangalore. A total of 20 positions are available for qualified candidates.
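To illustrate the streaming (Pub/Sub) skill named above, here is a minimal publish sketch using the google-cloud-pubsub client library; the project and topic IDs are hypothetical, and real credentials would come from the environment:

```python
# Minimal Google Cloud Pub/Sub publish sketch, illustrative only.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "orders-events")

# Payload is bytes; extra keyword arguments become string attributes.
future = publisher.publish(topic_path, b'{"order_id": 42}', source="checkout")
print("Published message id:", future.result())
```

A Dataflow or Spark streaming job would then subscribe to the topic and land the events in BigQuery or the data lake.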

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a member of the Global IT SAP team at Shure, you will play a vital role in driving business transformation and maximizing business value through the implementation and support of SAP solutions. Reporting to the Associate Director, SAP Business Applications Finance, you will collaborate with internal IT associates and business users globally to build, enhance, and support solutions that align with industry best practices and technology trends.

Your responsibilities will include contributing to requirement gathering, solution design, configuration, testing, and implementation of end-to-end solutions. You will work closely with business stakeholders to understand requirements, provide deep SAP functional expertise, and analyze key integration points. Adhering to IT guiding principles, you will focus on leveraging standard processes, minimizing customization, and driving positive customer experiences. As an SAP Senior Analyst, Finance, you will stay current with evolving SAP technologies, propose innovative solutions, and provide impact analysis on enhancements or new solutions. Additionally, you will offer application support, collaborate with the SAP development and security teams, and ensure compliance with security and data standards.

To qualify for this role, you should hold a Bachelor's degree in Finance, Computer Science, or a related field, with a minimum of 5 years of experience in enterprise systems implementation, specifically SAP FICO on S/4HANA. Experience with data warehousing platforms and tools is desirable, along with a strong understanding of SAP modules, technical components, and project management methodologies.

Key competencies for success in this role include adaptability, critical thinking, customer focus, decision quality, communication, leadership, drive for results, integrity, relationship building, analytical skills, teamwork, collaboration, and influence. The ability to quickly learn new concepts, follow operational policies, and travel to remote facilities when required is essential.

At Shure, we are committed to being the most trusted audio brand worldwide, driven by our core values of quality, reliability, and innovation. If you are passionate about creating an inclusive and diverse work environment and possess the skills to excel in this role, we encourage you to apply and join our team.

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

As a Senior Data Product Owner, you will be responsible for preparing, coordinating, and overseeing the delivery of projects focused on data and artificial intelligence. Your role will be to ensure the design and delivery of innovative, data-driven solutions, in collaboration with technical teams, business teams, and clients. Your main responsibilities will include scoping data and AI needs, defining functional specifications, overseeing design and development, managing projects in an agile way, and ensuring the quality and performance of the delivered solutions. You will also be the primary point of contact for clients on their data/AI projects, ensuring strategic alignment between their objectives and the proposed solutions.

The ideal profile for this position includes a degree in engineering, computer science, or a field related to data/AI, with at least 5 years of experience managing data or AI projects, preferably in an Agile environment. You should have expertise in data and AI, a good understanding of the associated tools and concepts, and strong product management and communication skills. Professional proficiency in English is also required to interact with international clients and teams.

By joining EY FABERNOVEL, you will have the opportunity to work on large-scale projects, benefit from career support, and enjoy attractive perks such as access to exclusive offers, meal allowances, remote-working options, transport reimbursement, and a stimulating work environment conducive to continuous learning.

Posted 4 days ago

Apply

10.0 - 12.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Key Job Responsibilities: VOC - VI (Vulnerability Intelligence), ASM (Attack Surface Management) & VM (Vulnerability Management) Expert. Environment / Context Saint Gobain, world leader in the habitat and construction market, is one of the top 100 global industrial groups. Saint-Gobain is present in 68 countries with 171 000 employees. They design, manufacture and distribute materials and solutions which are key ingredients in the wellbeing of each of us and the future of all. They can be found everywhere in our living places and our daily life: in buildings, transportation, infrastructure and in many industrial applications. They provide comfort, performance and safety while addressing the challenges of sustainable construction, resource efficiency and climate change Saint-Gobain GDI Grou p (250 persons at the head office, including 120 that are internal) is responsible for defining, setting up and managing the Group&aposs Information Systems (IS) and Telecom policy with its 1,000 subsidiaries in 6,500 sites worldwide. The GDI Groupe also carries the common means (infrastructures, telecoms, digital platforms, cross-functional applications ). IN DEC, the IT Development Centre of Saint-Gobain, is an entity with a vision to leverage Indias technical skills in the Information Technology domain to provide timely, high-quality and cost-effective IT solutions to Saint-Gobain businesses globally.Within the Cybersecurity Department, t he Cybersecurity Vulnerability Operations Cent er mission is to Identify, assess and confirm vulnerability and threats that can affect the Group. The CyberVOC teams are based out of Paris and Mumbai and consist of skilled persons working in different Service Lines. Mission We are seeking a highly experienced cybersecurity professional to serve as an VOC Expert supporting the Vulnerability Intelligence (VI), Attack Surface Management (ASM), and Vulnerability Management (VM) teams. This role is pivotal in shaping the strategy, defining technical approaches, and supporting day-to-day operationsparticularly complex escalations and automation efforts. The ideal candidate will combine technical mastery in offensive security with practical experience in vulnerability lifecycle management and external attack surface discovery. The expert will act as a senior advisor and technical authority for the analyst teams, while also contributing to the design, scripting, and documentation of scalable security proceess. The VOC Expert is responsible for: Vulnerability Intelligence (VI) Drive the qualification and risk analysis of newly disclosed vulnerabilities. Perform exploit PoC validation when needed to assess practical risk. 
Maintain and enhance the central VI database, enriched with (EPSS, CVSS, QVS, SG-specific scoring models, and EUVD) Define and automate workflows for: Vulnerability qualification, exposure analysis, and prioritization Ingestion of qualified vulnerability data into the enterprise Data Lake Collaborate on documentation of VI methodology and threat intelligence integration Support proactive communication of high/critical vulnerabilities to asset and application owners Attack Surface Management (ASM): Operate and enhance external asset discovery and continuous monitoring using ASM tools Integrate asset coverage data from CMDB, and other internal datasets Design and implement scripts for: WHOIS/ASN/banner correlation Data enrichment and alert filtering Deploy and maintain custom scanning capabilities (e.g., Nuclei integrations) Provide expert input on threat modeling based on exposed assets and external footprint BlackBox Pentesting: Maintain the service delivery of the BlackBox Pentesting platform Automate the export of pentest data and integrate into Data Lake and Power BI dashboards Define and document onboarding workflows for new applications Actively guide analysts in prioritizing pentest requests and validating results. Vulnerability Management: Vulnerability review, recategorization, and false positive identification Proactive vulnerability testing and replay Pre-analyze and consolidate vulnerability data from various scanning tools Prepare concise syntheses of available vulnerabilities Offer guidance to the SO and CISO on vulnerabilities Collaborate with key stakeholders to develop strategies for vulnerability management Assist in defining vulnerability management KPIs and strategic goals Prepare concise, actionable summaries for high-risk vulnerabilities and trends Automate testing actions: Develop scripts and tooling to automate repetitive and complex tasks across VI, ASM and VM. Implement data pipelines to sync outputs from ASM/VI tools to dashboards and reporting engines. Design streamlined workflows for vulnerability lifecyclefrom detection to closure. Collaborate with both offensive and defensive teams to support App managers and Asset managers in remediating vulnerabilities and issues. Skills and Qualifications: Bachelor&aposs degree in Computer Science, Information Security, EXTC or related field; relevant certifications (e.g., CISSP, CCSP, CompTIA Security+) are a plus Proven experience (10+ years) working within the Cybersecurity field, with a focus on offensive security, vulnerability intelligence and attack surface analysis. Proven experience on Penetration testing actions (web application, infrastructure, ) Proven expertise in: CVE analysis, exploit development/validationExternal asset discovery & mapping Threat modeling and prioritizationAdvanced knowledge of tooling such as: ASM platforms Nuclei, Shodan, Open Source CTI, vulnerability scanners (Qualys, Tenable, ) Pentester tools (Burp, SQLmap, Responder, IDA and Kali environment) Experience in investigating newly published vulnerabilities, assessing their risks, severity. Strong scripting languages (e.g., Python, Bash, Powershell, C#, ) for automation and customization Experience with Pentester tools (Burp, SQLmap and Kali environment) Strong technical skills with an interest in open-source intelligence investigations Experience building dashboards in Power BI or similar tools. Familiarity with data lakes, API integrations, and ETL processes. 
- Knowledge of the NIST CVE database, OWASP Top 10, and Microsoft security bulletins.
- Excellent writing skills in English and the ability to communicate complicated technical challenges in business language to a range of stakeholders.

**Personal Skills:**
- A systematic, disciplined, and analytical approach to problem solving, with thorough leadership skills and experience.
- Excellent ability to think critically under pressure.
- Strong communication skills to convey technical concepts clearly to both technical and non-technical stakeholders.
- Willingness to stay updated with evolving cyber threats, technologies, and industry trends.
- Capacity to work collaboratively with cross-functional teams, developers, and management to implement robust security measures.

**Additional Information:** The position is based in Mumbai (India).
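As an illustration of the automation work this listing describes, here is a minimal, hypothetical sketch of a triage script that blends CVSS severity with EPSS exploit probability and boosts internet-exposed assets. The weights, threshold, and field names are assumptions for illustration only, not Saint-Gobain's actual scoring model (which the listing says also involves QVS and SG-specific models).

```python
# Hypothetical sketch: prioritize disclosed CVEs by combining CVSS severity
# with EPSS exploit-probability scores. Weights, the threshold, and field
# names are illustrative assumptions, not the organization's real model.
from dataclasses import dataclass

@dataclass
class Vulnerability:
    cve_id: str
    cvss: float    # CVSS base score, 0.0-10.0
    epss: float    # EPSS probability of exploitation, 0.0-1.0
    exposed: bool  # asset reachable from the internet (an ASM finding)

def priority_score(v: Vulnerability) -> float:
    """Blend normalized CVSS and EPSS; boost internet-exposed assets."""
    base = 0.5 * (v.cvss / 10.0) + 0.5 * v.epss
    return min(1.0, base * (1.5 if v.exposed else 1.0))

def triage(vulns, threshold=0.6):
    """Return vulnerabilities above the alerting threshold, worst first."""
    scored = [(priority_score(v), v) for v in vulns]
    return sorted((s for s in scored if s[0] >= threshold),
                  key=lambda s: s[0], reverse=True)

if __name__ == "__main__":
    sample = [
        Vulnerability("CVE-2024-0001", cvss=9.8, epss=0.92, exposed=True),
        Vulnerability("CVE-2024-0002", cvss=5.3, epss=0.02, exposed=False),
    ]
    for score, v in triage(sample):
        print(f"{v.cve_id}: priority {score:.2f}")
```

In practice such a script would pull scores from the central VI database and push qualified findings into the Data Lake rather than printing them.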

Posted 4 days ago

Apply

0.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

**Job Summary:** This position provides strategic, analytical, and technical support for data and business intelligence activities. It leverages data to gain key insight into business opportunities and effectively presents these insights to business stakeholders. The position participates in the creation, distribution, and delivery of analytics reports, tables, graphs, and communication materials that effectively summarize findings and support recommendations.

**Primary Skills (must have):**
- Strong hands-on experience with Data Warehouses and building Data Lakes
- Semantic model development (Dimensional, Tabular), SSAS, AAS, LookML (a minimal dimensional-modeling sketch follows this listing)
- Strong dashboarding skills: Power BI (preferred) / Tableau
- Hands-on experience in SQL, DAX, Python, R
- Hands-on experience in Google Cloud Platform & DevOps (CI/CD)
- Strong analytical skills and attention to detail
- Proven ability to quickly learn new applications, processes, and procedures
- Able and willing to collaborate in a team environment and exercise independent judgement
- Excellent verbal and written communication skills
- Ability to form good partner relationships across functions

**Secondary Skills:**
- Google Cloud Platform (preferred), Azure
- Agile experience (Scrum)
- Experience in C#, .NET is preferred

**Responsibilities:**
- Designs, develops, and maintains reports and analytical tools and performs ongoing data quality monitoring and refinement.
- Identifies and analyzes errors and inconsistencies in the data and provides timely resolutions.
- Translates data results into written reports, tables, graphs, and charts to convey information to management and clients.
- Creates ad hoc reports and views on a frequent basis to assist management in understanding, researching, and analyzing issues.
- Uses data mining to extract information from data sets and identify correlations and patterns.
- Organizes and transforms information into comprehensible structures.
- Uses data to predict trends in the customer base and consumer populations and performs statistical analysis of data.
- Identifies and recommends new ways to support budgets by streamlining business processes.

**Preferences:**
- Bachelor's degree (or internationally comparable degree) in Business/Economics, Computer Science, Engineering, Marketing, MIS, Mathematics, or a related discipline.
- Experience with data warehousing, data science software, or similar analytics/business intelligence systems.

Employee Type: Permanent. UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
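As a taste of the dimensional-modeling work named in the primary skills above, below is a minimal pandas sketch that splits raw records into a customer dimension and a shipments fact table. The column names and sample data are invented for illustration and are not UPS data.

```python
# Minimal star-schema sketch: split raw records into a customer dimension
# and a shipments fact table. Column names and data are hypothetical.
import pandas as pd

raw = pd.DataFrame({
    "shipment_id": [1, 2, 3],
    "customer": ["Acme", "Acme", "Globex"],
    "region": ["APAC", "APAC", "EMEA"],
    "weight_kg": [12.5, 3.0, 40.2],
})

# Dimension: one row per distinct customer, with a surrogate key.
dim_customer = (raw[["customer", "region"]]
                .drop_duplicates()
                .reset_index(drop=True))
dim_customer["customer_key"] = dim_customer.index + 1

# Fact: measures plus a foreign key into the dimension.
fact_shipments = (raw.merge(dim_customer, on=["customer", "region"])
                     [["shipment_id", "customer_key", "weight_kg"]])

print(dim_customer)
print(fact_shipments)
```

The same separation of conformed dimensions from fact measures is what SSAS/AAS tabular models and LookML views formalize at the semantic layer.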

Posted 4 days ago

Apply

4.0 - 6.0 years

4 - 6 Lacs

Bengaluru, Karnataka, India

On-site

About this role: Wells Fargo is seeking a Senior Data Science Consultant. As Senior Data Science Consultant, you will be responsible for working on projects with opportunities to improve the customer experience using advanced analytics and data solution engineering. The data science team supports automated control and process optimization/streamlining by developing advanced analytical solutions targeted at minimizing compliance and operational risk across multiple lines of business across the bank. More specifically, you will support data exploration, population design, and automated data-driven reviews using advanced automation techniques in SAS/Python/text mining/AI-ML. The selected candidate is expected to design analytical solutions, generate meaningful business insight, and communicate highly complex concepts to business stakeholders in layman's terms.

In this role, you will:
- Work as a technical expert in delivering high-quality analytical solutions and provide effective business insights
- Research, design, and develop end-to-end advanced analytical solutions using data solution engineering, ETL design, and applied text mining and NLP (a hedged text-mining sketch follows this listing)
- Streamline the ETL/data flow structure feeding different analytical solutions through automation
- Clearly understand and articulate business requirements by leveraging domain understanding of the line of business & product/function, and deliver results underlining the business problem and appropriate business decision levers
- Identify & leverage the appropriate analytical approach from a wide toolkit to make data-driven recommendations

Required Qualifications:
- 4+ years of data science experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
- Master's degree or higher in a quantitative discipline such as mathematics, statistics, engineering, physics, economics, or computer science

Desired Qualifications:
- 4+ years of hands-on work experience in advanced analytics/data science, with a minimum of 2 years' mandatory experience in Risk and Control/Compliance in the banking domain
- Engineering graduate/post-graduate in Maths/Stats/Economics/Computer Science
- Strong expertise in Python and SAS/SQL and text mining/NLP
- Must have exposure to unstructured data such as contact center technology data (IVR, telephony, text, chat, etc.) along with transactional data
- Exposure to SAS Viya and Data Lakes/Azure/big data platforms would be a plus
- Sound knowledge of project documentation frameworks
- Must have consultative skills: the ability to rationalize business need and solution design from business requirements
- Strong written and verbal communication, presentation, and interpersonal skills; ability to perform analysis, build hypotheses, draw conclusions, and communicate clear, actionable recommendations to business leaders & partners
- Ability to interact with integrity and a high level of professionalism with all levels of team members and management
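To make the text-mining responsibility concrete, here is a small illustrative sketch that scores hypothetical contact-center transcripts against a compliance watch-list using scikit-learn's TfidfVectorizer. The transcripts, watch terms, and threshold are invented for illustration; a production review pipeline would use far richer NLP and governed data sources.

```python
# Hypothetical sketch: flag contact-center transcripts that mention
# compliance-relevant terms, using TF-IDF keyword scoring. The transcripts,
# term list, and threshold are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer

transcripts = [
    "customer asked to waive the fee again and agent agreed without approval",
    "routine balance inquiry, resolved in two minutes",
    "customer complaint about undisclosed charges escalated to supervisor",
]
watch_terms = ["waive", "complaint", "undisclosed", "escalated"]

# Restrict the vocabulary to the watch-list so each row's TF-IDF mass
# reflects only compliance-relevant language.
vec = TfidfVectorizer(vocabulary=watch_terms)
scores = vec.fit_transform(transcripts).sum(axis=1).A1  # one score per transcript

for text, score in zip(transcripts, scores):
    flag = "REVIEW" if score > 0.5 else "ok"
    print(f"[{flag}] {score:.2f} {text[:55]}")
```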

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Cyber Security Engineer, you will collaborate closely with the Engineering organization, IT, Information Security, Software Engineers, and our DevOps departments. Your team will ensure our back-end and front-end services, cloud infrastructure, DevOps pipelines, data pipelines, and software and embedded platforms are secured in the most efficient manner. You will develop new systems and procedures to counteract threat vectors that arise within our cloud and embedded environments. The ideal candidate will be a meticulous problem solver who can work under pressure when required and will stay current with the latest attack trends and technologies. Other duties include:

- Cloud Security Posture Management: Participate in the planning, development, implementation, and management of security measures across various cloud platforms to ensure robust security.
- Threat Detection and Analysis: Utilize advanced security tools like Wiz, Burp Suite, Sumo Logic, and SonarQube to identify, analyze, validate, and stop vulnerabilities from entering the environment. Perform regular penetration testing and vulnerability assessments.
- Data Analysis and Security Monitoring: Conduct comprehensive analysis of security data from microservice architectures, content distribution networks, data lakes, serverless functions, and databases. Use SIEM tools to correlate security events and identify anomalies.
- Incident Response and Management: Participate in incident response efforts, perform root cause analysis, and implement or suggest corrective actions to mitigate security breaches. Develop and maintain incident response playbooks.
- Supply Chain Security: Assess and mitigate security risks associated with the supply chain, such as open-source libraries, ensuring end-to-end security.
- Software Security Flaws Mitigation: Identify and address software security flaws and misconfigurations to enhance the overall security posture. Perform code reviews and static/dynamic analysis. Languages include, but are not limited to, Python, C++, C#, JS, and HCL.
- Security Solutions Development: Develop and implement custom security solutions, minimizing reliance on paid services. Create security automation scripts and integrate security tools into CI/CD pipelines.
- Automating Security Test Functions: Develop and implement automated security testing functions to ensure continuous security validation (a hedged example of such a self-built check follows this listing).

What we offer:

Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you'll experience an inclusive culture of acceptance and belonging, where you'll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders.

Learning and development. We are committed to your continuous learning and development. You'll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally.

Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you'll have the chance to work on projects that matter.
Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what's possible and bring new solutions to market. In the process, you'll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today.

Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way!

High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you're placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do.
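As an example of the "custom security solutions, minimizing reliance on paid services" duty, here is a minimal sketch using only the Python standard library: a check that warns when a host's TLS certificate is close to expiry, the kind of small automated test that could run in a CI/CD pipeline. The host inventory and 30-day threshold are placeholder assumptions.

```python
# Illustrative sketch of a small self-built security check: warn when a
# host's TLS certificate nears expiry. Hosts and threshold are placeholders.
import socket
import ssl
import time

def days_until_expiry(host: str, port: int = 443) -> int:
    """Connect, validate the chain, and return days until cert expiry."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # ssl.cert_time_to_seconds parses the 'notAfter' timestamp format.
    expiry_ts = ssl.cert_time_to_seconds(cert["notAfter"])
    return int((expiry_ts - time.time()) // 86400)

if __name__ == "__main__":
    for host in ["example.com"]:  # placeholder inventory
        remaining = days_until_expiry(host)
        status = "WARN" if remaining < 30 else "ok"
        print(f"[{status}] {host}: certificate expires in {remaining} days")
```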

Posted 6 days ago

Apply

10.0 - 14.0 years

0 Lacs

karnataka

On-site

Do you want to help solve the world's most pressing challenges, such as feeding the world's growing population and slowing climate change? AGCO is looking for individuals to join them in making a difference. Currently, AGCO is seeking a Senior Manager, AI & Data Systems Architecture to lead the design, creation, and evolution of system architectures for AI, analytics, and data systems within the organization.

As the Senior Manager, AI & Data Systems Architecture, you will collaborate with cross-functional teams, including data engineers, data scientists, and other IT professionals, to create architectures that support cutting-edge AI and data initiatives. Your responsibilities will include leading the end-to-end architecture for AI and data systems, designing and implementing data infrastructure and AI platforms, championing cloud adoption strategies, and driving the continuous improvement and evolution of data and AI architectures to meet emerging business needs and industry trends.

To qualify for this role, you should have a minimum of 10 years of experience in data architecture, AI systems, or cloud infrastructure, with at least 3-5 years in a leadership role. You should also possess deep hands-on experience with cloud platforms like AWS, Google Cloud Platform (GCP), and Databricks, as well as familiarity with CRM systems like Salesforce and AI systems within those solutions. Additionally, you should have expertise in data architecture, including data lakes, data warehouses, real-time data streaming, and batch processing frameworks (a hedged batch-processing sketch follows this listing).

The ideal candidate will have strong leadership and communication skills, with the ability to drive architecture initiatives in a collaborative and fast-paced environment. A Bachelor's degree in Computer Science, Data Science, or a related field is required, while a Master's degree or relevant certifications such as AWS Certified Solutions Architect are preferred. AGCO offers a positive workplace culture that values inclusion and diversity, providing benefits such as health care and wellness plans, flexible work options, and opportunities for personal development and growth. If you are passionate about leveraging innovative technologies to make a positive impact and contribute to the future of agriculture, apply now to join AGCO in their mission.
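For flavor, here is a hedged sketch of the batch side of such a lake architecture: a PySpark job that aggregates raw machine telemetry from object storage into a curated table. The paths, schema, and metrics are entirely hypothetical and stand in for whatever the actual platform would define.

```python
# Hedged sketch: batch-aggregate raw telemetry from a data lake into a
# curated table for analytics. Paths, columns, and metrics are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("telemetry-batch").getOrCreate()

# Placeholder lake path; assumes Parquet files with event_time, machine_id,
# engine_hours, and fuel_rate columns.
telemetry = spark.read.parquet("s3a://example-lake/raw/machine_telemetry/")

daily_usage = (telemetry
    .withColumn("day", F.to_date("event_time"))
    .groupBy("machine_id", "day")
    .agg(F.sum("engine_hours").alias("engine_hours"),
         F.avg("fuel_rate").alias("avg_fuel_rate")))

daily_usage.write.mode("overwrite").parquet(
    "s3a://example-lake/curated/daily_usage/")
```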

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

At PwC, our team focused on data and analytics applies data to drive insights and guide strategic business decisions. Utilizing advanced analytics techniques, we assist clients in optimizing operations and achieving their goals. As a member of our data analysis team, you will specialize in leveraging sophisticated analytical methods to extract valuable insights from extensive datasets, enabling data-driven decision-making. Your role will involve utilizing skills in data manipulation, visualization, and statistical modeling to support clients in resolving intricate business challenges.

We are seeking a visionary Generative AI Architect at the Manager level to join PwC US - Acceleration Center. In this leadership position, you will be responsible for designing and implementing cutting-edge Generative AI solutions using technologies such as Azure OpenAI Service, GPT models, and multi-agent frameworks. Your role will involve driving innovation through scalable cloud architectures, optimizing AI infrastructure, and leading cross-functional teams in deploying transformative AI solutions. The ideal candidate will possess deep expertise in Generative AI technologies, data engineering, Agentic AI, and cloud platforms like Microsoft Azure, with a strong emphasis on operational excellence and ethical AI practices.

Responsibilities:
- **Architecture Design:** Design and implement scalable, secure, and high-performance architectures for Generative AI applications. Integrate Generative AI models into existing platforms and lead the development of AI agents capable of orchestrating multi-step tasks.
- **Model Development and Deployment:** Fine-tune pre-trained generative models, develop data collection and preparation strategies, and deploy appropriate Generative AI frameworks (a hedged deployment-call sketch follows this listing).
- **Innovation and Strategy:** Stay updated on the latest Generative AI advancements, recommend innovative applications, and define and execute AI strategy roadmaps.
- **Collaboration and Leadership:** Collaborate with cross-functional teams, mentor team members, and lead a team of data scientists, GenAI engineers, DevOps engineers, and software developers.
- **Performance Optimization:** Monitor and optimize the performance of AI models, agents, and systems to ensure robustness and accuracy, and optimize computational costs and infrastructure utilization.
- **Ethical and Responsible AI:** Ensure compliance with ethical AI practices, data privacy regulations, and governance frameworks, and implement safeguards against bias and misuse.

Requirements:
- Bachelor's or master's degree in Computer Science, Data Science, or a related field.
- 8+ years of relevant technical/technology experience, with expertise in GenAI projects.
- Advanced programming skills in Python and fluency in data processing frameworks like Apache Spark.
- Experience with GenAI foundational models and open-source models.
- Proficiency in system design for agentic architectures and real-time data processing systems.
- Familiarity with cloud computing platforms and containerization technologies.
- Strong leadership, problem-solving, and analytical abilities.
- Excellent communication and collaboration skills.

Nice-to-Have Skills:
- Experience with technologies like Datadog and Splunk.
- Familiarity with emerging Model Context Protocols and dynamic tool integration.
- Relevant solution architecture certificates and continuous professional development in data engineering and GenAI.
Professional and Educational Background: BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA / Any Degree.
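As one concrete touchpoint for the Azure OpenAI Service work named above, here is a minimal sketch of a chat-completion call with the OpenAI Python SDK (v1+). The endpoint, deployment name, and API version are placeholders for your own resource's values; a real architecture would wrap this in retries, logging, cost controls, and responsible-AI safeguards.

```python
# Hedged sketch of calling an Azure-hosted chat model with the OpenAI
# Python SDK (v1+). Endpoint, deployment name, and API version are
# placeholder assumptions, not PwC's actual configuration.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # assumed API version string
)

response = client.chat.completions.create(
    model="gpt-4o",  # the Azure *deployment* name, assumed here
    messages=[
        {"role": "system", "content": "You summarize analyst notes."},
        {"role": "user", "content": "Summarize: Q3 churn rose 2% in the west region."},
    ],
    temperature=0.2,
)
print(response.choices[0].message.content)
```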

Posted 1 week ago

Apply

12.0 - 22.0 years

25 - 32 Lacs

Chennai, Bengaluru

Work from Office

Technical Manager, Data Engineering. Location: Chennai/Bangalore. Experience: 15+ Years. Employment Type: Full Time.

Role Description: We are looking for a seasoned Technical Manager to lead our Data Engineering function. This role demands a deep understanding of data architecture, pipeline development, and data infrastructure. The ideal candidate will be a thought leader in the data engineering space, capable of guiding and mentoring a team, collaborating effectively with various business units, and driving the adoption of cutting-edge tools and technologies to build robust, scalable, and efficient data solutions.

Responsibilities:
- Define and champion the strategic direction for data engineering, staying abreast of industry trends and emerging technologies.
- Lead, mentor, and develop a high-performing team of data engineers, fostering a culture of technical excellence, innovation, and continuous learning.
- Design, implement, and maintain scalable, reliable, and secure data pipelines and infrastructure; ensure data quality, integrity, and accessibility.
- Oversee the end-to-end delivery of data engineering projects, ensuring timely completion, adherence to best practices, and alignment with business objectives.
- Partner closely with pre-sales, sales, marketing, Business Intelligence, Data Science, and other departments to understand data needs, propose solutions, and support resource deployment for active data projects.
- Evaluate, recommend, and implement new data engineering tools, platforms, and methodologies to enhance capabilities and efficiency.
- Identify and address performance bottlenecks in data systems, ensuring optimal data processing and storage.

Tools & Technologies:
- Cloud Platforms: AWS (S3, Glue, EMR, Redshift, Athena, Lambda, Kinesis), Azure (Data Lake Storage, Data Factory, Databricks, Synapse Analytics), Google Cloud Platform (Cloud Storage, Dataflow, Dataproc, BigQuery)
- Big Data Frameworks: Apache Spark, Apache Flink, Apache Kafka, HDFS
- Data Warehousing/Lakes: Snowflake, Databricks Lakehouse, Google BigQuery, Amazon Redshift, Azure Synapse Analytics
- ETL/ELT Tools: Apache Airflow, Talend, Informatica, dbt, Fivetran, Stitch (a hedged Airflow sketch follows this listing)
- Data Modeling: Star Schema, Snowflake Schema, Data Vault
- Databases: PostgreSQL, MySQL, MongoDB, Cassandra, DynamoDB
- Programming Languages: Python (Pandas, PySpark), Scala, Java
- Containerization/Orchestration: Docker, Kubernetes
- Version Control: Git
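To ground the ETL/ELT tooling list, here is a hedged sketch of a daily pipeline expressed as an Apache Airflow (2.x) DAG. The DAG id and task bodies are placeholders; a real pipeline would replace the print statements with extract/transform/load logic against the platforms listed above.

```python
# Hedged sketch of a daily ETL pipeline as an Apache Airflow 2.x DAG.
# The DAG id and task bodies are placeholder assumptions.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    print("pull raw files from object storage")       # placeholder step

def transform(**_):
    print("clean and conform the records")            # placeholder step

def load(**_):
    print("load curated tables into the warehouse")   # placeholder step

with DAG(
    dag_id="daily_sales_etl",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_transform >> t_load
```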

Posted 1 week ago

Apply