
211 Data Lakes Jobs - Page 5

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

0 Lacs

Punjab

On-site

We are looking for an experienced Data Analytics Engineer specializing in Snowflake, with a minimum of 4 years of experience. As an integral member of our data engineering team, you will be responsible for the design, development, and maintenance of data pipelines, working with large datasets in Snowflake. The ideal candidate will also possess skills in Power BI for data visualization and reporting. You will help build, optimize, and scale data workflows, contributing to the improvement of data architecture across the organization.

Key Responsibilities:
- Design & Develop Data Warehouses: Architect and maintain Snowflake data warehouses, ensuring optimal performance, scalability, and security.
- ETL Development: Build, implement, and manage ETL pipelines to extract, transform, and load data from multiple sources into Snowflake environments (a minimal sketch follows this listing).
- Schema Design: Create and maintain Snowflake schemas, tables, views, and materialized views to support a wide range of business use cases.
- Optimization & Tuning: Perform query optimization, data partitioning, and other techniques to enhance performance and reduce costs.
- Collaboration: Work closely with Data Engineers, Data Scientists, and Business Analysts to understand data needs, share insights, and improve data flows.
- Data Integration & Automation: Automate data integration workflows, repetitive tasks, and report generation, improving operational efficiency.
- Data Governance & Security: Ensure data integrity, security, and compliance with established data governance policies and industry best practices.
- Troubleshooting & Issue Resolution: Identify and resolve performance issues or anomalies within the Snowflake environment, ensuring smooth operations.
- Platform Upgrades & Migration: Assist in Snowflake platform upgrades and data migration efforts with minimal disruption.

Required Skills:
- 4+ years of experience with Snowflake data warehouse development and management.
- Proficiency in SQL and SnowSQL, with a deep understanding of ETL processes and data integration techniques.
- Strong expertise in data modeling, schema design, and performance optimization within the Snowflake platform.
- Experience working with cloud-based platforms such as AWS, Azure, or GCP.
- Familiarity with data pipeline orchestration tools like dbt, Airflow, or similar.
- Strong problem-solving skills and the ability to optimize complex queries and workflows.
- Solid understanding of data governance, security protocols, and best practices in a cloud-based environment.

Preferred Skills:
- Certification in Snowflake or other relevant cloud technologies.
- Familiarity with Power BI for data visualization, reporting, and dashboard creation.
- Knowledge of additional programming languages like Python or Java for enhancing data pipelines.
- Experience with data visualization platforms such as Tableau, Looker, or others.
- Exposure to data lakes and advanced data integration techniques.

If you're a passionate ETL Developer looking for the next step in your career, join us and make an impact on the data landscape of our organization!
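For a flavor of the pipeline work this posting describes, here is a minimal sketch of a Snowflake load-and-merge step using the snowflake-connector-python package; the credentials, stage, and table names are hypothetical placeholders rather than anything from the listing.

```python
# Minimal sketch of a Snowflake load-and-merge step, assuming the
# snowflake-connector-python package; credentials, stage, and table
# names below are hypothetical placeholders.
import snowflake.connector

def load_daily_orders(account: str, user: str, password: str) -> None:
    conn = snowflake.connector.connect(
        account=account,
        user=user,
        password=password,
        warehouse="ETL_WH",    # hypothetical warehouse
        database="ANALYTICS",  # hypothetical database
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        # Stage raw files into a landing table...
        cur.execute(
            "COPY INTO RAW_ORDERS FROM @ORDERS_STAGE "
            "FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)"
        )
        # ...then merge into the curated target, upserting on the business key.
        cur.execute("""
            MERGE INTO ANALYTICS.PUBLIC.ORDERS AS t
            USING RAW_ORDERS AS s ON t.ORDER_ID = s.ORDER_ID
            WHEN MATCHED THEN UPDATE SET t.STATUS = s.STATUS
            WHEN NOT MATCHED THEN
                INSERT (ORDER_ID, STATUS, AMOUNT)
                VALUES (s.ORDER_ID, s.STATUS, s.AMOUNT)
        """)
    finally:
        conn.close()
```

A production version would add error handling, incremental watermarks, and warehouse sizing appropriate to the data volume.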

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As an AWS Consultant specializing in Infrastructure, Data & AI, and Databricks, you will play a crucial role in designing, implementing, and optimizing AWS infrastructure solutions. Your expertise will be used to deliver secure and scalable data solutions using various AWS services and platforms. Your responsibilities will also include architecting and implementing ETL/ELT pipelines, data lakes, and distributed compute frameworks. You will be expected to work on automation and infrastructure as code using tools like CloudFormation or Terraform (a brief sketch follows below), and to manage deployments through AWS CodePipeline, GitHub Actions, or Jenkins.

Collaboration with internal teams and clients to gather requirements, assess current-state environments, and define cloud transformation strategies will be a key aspect of your role. Your support during pre-sales and delivery cycles will involve contributing to RFPs, SOWs, LOEs, solution blueprints, and technical documentation. Ensuring best practices in cloud security, cost governance, and compliance will be a priority.

The ideal candidate will possess 3 to 5 years of hands-on experience with AWS services, a Bachelor's degree or equivalent experience, and a strong understanding of cloud networking, IAM, security best practices, and hybrid connectivity. Proficiency in Databricks on AWS and experience with data modeling, ETL frameworks, and structured/unstructured data are required skills. Additionally, you should have working knowledge of DevOps tools and processes in the AWS ecosystem, strong documentation skills, and excellent communication abilities to translate business needs into technical solutions.

Preferred certifications for this role include AWS Certified Solutions Architect - Associate or Professional, AWS Certified Data Analytics - Specialty (preferred), and Databricks Certified Data Engineer Associate/Professional (a plus).
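As a rough illustration of the infrastructure-as-code automation this role mentions, the sketch below drives a CloudFormation deployment from Python with boto3; the stack name, template path, and usage line are invented for the example.

```python
# Sketch of scripting a CloudFormation deployment with boto3; the stack
# name and template file are invented for the example.
import boto3

def deploy_stack(stack_name: str, template_path: str) -> None:
    cf = boto3.client("cloudformation")
    with open(template_path) as f:
        template_body = f.read()
    # create_stack fails if the stack already exists; a fuller script
    # would catch that and fall back to update_stack.
    cf.create_stack(
        StackName=stack_name,
        TemplateBody=template_body,
        Capabilities=["CAPABILITY_NAMED_IAM"],  # needed if the template creates IAM roles
    )
    # Block until creation completes (or raise if it fails).
    cf.get_waiter("stack_create_complete").wait(StackName=stack_name)

# deploy_stack("data-lake-dev", "templates/data_lake.yaml")  # hypothetical usage
```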

Posted 1 month ago

Apply

10.0 - 12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Education: Bachelor's or Master's degree in Finance, Accounting, Information Systems, or Actuarial Science; a Bachelor's degree in Finance, Economics, or a related discipline is also acceptable.

Experience:
- 10+ years of experience as a BSA or in a similar role on data analytics or technology projects.
- 5+ years of domain experience in asset management, investment management, insurance, or financial services.
- Familiarity with Investment Operations concepts such as Critical Data Elements (CDEs), data traps, and reconciliation workflows.
- Working knowledge of data engineering principles: ETL/ELT, data lakes, and data warehousing.
- Proficiency in BI and analytics tools such as Power BI, Tableau, MicroStrategy, and SQL.
- Excellent communication, analytical thinking, and stakeholder engagement skills.
- Experience working in Agile/Scrum environments with cross-functional delivery teams.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

Embark upon a transformative journey as a Vice President Solutions Architect at Barclays. You will play a crucial role in designing, developing, and implementing solutions to complex business problems. This includes collaborating with stakeholders to understand their needs and requirements, and creating solutions that balance technology risks against business delivery while driving consistency.

To be successful in this role, you should be capable of engaging directly with Director and MD-level stakeholders. The ideal candidate can translate complex technical concepts into clear, concise, and executive-friendly narratives. You should have proven experience in designing and building highly scalable and resilient global-scale financial systems in a regulated environment. Additionally, experience in owning end-to-end technical and application architecture, working with DevOps operating models, and technical expertise in Java or other programming languages is required.

Your responsibilities will include designing and developing solutions as products that align with modern software engineering practices, applying an appropriate workload placement strategy, and incorporating security principles to meet the Bank's resiliency expectations. You will also be expected to assess the risk, capacity, and cost impact of solution design and contribute to governance processes.

As a Vice President, you will advise key stakeholders, manage and mitigate risks, demonstrate leadership in managing risk and controls, and collaborate with various areas to support business strategies. You will need to create solutions based on sophisticated analytical thought and maintain trusting relationships with internal and external stakeholders to achieve key business objectives. All colleagues at Barclays are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as embody the Barclays Mindset of Empower, Challenge, and Drive.

Posted 1 month ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About the company: Netenrich boosts the effectiveness of organizations' security and digital operations so they can avoid disruption and manage risk. Resolution Intelligence Cloud™ is our cloud-native data analytics platform for enterprises and service providers that need highly scalable, multitenant security operations and/or digital operations management. Resolution Intelligence Cloud transforms security and operations data into intelligence that organizations can act on before critical issues occur. More than 3,000 customers and managed service providers rely on Netenrich to deliver secure operations at scale.

Job Title: Implementation Engineer
Years of Experience: Relevant 3+ years
Work Location: Hyderabad

Job Summary: We are seeking a skilled and experienced Cybersecurity Implementation Engineer with expertise in custom parser development, Yara rule creation, playbook implementation, and data ingestion techniques. This role presents an exciting opportunity to contribute to the design and implementation of cutting-edge cybersecurity solutions while collaborating with a talented team of professionals.

Responsibilities:
- Develop custom parsers to extract and normalize data from diverse sources, including logs, network traffic, and endpoint data (a small sketch follows this listing).
- Design, develop, and maintain Yara rules for threat detection and malware analysis, ensuring high accuracy and effectiveness.
- Create and implement playbook automation to streamline incident response processes and improve operational efficiency.
- Design and implement data ingestion pipelines to collect, process, and analyze large volumes of security data from various sources.
- Collaborate with cross-functional teams to understand customer requirements and customize cybersecurity solutions to meet their needs.
- Conduct research and analysis to identify emerging threats and vulnerabilities, and develop proactive detection mechanisms.
- Participate in security incident response activities, providing technical expertise and support as needed.
- Stay abreast of the latest cybersecurity trends, technologies, and best practices, and share knowledge with the team.
- Work closely with customers to understand their security challenges and requirements, and provide expert guidance and support.

Qualifications:
- Bachelor's degree in Computer Science, Information Security, or a related field.
- 3+ years of experience in cybersecurity, with a focus on implementation.
- Strong expertise in developing custom parsers for log and data normalization.
- Proficiency in creating and maintaining Yara rules for threat detection and malware analysis.
- Experience in designing and implementing playbook automation using tools such as Demisto, Phantom, or similar platforms.
- Solid understanding of data ingestion techniques and technologies, including log management systems and data lakes.
- Hands-on experience with SIEM (Security Information and Event Management) solutions such as Splunk, ELK, or QRadar.
- Excellent analytical and problem-solving skills, with the ability to troubleshoot complex technical issues.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with internal teams and customers.
- Relevant cybersecurity certifications (e.g., CISSP, CEH, GIAC) are a plus.

If you are a passionate and driven cybersecurity professional with expertise in custom parser development, Yara rule creation, playbook implementation, and data ingestion techniques, we want to hear from you. Join us in our mission to protect our organization and our customers from cyber threats.
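To make the parser and Yara rule work above concrete, here is a small illustrative sketch using Python's re module and the yara-python bindings; the log format and the rule itself are invented for illustration, not production detections.

```python
# Sketch of two tasks above: normalizing a (hypothetical) log line with a
# custom parser, and matching a Yara rule via the yara-python bindings.
import re
import yara

LOG_PATTERN = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}) (?P<host>\S+) (?P<msg>.*)"
)

def parse_line(line: str) -> dict:
    # Normalize one raw log line into a flat event dict; fall back to raw.
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else {"raw": line}

# An illustrative rule, not a real detection.
RULES = yara.compile(source="""
rule SuspiciousPowershell {
    strings:
        $a = "powershell -enc" nocase
    condition:
        $a
}
""")

def scan(payload: bytes) -> list:
    # Returns yara.Match objects for any rule hits on the payload.
    return RULES.match(data=payload)

print(parse_line("2024-01-01T00:00:00 host1 user login failed"))
print(scan(b"cmd /c powershell -enc SQBFAFgA"))
```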

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

As an experienced IT professional with over 5 years of experience, you should have a good understanding of analytics tools to effectively analyze data. Your previous roles may have involved working in production deployment and production support teams. You must be familiar with Big Data tools such as Hadoop, Spark, Apache Beam, and Kafka. Additionally, your expertise should include object-oriented and scripting languages like Python, Java, C++, and Scala. Experience with data warehousing tools like BigQuery, Redshift, Synapse, or Snowflake is essential. You should also be well versed in ETL processes and have a strong understanding of relational and non-relational databases, including MySQL, MS SQL Server, Postgres, MongoDB, and Cassandra. Familiarity with cloud platforms like AWS, GCP, and Azure is also required, along with experience in workflow management using tools like Apache Airflow (a minimal DAG sketch follows below).

In your role, you will be expected to develop high-performance and scalable solutions using GCP for extracting, transforming, and loading big data. You will design and build production-grade data solutions from ingestion to consumption using Java or Python, and optimize data models on GCP with data stores such as BigQuery. Furthermore, you should be capable of handling the deployment process, optimizing data pipelines for performance and cost in large-scale data lakes, and writing complex queries across large data sets. Collaboration with Data Engineers to identify the right tools for delivering product features is essential, as is researching new use cases for existing data.

Preferred qualifications include awareness of design best practices for OLTP and OLAP systems, participation in designing the database and pipeline, exposure to load-testing methodologies, debugging pipelines, and handling delta loads in heterogeneous migration projects. Overall, you should be a collaborative team player who interacts effectively with business stakeholders, BAs, and other Data/ML engineers to drive innovation and deliver impactful solutions.
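Since the posting calls out Airflow for workflow management, here is a minimal Airflow DAG sketch of the extract-transform-load flow described; the DAG id, task bodies, and schedule are placeholders, and a real load step would use a BigQuery client or operator rather than print().

```python
# Minimal Airflow DAG sketch (uses the "schedule" argument from Airflow 2.4+);
# task bodies and names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from source systems")

def transform():
    print("apply cleansing and business rules")

def load():
    print("write results to BigQuery")

with DAG(
    dag_id="daily_big_data_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_transform >> t_load
```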

Posted 1 month ago

Apply

5.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Firebolt: Firebolt is the Cloud Data Warehouse designed to handle the speed, scale, and flexibility of AI applications. By delivering ultra-low latency, high concurrency, multi-dimensional elasticity, and flexibility, Firebolt empowers organizations to build data-intensive AI applications that perform at scale. With over $270m in funding to date, a strong engineering team, and highly experienced leadership, Firebolt is well positioned to revolutionize the AI data infrastructure space and help businesses unlock the full potential of their data.

About the role: We're looking for a Product Engineer who thrives at the intersection of engineering and storytelling. In this role, you'll be responsible for helping data engineers and developers deeply understand what makes Firebolt unique and how to use it to build sub-second analytics experiences at scale. You'll bring a strong technical foundation and real-world experience building analytics or data-intensive applications. You'll use that expertise to craft high-quality content and experiences that resonate with a deeply technical audience, across formats like blog posts, demos, videos, documentation, and conference talks. This role is ideal for someone who wants to stay close to the product and technology while shaping how it is experienced and understood by the outside world. You'll work cross-functionally with product, engineering, marketing, and customer-facing teams to translate technical capabilities into clear, compelling narratives.

What you'll do:
- Build demos, apps, and technical content that help customers see what's possible with Firebolt and how to use it to deliver ultra-fast, scalable analytics.
- Help shape how we talk about Firebolt to developers and data engineers, translating technical features into real-world use cases that resonate.
- Partner with Product and Engineering to stay ahead of upcoming releases and ensure our documentation, demos, and messaging are ready to support them.
- Collaborate with Marketing and Sales to create assets that help users understand Firebolt's value and get started quickly, whether they're evaluating the product, using it in production, or building apps on top.
- Contribute to our presence in the data and developer ecosystem by speaking at events, publishing technical content, or engaging with users in forums and Slack.
- Act as a voice for the developer, bringing feedback from the field into product discussions and helping Firebolt stay grounded in real-world needs.

Requirements:
- 5+ years in engineering, solutions engineering, or solution architect roles.
- Proven experience building production-grade analytics systems, data pipelines, or data applications.
- Strong understanding of modern data infrastructure, with hands-on experience using cloud data warehouses and/or data lakes.
- Fluent in SQL, and comfortable with performance optimization and data modeling.
- Excellent written and verbal communication skills, with the ability to translate complex technical topics into engaging content.
- Experience creating developer-facing content such as technical blogs, demo apps, product tutorials, or internal enablement.
- Self-starter with strong project management skills and the ability to lead initiatives from concept to execution.
- Collaborative team player who enjoys working across disciplines and contributing to shared goals.
- Curious and connected: you know what's happening in the industry, what users are building, and what tools they love (or hate).

Bonus if you have:
- Prior experience working in startups or fast-paced product organizations.
- Background in AI, machine learning, or dev tools.
- Experience speaking at industry conferences, running webinars, or building video tutorials.
- Contributions to open-source projects or active participation in developer/data communities.

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

Pune, Maharashtra

On-site

Embark upon a transformative journey as a Solutions Architect. At Barclays, you don't just embrace change, you drive it. As a Solutions Architect, you will design, develop, and implement solutions to complex business problems. You will collaborate with stakeholders to understand their needs and requirements, and design and implement solutions that meet those needs while balancing technology risks against business delivery and driving consistency.

To be a successful Solutions Architect, you should have experience in designing and building highly scalable and highly resilient global-scale financial systems in a highly regulated environment. You should have a proven track record of delivering solutions and roadmaps for small, medium, and large complex business and technical projects of strategic significance. Experience in owning end-to-end technical and application architecture, current and target states, as well as working with relevant business and technical component teams, is essential. Additionally, experience with the DevOps operating model and tools, technical expertise in Java or other programming languages, data platforms, BI visualization, modern architecture patterns, and cloud capabilities, and hands-on experience architecting cloud solutions are required. Exposure to Service-Oriented Architecture design principles and integration and implementation issues, plus knowledge of technologies used by financial service providers and banking organizations, are important. You should have the ability to multi-task, handle solutions related to multiple projects and stakeholders simultaneously, and manage competing priorities against demanding timelines. Experience working with senior stakeholders and relevant certifications such as TOGAF or BCS accreditation are desired. Additional skills in banking applications and infrastructure, and an understanding of project lifecycles, major phases, and different project methodologies, are highly valued. The role is based in Pune.

**Purpose of the role:**

To design, develop, and implement solutions to complex business problems, collaborating with stakeholders to understand their needs and requirements, and driving consistency in technology risks against business delivery.

**Accountabilities:**

- Design and develop solutions as products that can evolve to meet business requirements aligned with modern software engineering practices.
- Implement technologies and platforms for targeted design activities that maximize the benefit of cloud capabilities.
- Incorporate security principles in best-practice designs to meet the Bank's resiliency expectations.
- Balance risks and controls to deliver agreed business and technology value.
- Adopt standardized solutions or contribute to their evolution where appropriate.
- Provide support for fault finding and performance issues to operational support teams.
- Assess solution design impact in terms of risk, capacity, and cost.

**Vice President Expectations:**

- Advise key stakeholders and demonstrate leadership in managing risk and strengthening controls.
- Collaborate with other areas of work and contribute to achieving the goals of the business.
- Create solutions based on sophisticated analytical thought and innovative problem-solving.
- Build and maintain trusting relationships with internal and external stakeholders to achieve key business objectives.

All colleagues are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset of Empower, Challenge, and Drive.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Quality Engineer, your primary responsibility will be to analyze business and technical requirements in order to design, develop, and execute comprehensive test plans for ETL pipelines and data transformations. You will perform data validation, reconciliation, and integrity checks across various data sources and target systems. Additionally, you will build and automate data quality checks using SQL and/or Python scripting (a small sketch follows below). It will be your duty to identify, document, and track data quality issues, anomalies, and defects.

Collaboration is key in this role, as you will work closely with data engineers, developers, QA, and business stakeholders to understand data requirements and ensure that data quality standards are met. You will define data quality KPIs and implement continuous monitoring frameworks. Participation in data model reviews and providing input on data quality considerations will also be part of your responsibilities. In the case of data discrepancies, you will be expected to perform root cause analysis and work with teams to drive resolution. Ensuring alignment with data governance policies, standards, and best practices will also fall under your purview.

To qualify for this position, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field, and have 4 to 7 years of experience as a Data Quality Engineer, ETL Tester, or in a similar role. A strong understanding of ETL concepts, data warehousing principles, and relational database design is essential, as is proficiency in SQL for complex querying, data profiling, and validation tasks. Familiarity with data quality tools, testing methodologies, and modern cloud data ecosystems (AWS, Snowflake, Apache Spark, Redshift) will be advantageous.

Moreover, advanced knowledge of SQL, experience with data pipeline tools like Airflow, DBT, or Informatica, and experience integrating data validation processes into CI/CD pipelines using tools like GitHub Actions, Jenkins, or similar are desired qualifications. An understanding of big data platforms, data lakes, non-relational databases, data lineage, and master data management (MDM) concepts, along with experience with Agile/Scrum development methodologies, will be beneficial for excelling in this role. Your excellent analytical and problem-solving skills and strong attention to detail will be valuable assets in fulfilling the responsibilities of a Data Quality Engineer.
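As a hedged sketch of the SQL-driven quality checks described here, the example below runs reconciliation, null, and duplicate checks against an in-memory SQLite database; the table and column names are invented, and real pipelines would point the same queries at Snowflake or Redshift through their Python connectors.

```python
# Illustrative SQL-driven data-quality checks, demonstrated against an
# in-memory SQLite database; table and column names are invented.
import sqlite3

def run_checks(conn: sqlite3.Connection) -> list:
    failures = []
    # Reconciliation: target row count should match the source extract.
    src = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    if src != tgt:
        failures.append(f"row count mismatch: source={src} target={tgt}")
    # Integrity: the business key must be non-null and unique.
    nulls = conn.execute(
        "SELECT COUNT(*) FROM tgt_orders WHERE order_id IS NULL"
    ).fetchone()[0]
    if nulls:
        failures.append(f"{nulls} null order_id values")
    dupes = conn.execute(
        "SELECT COUNT(*) FROM (SELECT order_id FROM tgt_orders "
        "GROUP BY order_id HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    if dupes:
        failures.append(f"{dupes} duplicated order_id values")
    return failures

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER);
    CREATE TABLE tgt_orders (order_id INTEGER);
    INSERT INTO src_orders VALUES (1), (2);
    INSERT INTO tgt_orders VALUES (1), (1);
""")
print(run_checks(conn))  # -> ['1 duplicated order_id values']
```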

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Data Engineer at GlobalLogic, you will be responsible for architecting, building, and maintaining complex ETL/ELT pipelines for batch and real-time data processing using various tools and programming languages. Your key duties will include optimizing existing data pipelines for performance, cost-effectiveness, and reliability, as well as implementing data quality checks, monitoring, and alerting mechanisms to ensure data integrity (one common pattern is sketched below). Additionally, you will play a crucial role in ensuring data security, privacy, and compliance with relevant regulations such as GDPR and local data laws.

To excel in this role, you should possess a Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field. Excellent analytical, problem-solving, and critical thinking skills with meticulous attention to detail are essential. Strong communication (written and verbal) and interpersonal skills are also required, along with the ability to collaborate effectively with cross-functional teams. Experience with Agile/Scrum development methodologies is considered a plus.

Your responsibilities will involve providing technical leadership and architecture by designing and implementing robust, scalable, and efficient data architectures that align with organizational strategy and future growth. You will define and enforce data engineering best practices, evaluate and recommend new technologies, and oversee the end-to-end data development lifecycle. As a leader, you will mentor and guide a team of data engineers, conduct code reviews, provide feedback, and promote a culture of engineering excellence. You will collaborate closely with data scientists, data analysts, software engineers, and business stakeholders to understand data requirements and translate them into technical solutions. Your role will also involve communicating complex technical concepts and data strategies effectively to both technical and non-technical audiences.

At GlobalLogic, we offer a culture of caring, continuous learning and development opportunities, interesting and meaningful work, balance and flexibility, and a high-trust environment. By joining our team, you will have the chance to work on impactful projects, engage your curiosity and problem-solving skills, and contribute to shaping cutting-edge solutions that redefine industries. With a commitment to integrity and trust, GlobalLogic provides a safe, reliable, and ethical global environment where you can thrive both personally and professionally.
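One common shape for the monitoring and alerting mentioned above is a wrapper that logs each pipeline stage and raises an alert on failure. The sketch below is a minimal illustration; send_alert() is a stand-in for a real channel such as Slack or PagerDuty.

```python
# Sketch of a monitoring-and-alerting wrapper for pipeline stages: log
# start/success, and surface failures to an (assumed) alert channel.
import logging
from functools import wraps

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def send_alert(message: str) -> None:
    # Placeholder: a real implementation might post to Slack or PagerDuty.
    log.error("ALERT: %s", message)

def monitored(stage_name: str):
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            log.info("starting stage %s", stage_name)
            try:
                result = fn(*args, **kwargs)
                log.info("stage %s succeeded", stage_name)
                return result
            except Exception as exc:
                send_alert(f"stage {stage_name} failed: {exc}")
                raise
        return wrapper
    return decorator

@monitored("ingest")
def ingest():
    return "ok"

ingest()
```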

Posted 1 month ago

Apply

5.0 - 10.0 years

0 Lacs

Karnataka

On-site

As an experienced information technology professional with 5 to 10 years of experience, you will be responsible for creating data models for corporate analytics in compliance with standards, ensuring usability and conformance across the enterprise. Your role will involve developing data strategies, ensuring vocabulary consistency, and managing data transformations through intricate analytical relationships and access paths, including data mappings at the data-field level.

Collaborating with Product Management and Business stakeholders, you will identify and evaluate the data sources necessary to achieve project and business objectives. Working closely with Tech Leads and Product Architects, you will gain insights into end-to-end data implications, data integration, and the functioning of business systems. Additionally, you will collaborate with DQ Leads to address data integrity improvements and quality resolutions at the source. This role requires domain knowledge in supply chain, retail, or inventory management.

The critical skills needed for this position include a strong understanding of various software platforms and development technologies; proficiency in SQL, RDBMS, data lakes, and warehouses; and knowledge of the Hadoop ecosystem, Azure, ADLS, Kafka, Apache Delta, and Databricks/Spark. Experience with data modeling tools like ERStudio or Erwin would be advantageous. Effective collaboration with Product Managers, Technology teams, and Business Partners, along with familiarity with Agile and DevOps techniques, is essential, as are excellent written and verbal communication skills.

Preferred qualifications include a bachelor's degree in business information technology, computer science, or a related discipline. This is a full-time position located in Bengaluru, Delhi, Kolkata, or Navi Mumbai. If you meet these requirements and are interested in this opportunity, please apply online. The digitalxnode evaluation team will review your resume, and if your profile is selected, they will reach out to you for further steps. We will retain your information in our database for future job openings.

Posted 1 month ago

Apply

15.0 - 19.0 years

0 Lacs

Hyderabad, Telangana

On-site

We are looking for a highly skilled and experienced Data Architect to join our team. With at least 15 years of experience in data engineering and analytics, the ideal candidate will have a proven track record of designing and implementing complex data solutions. As a Senior Principal Data Architect, you will play a key role in designing, creating, deploying, and managing Blackbaud's data architecture. This position holds significant technical influence within the Data Platform, Data Engineering teams, and the Data Intelligence Center of Excellence at Blackbaud. You will act as an evangelist for proper data strategy across various teams within Blackbaud, and provide technical guidance, particularly in the area of data, for other projects.

Responsibilities:
- Develop and direct the strategy for all aspects of Blackbaud's Data and Analytics platforms, products, and services.
- Set, communicate, and facilitate technical direction for the AI Center of Excellence and beyond, collaboratively.
- Design and develop innovative products, services, or technological advancements in the Data Intelligence space to drive business expansion.
- Collaborate with product management to create technical solutions that address customer business challenges.
- Take ownership of technical data governance practices to ensure data sovereignty, privacy, security, and regulatory compliance.
- Challenge existing practices and drive innovation in the data space.
- Create a data access strategy to securely democratize data and support research, modeling, machine learning, and artificial intelligence initiatives.
- Contribute to defining tools and pipeline patterns used by engineers and data engineers for data transformation and analytics support.
- Work within a cross-functional team to translate business requirements into data architecture solutions.
- Ensure that data solutions prioritize performance, scalability, and reliability.
- Mentor junior data architects and team members.
- Stay updated on technology trends such as distributed computing, big data concepts, and architecture.
- Advocate internally for the transformative power of data at Blackbaud.

Required Qualifications:
- 15+ years of experience in data and advanced analytics.
- Minimum of 8 years of experience with data technologies in Azure/AWS.
- Proficiency in SQL and Python.
- Expertise in SQL Server, Azure Data Services, and other Microsoft data technologies.
- Familiarity with Databricks and Microsoft Fabric.
- Strong grasp of data modeling, data warehousing, data lakes, data mesh, and data products.
- Experience with machine learning.
- Excellent communication and leadership abilities.

Preferred Qualifications:
- Experience with .NET/Java and microservice architecture.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

The ideal candidate for this position will have 8-12 years of experience and a strong understanding of, and hands-on experience with, Microsoft Fabric. You will be responsible for designing and implementing end-to-end data solutions on Microsoft Azure, including data lakes, data warehouses, and ETL/ELT processes. Your role will involve developing scalable and efficient data architectures that support large-scale data processing and analytics workloads. Ensuring high performance, security, and compliance within Azure data solutions will be a key aspect of this role. You should understand architectures such as the lakehouse and the warehouse, and have experience implementing them. Additionally, you will be required to evaluate and select appropriate Azure services, such as Azure SQL Database, Azure Synapse Analytics, Azure Data Lake Storage, Azure Databricks, Unity Catalog, and Azure Data Factory; deep knowledge of and hands-on experience with these Azure data services is essential.

Collaborating closely with business and technical teams to understand and translate data needs into robust and scalable data architecture solutions will be part of your responsibilities. You should also have experience with data governance, data privacy, and compliance requirements. Excellent communication and interpersonal skills are necessary for effective collaboration with cross-functional teams.

In this role, you will provide expertise and leadership to the development team implementing data engineering solutions. You will work with Data Scientists, Analysts, and other stakeholders to ensure data architectures align with business goals and data analysis requirements, and you will optimize cloud-based data infrastructure for performance, cost-effectiveness, and scalability.

Experience with programming languages such as SQL, Python, and Scala is required. Hands-on experience with MS SQL Server, Oracle, or similar RDBMS platforms is preferred, and familiarity with Azure DevOps and CI/CD pipeline development is beneficial. An in-depth understanding of database structure principles and distributed processing of big data batch or streaming pipelines is essential, as is knowledge of data visualization tools such as Power BI and Tableau, data modeling, and strong analytics skills. The candidate should be able to convert OLTP data structures into a star schema (a small sketch follows this listing); DBT experience and data modeling experience are ideal. A problem-solving attitude, self-motivation, attention to detail, and effective task prioritization are essential qualities for this role.

At Hitachi, attitude and aptitude are highly valued, as collaboration is key. While not all skills are required, experience with Azure SQL Data Warehouse, Azure Data Factory, Azure Data Lake, Azure Analysis Services, Databricks/Spark, Python or Scala, data modeling, Power BI, and database migration is desirable. Designing conceptual, logical, and physical data models using tools like ER Studio and Erwin is a plus.
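To illustrate the OLTP-to-star-schema conversion the posting asks for, here is a small hedged sketch using portable SQL run through sqlite3; the normalized source tables and the dimension/fact names are invented for the example.

```python
# Hedged sketch of denormalizing an OLTP order model into a star schema,
# expressed as portable SQL run through sqlite3; all names are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Normalized OLTP source
    CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT);
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER,
                         order_date TEXT, amount REAL);
    INSERT INTO customers VALUES (1, 'Asha', 'Pune');
    INSERT INTO orders VALUES (100, 1, '2024-01-15', 250.0);

    -- Star-schema target: a dimension plus a fact table keyed on it
    CREATE TABLE dim_customer AS
        SELECT customer_id, name, city FROM customers;
    CREATE TABLE fact_sales AS
        SELECT order_id, customer_id, order_date, amount FROM orders;
""")
# Analytics queries then join the fact to its dimensions.
print(conn.execute("""
    SELECT d.city, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d ON d.customer_id = f.customer_id
    GROUP BY d.city
""").fetchall())  # -> [('Pune', 250.0)]
```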

Posted 1 month ago

Apply

5.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job description:
- 5+ years of proven work experience in data modelling-related projects as a Data Modeler.
- Understand and translate business needs into data models supporting long-term solutions.
- Ability to understand data relationships and design data models that reflect those relationships and facilitate efficient ingestion, processing, and consumption of data.
- Responsible for the development of conceptual, logical, and physical data models, and the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL).
- Experience working with databases, including OLAP/OLTP-based data modeling.
- Perform reverse engineering of physical data models from databases and SQL scripts.
- Analyze data-related system integration challenges and propose appropriate solutions.
- Experience with market-leading cloud platforms such as Google Cloud Platform (GCP) and Amazon Web Services (AWS).
- Experience working with third normal form (3NF).
- Excellent problem-solving and communication skills; experience interacting with technical and non-technical stakeholders at all levels.
- Bachelor's degree in computer science, information technology, or equivalent work experience.

Posted 1 month ago

Apply

10.0 - 12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At CoffeeBeans Consulting, we are a dynamic software consulting firm helping clients solve complex problems through innovative technology solutions. We're expanding our Data & AI practice and looking for a Lead Consultant, Data & AI, to help us build and deliver high-impact data and AI projects for our clients.

Role Overview: We're looking for a hands-on technical leader who combines deep expertise in AI/ML (traditional ML and GenAI) with strong skills in modern data engineering and architecture. This role is:
- Hands-on engineering and architecture, building and implementing solutions.
- Client-facing consulting, understanding business problems and translating them into technical solutions.
- Mentoring and guiding junior team members, fostering a high-performance team culture.

You will work closely with clients to design, build, and deliver impactful Data & AI solutions while establishing best practices and maintaining a high bar for technical quality.

Key Responsibilities:

Solution Design & Delivery
- Architect and implement end-to-end AI/ML solutions, including: data pipelines and scalable data architectures; traditional ML models and workflows; and GenAI and agentic AI systems, including retrieval-augmented generation (RAG) and LLM-based applications (a toy RAG sketch follows this listing).
- Ensure delivery of high-quality, scalable, and maintainable solutions aligned with client needs.
- Establish and advocate best practices for MLOps and data engineering workflows.

Consulting & Client Engagement
- Act as a trusted technical advisor for clients, shaping their Data & AI strategy and roadmaps.
- Translate business problems into technical solutions with clear articulation of value.
- Facilitate technical discussions and workshops with stakeholders to gather requirements and guide solutions.

Technical Leadership
- Lead by example, contributing hands-on to critical parts of projects.
- Set code and architectural standards for AI and data projects.
- Stay current with industry trends and advancements in AI/ML, GenAI, and data engineering.

Team Development
- Mentor and upskill junior engineers and data scientists within the team.
- Foster a collaborative, inclusive, and supportive team environment.
- Support the hiring and onboarding of new team members as we grow our Data & AI capability.

Key Requirements:

Experience
- 10-12 years in the software/technology industry.
- Minimum 5+ years of experience designing and building AI/ML solutions in production environments.
- Strong experience with data engineering and architecture in production systems.
- Experience working in consulting or client-facing delivery environments.

Technical Skills
- Deep knowledge of traditional ML (classification, regression, NLP, computer vision) and GenAI (LLMs, embeddings, RAG, agentic AI workflows).
- Hands-on experience with AI/ML frameworks (TensorFlow, PyTorch, Scikit-learn, LangChain/LlamaIndex).
- Proficiency in Python (and/or R, Scala) for data and AI workloads.
- Experience with data engineering tools and orchestration frameworks (Spark, Databricks, Kafka, Airflow).
- Strong familiarity with cloud platforms (AWS, GCP, Azure) for deploying AI and data solutions.
- Understanding of MLOps practices (CI/CD for ML, monitoring, model retraining pipelines).
- Experience with data modeling, data lakes, and data pipeline architecture.

Leadership & Mindset
- Ability to lead and mentor technical teams in delivery environments.
- A consulting mindset with the ability to communicate effectively with technical and non-technical stakeholders.
- Empathetic leadership style, fostering trust and team growth.
- Comfortable in fast-paced, dynamic, client-facing environments.

Nice to Have:
- Experience with LLM fine-tuning and optimization.
- Strong hands-on experience with Databricks for scalable data and AI workloads.
- Exposure to agentic AI frameworks and advanced orchestration for LLM-powered workflows.
- Certifications in cloud or AI/ML specializations.
- Experience growing Data & AI teams within a consulting environment.

Why Join Us:
- Opportunity to shape and expand our Data & AI practice from the ground up.
- Work with diverse clients to solve meaningful and challenging problems.
- Be part of a collaborative, people-first culture with a focus on growth and learning.
- Competitive compensation and career advancement opportunities.
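Since the role centers on retrieval-augmented generation (RAG), here is a toy sketch of the core loop: embed documents, retrieve the closest match for a query, and assemble a grounded prompt. The embed() function is a deliberate bag-of-words stand-in; a real system would call an embedding model and send the final prompt to an LLM.

```python
# Toy RAG loop: embed documents, retrieve the closest one to a query, and
# assemble a grounded prompt. embed() is a bag-of-words stand-in for a
# real embedding model; the docs and query are invented.
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Stand-in "embedding": token counts. Real pipelines call a model API.
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

DOCS = [
    "Databricks runs Spark workloads for large-scale data engineering.",
    "Airflow schedules and orchestrates data pipelines.",
]
INDEX = [(doc, embed(doc)) for doc in DOCS]

def retrieve(query: str) -> str:
    # Return the document most similar to the query.
    q = embed(query)
    return max(INDEX, key=lambda pair: cosine(q, pair[1]))[0]

query = "How do I orchestrate pipelines?"
context = retrieve(query)
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)  # this prompt would then go to the LLM of choice
```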

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

You should have at least 2 years of professional work experience implementing data pipelines using Databricks and data lakes (a brief PySpark sketch follows below). A minimum of 3 years of hands-on programming experience in Python within a cloud environment (preferably AWS) is necessary for this role. Two years of professional work experience with real-time streaming systems such as Event Grid and Event topics would be highly advantageous. You must possess expert-level knowledge of SQL to write complex, highly optimized queries for processing large volumes of data effectively. Experience in developing conceptual, logical, and/or physical database designs using tools like ErWin, Visio, or Enterprise Architect is expected. A minimum of 2 years of hands-on experience working with databases like Snowflake, Redshift, Synapse, Oracle, SQL Server, Teradata, Netezza, Hadoop, MongoDB, or Cassandra is required. Knowledge of or experience with architectural best practices for building data lakes is a must. Strong problem-solving and troubleshooting skills are necessary, along with the ability to make sound judgments independently. You should be capable of working independently and providing guidance to junior data engineers.

If you meet the above requirements and are ready to take on this challenging role, we look forward to your application.

Warm Regards,
Rinka Bose
Talent Acquisition Executive
Nivasoft India Pvt. Ltd.
Mobile: +91-9632249758 (INDIA) | 732-334-3491 (U.S.A)
Email: rinka.bose@nivasoft.com | Web: https://nivasoft.com/
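As a minimal sketch of the Databricks-and-data-lake pipeline work this posting describes, the PySpark example below ingests CSV files from a landing zone, applies light cleansing, and appends to a Delta table; the paths and columns are hypothetical, and the Delta write assumes a Databricks or otherwise delta-enabled runtime.

```python
# Minimal PySpark sketch of a Databricks-style ingest: read CSVs from a
# landing zone, cleanse lightly, and append to a Delta table. Paths and
# columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

raw = spark.read.option("header", "true").csv("/mnt/landing/orders/*.csv")

clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_id").isNotNull())
       .dropDuplicates(["order_id"])
)

(
    clean.write.format("delta")
         .mode("append")
         .partitionBy("order_date")  # assumes an order_date column exists
         .save("/mnt/lake/silver/orders")
)
```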

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an AWS Data Engineer, you should have at least 3 years of experience in AWS data engineering. Your main responsibilities will include designing and building ETL pipelines and data lakes to automate the ingestion of both structured and unstructured data. You will need to be proficient with AWS big data technologies such as Redshift, S3, AWS Glue, Kinesis, Athena, DMS, EMR, and Lambda for serverless ETL processes (a sketch of this pattern follows below). Knowledge of SQL and NoSQL databases is essential, along with experience in batch and real-time pipelines. Your role will require excellent programming and debugging skills in either Scala or Python, as well as expertise in Spark. You should have a good understanding of data lake formation, Apache Spark, and Python, and hands-on experience deploying models. Experience with production migration processes is a must, and familiarity with Power BI visualization tools and connectivity would be advantageous.

In this position, you will be tasked with designing, building, and operationalizing large-scale enterprise data solutions and applications. You will also analyze, re-architect, and re-platform on-premise data warehouses to data platforms within the AWS cloud environment. Creating production data pipelines from ingestion to consumption using Python or Scala within the AWS big data architecture will be part of your routine. Additionally, you will be responsible for conducting detailed assessments of current-state data platforms and developing suitable transition paths to the AWS cloud.

If you possess strong data engineering skills and are looking for a challenging role in AWS data engineering, this opportunity may be the right fit for you.
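To ground the "Lambda for serverless ETL" pattern mentioned above, here is a hedged sketch of an AWS Lambda handler that reacts to S3 object-created events, filters JSON-lines records, and writes the cleaned output onward; the target bucket and record layout are invented.

```python
# Hedged sketch of serverless ETL: a Lambda handler fired by S3
# object-created events that filters JSON-lines records and writes the
# cleaned file onward. Target bucket and record layout are invented.
import json

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        rows = [json.loads(line) for line in body.splitlines() if line.strip()]
        # Minimal "transform": drop rows missing the business key.
        clean = [r for r in rows if r.get("id") is not None]
        s3.put_object(
            Bucket="curated-zone-bucket",  # hypothetical target bucket
            Key=f"clean/{key}",
            Body="\n".join(json.dumps(r) for r in clean).encode(),
        )
    return {"status": "ok"}
```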

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

As a Snowflake Data Engineer at our organization, you will play a vital role in designing, developing, and maintaining our data infrastructure. Your responsibilities will include ingesting, transforming, and distributing data using Snowflake and AWS technologies, and you will collaborate with various stakeholders to ensure efficient data pipelines and secure data operations.

Your key responsibilities will involve designing and implementing data pipelines using Snowflake and AWS technologies. You will leverage tools like SnowSQL, Snowpipe, NiFi, Matillion, and DBT to ingest, transform, and automate data integration processes. Implementing role-based access controls and managing AWS resources will be crucial for ensuring data security and supporting Snowflake operations. Additionally, you will be responsible for optimizing Snowflake queries and data models for performance and scalability.

To excel in this role, you should have strong proficiency in SQL and Python, along with hands-on experience with Snowflake and AWS services. An understanding of ETL/ELT tools, data warehousing concepts, and data quality techniques is essential. Your analytical skills, problem-solving abilities, and excellent communication skills will enable you to collaborate effectively with data analysts, data scientists, and other team members.

Preferred skills include experience with data virtualization, machine learning and AI concepts, data governance, and data security best practices. Staying updated with the latest advancements in Snowflake and AWS technologies will be essential for this role.

If you are a passionate and experienced Snowflake Data Engineer with 5 to 7 years of experience, we invite you to apply and be a part of our team. This is a full-time position based in Gurgaon, with a hybrid work mode accommodating India, UK, and US work shifts.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

You will collaborate with stakeholders, including Domain Leads in Operations, IT, and Data, to understand business needs and shape the vision and roadmap for data-driven initiatives aligned with strategic priorities. You will contribute to the development of the program vision and communicate the product and portfolio vision to your team. Working closely with data scientists, engineers, and designers, you will ensure products are built efficiently, meet user needs, and provide actionable insights.

As a Data Product Owner, you will analyze data sources, data technologies, and vendors providing data services to leverage in developing the data product roadmap. You will create the necessary ER diagrams, data models, and PRD/BRD documents to convey requirements, and be accountable for developing and achieving product-level KPIs. Managing data products with a moderate degree of strategy, scope, and complexity, you will ensure data accuracy, consistency, and security by establishing data governance frameworks and implementing data management best practices.

In this role, you will collaborate with technology and business leadership to align system/application integrations with business goals and priorities. You will own and maintain the product backlog, prioritize its contents, and ensure clear, actionable user stories. Additionally, you will set priorities, actively participate in squad/team quarterly planning, and work closely with the agile working group to clarify business requirements, remove roadblocks, and support alignment around product strategy.

Monitoring and maintaining product health and supporting long-term product viability and efficiency, you will balance long- and short-term costs with desired outcomes. You will analyze and report on feasibility, cost-of-delay ramifications, economics, and other aspects of planned or potential changes to the product. Understanding regulatory, compliance, and industry constraints on the product, you will negotiate with internal and external teams to ensure priorities are aligned across squads/teams both within and outside the portfolio.

To qualify for this position, you should hold a Bachelor's degree in Computer Science, Business Administration, or a related field; a Master's degree is preferred. You must have a good understanding of data technologies such as databases, data warehouses, and data lakes, along with 5+ years of proven experience as a Data Product Owner, Data Product Manager, or in a similar role in data or software development. A strong understanding of Agile methodologies, including Scrum and Kanban, and proficiency in programming languages such as Python, R, SQL, or SAS, and cloud technologies like AWS and Azure, are essential. Excellent analytical, problem-solving, decision-making, communication, negotiation, and interpersonal skills are required, along with proficiency in product management tools such as JIRA, Trello, or Asana, and the Microsoft Office Suite. Familiarity with UX/UI design principles, the software development lifecycle (SDLC), and software engineering concepts is a plus, as is experience in insurance, particularly Commercial & Specialty Insurance products. Agile practitioner capabilities and experience working with or in Agile teams are highly valued.

Strong teamwork, coordination, organization, and planning skills are necessary, along with the ability to capture complex requirements in a prioritized backlog and manage stakeholders' requirements.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

We empower our people to stay resilient and relevant in a constantly changing world. We are looking for individuals who are always seeking creative ways to grow and learn, individuals who aspire to make a real impact, both now and in the future. If this resonates with you, then you would be a valuable addition to our dynamic international team.

We are currently seeking a Senior Software Engineer - Data Engineer (AI Solutions). In this role, you will have the opportunity to:
- Design, build, and maintain data pipelines to cater to the requirements of various stakeholders, including software developers, data scientists, analysts, and business teams.
- Ensure that the data pipelines are modular, resilient, and optimized for performance and low maintenance.
- Collaborate with AI/ML teams to support training, inference, and monitoring needs through structured data delivery.
- Implement ETL/ELT workflows for structured, semi-structured, and unstructured data using cloud-native tools.
- Work with large-scale data lakes, streaming platforms, and batch processing systems to ingest and transform data.
- Establish robust data validation, logging, and monitoring strategies to uphold data quality and lineage.
- Optimize data infrastructure for scalability, cost-efficiency, and observability in cloud-based environments.
- Ensure adherence to governance policies and data access controls across projects.

To excel in this role, you should possess the following qualifications and skills:
- A Bachelor's degree in Computer Science, Information Systems, or a related field.
- Minimum of 4 years of experience in designing and deploying scalable data pipelines in cloud environments.
- Proficiency in Python, SQL, and data manipulation tools and frameworks such as Apache Airflow, Spark, dbt, and Pandas.
- Practical experience with data lakes, data warehouses (e.g., Redshift, Snowflake, BigQuery), and streaming platforms (e.g., Kafka, Kinesis).
- Strong understanding of data modeling, schema design, and data transformation patterns.
- Experience with AWS (Glue, S3, Redshift, SageMaker) or Azure (Data Factory, Azure ML Studio, Azure Storage).
- Familiarity with CI/CD for data pipelines and infrastructure-as-code (e.g., Terraform, CloudFormation).
- Exposure to building data solutions that support AI/ML pipelines, including feature stores and real-time data ingestion.
- Understanding of observability, data versioning, and pipeline testing tools.
- Previous engagement with diverse stakeholders, data requirement gathering, and support for iterative development cycles.
- A background in or familiarity with the Power, Energy, or Electrification sector is advantageous.
- Knowledge of security best practices and data compliance policies for enterprise-grade systems.

This position is based in Bangalore, offering you the opportunity to collaborate with teams that impact entire cities and countries and shape the future. Siemens is a global organization comprising over 312,000 individuals across more than 200 countries. We are committed to equality and encourage applications from diverse backgrounds that mirror the communities we serve. Employment decisions at Siemens are made based on qualifications, merit, and business requirements. Join us with your curiosity and creativity to help shape a better tomorrow.

Learn more about Siemens careers at: www.siemens.com/careers
Discover the digital world of Siemens here: www.siemens.com/careers/digitalminds

Posted 1 month ago

Apply

2.0 - 4.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Technology Job Family Group: IT&S Group Job Description: You will work with You will be part of a collaborative team of engineers and product managers, working alongside technology and business partners to support data initiatives that contribute to bps digital transformation and platform capabilities. Let me tell you about the role As a Data Visualization Platform Engineer, you will support the development, integration, and security of data platforms that power enterprise applications. You will work closely with engineers and architects to help maintain performance, resilience, and compliance across bps cloud and data ecosystems. This role is a great opportunity to grow your platform engineering skills while contributing to real-world solutions. What you will deliver Assist in platform engineering activities including configuration, integration, and maintenance of enterprise data systems. Support CI/CD implementation and Infrastructure-as-Code adoption to improve consistency and efficiency. Help monitor and improve platform performance, availability, and reliability. Collaborate on basic security operations, including monitoring, identity access controls, and remediation activities. Participate in the delivery of data pipelines and platform features across cloud environments. Contribute to documentation, testing, and process improvements across platform workflows. Work with teammates to ensure data systems meet compliance, governance, and security expectations! What you will need to be successful (experience and qualifications) Technical Skills We Need From You Bachelors degree in technology, engineering, or a related fieldor equivalent hands-on experience. 24 years of experience in IT or platform/data engineering roles. Familiarity with CI/CD tools and Infrastructure-as-Code (e.g., Terraform, Azure Bicep, or AWS CDK). Basic experience with Python, Java, or Scala for scripting or automation. Exposure to data pipeline frameworks (e.g., Airflow, Spark, Kafka) and cloud platforms (AWS, Azure, or GCP). Understanding of data modeling, data lakes, SQL/NoSQL databases, and cloud-native tools. Ability to work collaboratively with cross-functional teams and follow structured engineering practices. Essential Skills Proven technical expertise in Microsoft Azure, AWS, Databricks, and Palantir. Understanding of data pipelines, ingestion, and transformation workflows. Awareness of platform security fundamentals and data governance principles. Familiarity with data visualization concepts and tools (e.g., Power BI, Tableau, or similar). Exposure to distributed systems and working with real-time or batch data processing frameworks. Willingness to learn and adapt to evolving technologies in data engineering and platform operations. Skills That Set You Apart Proven success navigating global, highly regulated environments, ensuring compliance, security, and enterprise-wide risk management. AI/ML-driven data engineering expertise, applying intelligent automation to optimize workflows. About Bp Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner! 
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. Even though the job is advertised as full time, please contact the hiring manager or the recruiter, as flexible working arrangements may be considered.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSql data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management + 4 more

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position, and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 1 month ago

Apply

2.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Technology
Job Family Group: IT&S Group

Job Description:

You will work with: You will work as a member of a high-energy, top-performing team of engineers, working alongside technology leaders to shape the vision and drive the execution of ground-breaking compute and data platforms that make a real impact.

Let me tell you about the role: As an Azure Platform Operations Engineer, you will be responsible for the monitoring, maintenance, and support of cloud solutions using various cloud services and tools. This role is part of a highly focused squad that uses several agile methodologies and techniques to ensure performance, reliability, and operational excellence across multiple facets of the cloud simultaneously.

What you will deliver:
- Maintain and develop scripts and code to automate infrastructure provisioning, monitoring, and configuration using Infrastructure-as-Code (IaC) principles and best practices.
- Monitor and optimize the capacity, performance, and cost of cloud resources based on business needs and budget constraints.
- Ingest and manage persistent data for logging and audit purposes while ensuring data security and compliance.
- Support the maintenance and evolution of cloud solutions: resolving issues, reusing code, improving efficiency, and adopting modern technologies.
- Configure and manage network connectivity, control planes, and internal resource communication across cloud and hybrid environments.
- Support operational excellence by applying engineering best practices, tooling, testing frameworks, and effective written and verbal communication.
- Implement operational cloud security controls, including Zero Trust, IAM, encryption, firewalls, and thorough code reviews, especially for AI-generated code or configurations.

What you will need to be successful (experience and qualifications):
- A bachelor's degree in computer science, engineering, or a related field, or equivalent work experience.
- 2 to 5 years of experience in IT, including up to 2 years as a Cloud Operations Engineer or in a similar role.
- Proficiency in scripting and coding languages such as PowerShell, Python, or C#.
- Strong knowledge of core cloud services, including virtual machines, containers, PaaS offerings, monitoring, storage, and networking.
- Experience with CI/CD tools such as Azure DevOps (ADO) or similar platforms for continuous integration and delivery.
- Familiarity with data platforms including SQL Server, data lakes, and PaaS-based databases.
- Ability to work both independently and collaboratively within cross-functional teams.

About bp: Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.
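As a flavour of the scripting this role calls for, the sketch below uses Python with the azure-identity and azure-mgmt-compute SDK packages to inventory the virtual machines in a subscription, the kind of listing a capacity or cost review might start from. The environment-variable name is an assumption of the example, not a requirement of the posting.

import os
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

# Hypothetical: the target subscription id is supplied via an environment variable.
subscription_id = os.environ["AZURE_SUBSCRIPTION_ID"]

# DefaultAzureCredential picks up managed identity, CLI login, or env credentials.
credential = DefaultAzureCredential()
compute = ComputeManagementClient(credential, subscription_id)

# List every VM in the subscription with its region and size,
# a starting point before resizing or deallocating underused resources.
for vm in compute.virtual_machines.list_all():
    size = vm.hardware_profile.vm_size if vm.hardware_profile else "unknown"
    print(f"{vm.name} ({vm.location}): {size}")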
Even though the job is advertised as full time, please contact the hiring manager or the recruiter, as flexible working arrangements may be considered.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSql data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management + 4 more

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position, and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

hyderabad, telangana

On-site

As a GCP Developer, you will be responsible for maintaining the stability of production platforms, delivering new features, and minimizing technical debt across various technologies. A minimum of 4 years of experience in the field is required, along with a strong commitment to high standards and a genuine passion for quality work. Proficiency in GCP and cloud technologies, Python, Hadoop, Spark, Scala, streaming (Pub/Sub), Kafka, SQL, Dataproc, and Dataflow is essential for this role. Familiarity with data warehouses, distributed data platforms, and data lakes is also required, as is knowledge of database definition, schema design, and Looker views and models. A solid understanding of data structures and algorithms is crucial for success in this position, and experience with CI/CD practices would be advantageous. This position involves working in a dynamic environment across multiple locations, including Chennai, Hyderabad, and Bangalore. A total of 20 positions are available for qualified candidates.
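For illustration of the streaming (Pub/Sub) skill listed above, here is a minimal Python subscriber following the pattern documented for the google-cloud-pubsub client library. The project and subscription names are hypothetical placeholders.

from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1

# Hypothetical identifiers for illustration only.
project_id = "my-gcp-project"
subscription_id = "orders-subscription"

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(project_id, subscription_id)

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # Process the payload, then acknowledge so Pub/Sub stops redelivering it.
    print(f"Received: {message.data!r}")
    message.ack()

streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
with subscriber:
    try:
        # Block for a bounded time; a production service would run indefinitely.
        streaming_pull_future.result(timeout=30)
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()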

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As a member of the Global IT SAP Team at Shure, you will play a vital role in driving business transformation and maximizing business value through the implementation and support of SAP solutions. Reporting to the Associate Director, SAP Business Applications Finance, you will collaborate with internal IT associates and business users globally to build, enhance, and support solutions that align with industry best practices and technology trends.

Your responsibilities will include contributing to requirement gathering, solution design, configuration, testing, and implementation of end-to-end solutions. You will work closely with business stakeholders to understand requirements, provide deep SAP functional expertise, and analyze key integration points. Adhering to IT guiding principles, you will focus on leveraging standard processes, minimizing customization, and driving positive customer experiences. As an SAP Senior Analyst, Finance, you will stay updated on evolving SAP technologies, propose innovative solutions, and provide impact analysis on enhancements or new solutions. Additionally, you will offer application support, collaborate with the SAP development and security teams, and ensure compliance with security and data standards.

To qualify for this role, you should hold a Bachelor's degree in Finance, Computer Science, or a related field, with a minimum of 5 years of experience in enterprise systems implementation, specifically in SAP FICO on S/4HANA. Experience with data warehousing platforms and tools is desirable, along with a strong understanding of SAP modules, technical components, and project management methodologies. Key competencies for success in this role include adaptability, critical thinking, customer focus, decision quality, communication, leadership, drive for results, integrity, relationship building, analytical skills, teamwork, collaboration, and influence. Your ability to quickly learn new concepts, follow operational policies, and travel to remote facilities when required will be essential.

At Shure, we are committed to being the most trusted audio brand worldwide, driven by our Core Values of quality, reliability, and innovation. If you are passionate about creating an inclusive and diverse work environment and possess the skills to excel in this role, we encourage you to apply and join our team.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

As a Senior Data Product Owner, you will be responsible for the preparation, coordination, and delivery tracking of projects focused on data and artificial intelligence. Your role will be to ensure the design and delivery of innovative, data-driven solutions, in collaboration with technical teams, business teams, and clients.

Your main responsibilities will include scoping data and AI requirements, defining functional specifications, overseeing design and development, running the project in an agile way, and ensuring the quality and performance of the delivered solutions. You will also be the primary point of contact for clients on their data/AI projects, guaranteeing strategic alignment between their objectives and the proposed solutions.

The ideal profile for this position includes a degree in engineering, computer science, or a field related to data/AI, with at least 5 years of experience managing data or AI projects, preferably in an Agile environment. You should have expertise in data and AI and a good understanding of the associated tools and concepts, as well as product management and communication skills. Professional proficiency in English is also required to interact with international clients and teams.

By joining EY FABERNOVEL, you will have the opportunity to work on large-scale projects, receive support in developing your career, and enjoy attractive benefits such as access to exclusive offers, meal allowances, remote-working options, reimbursement of transport costs, and a stimulating work environment conducive to continuous learning.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

