3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
Role Overview:
As a Data Quality Engineer, you will play a crucial role in ensuring the integrity, accuracy, and reliability of the organization's data assets. Your expertise in data engineering, data quality engineering, and cloud technologies on Azure, GCP, or AWS will be vital. Exposure to artificial intelligence (AI) concepts and tools will also be beneficial for this role.

Key Responsibilities:
- Collaborate with data owners and stakeholders to define and enforce data quality standards.
- Develop and implement data cleansing strategies to address identified data quality issues.
- Use data profiling tools and techniques to analyze data patterns and identify potential quality issues.
- Optimize data architectures for improved data quality and performance.
- Use cloud platforms such as Azure, AWS, or GCP to deploy and manage data quality solutions.
- Leverage cloud services for scalable storage, processing, and analysis of large datasets.
- Document data quality issues, root causes, and remediation steps.
- Generate regular reports on data quality metrics and communicate findings to relevant stakeholders.
- Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data quality requirements and implement effective solutions.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Proven experience as a Data Quality Engineer or in a similar role.
- Strong knowledge of data engineering principles and best practices.
- Hands-on experience with cloud platforms, especially Azure.
- Familiarity with AI concepts and tools is preferred.
- Proficiency in programming languages such as Python, SQL, or Java.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
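To make the profiling work above concrete, here is a minimal Python sketch of the kind of data-quality checks the role describes, using pandas; the table, columns, and metric choices are illustrative assumptions, not taken from the posting:

```python
# Hypothetical example: basic data-quality profiling with pandas.
# Table and column names are illustrative, not from any real system.
import pandas as pd

def profile_quality(df: pd.DataFrame, key_column: str) -> dict:
    """Return simple completeness and uniqueness metrics for a dataset."""
    total = len(df)
    return {
        "row_count": total,
        # Completeness: share of non-null values per column.
        "completeness": (df.notna().sum() / total).round(3).to_dict(),
        # Uniqueness: duplicate rate on the declared business key.
        "duplicate_key_rate": round(df[key_column].duplicated().mean(), 3),
    }

if __name__ == "__main__":
    customers = pd.DataFrame({
        "customer_id": [1, 2, 2, 4],
        "email": ["a@x.com", None, "c@x.com", "d@x.com"],
    })
    print(profile_quality(customers, key_column="customer_id"))
```

In practice, metrics like these would be logged per pipeline run and surfaced in the data quality reports the posting mentions.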
Posted 5 days ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
You are a highly skilled Snowflake Developer with extensive experience in designing, implementing, and managing Snowflake-based data solutions. Your role will involve developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.

Key Responsibilities:
- Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
- Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data.
- Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
- Manage data ingestion, transformation, and loading processes to ensure data quality and performance.
- Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals.
- Drive continuous improvement by leveraging the latest Snowflake features and industry trends.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
- 8+ years of experience in data architecture, data engineering, or a related field.
- Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
- Exposure to Airflow is also required.
- Proven track record of contributing to data projects and working in complex environments.
- Familiarity with cloud platforms (e.g., AWS, GCP) and their data services is preferred.
- A Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) would be a plus.
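For context on the Snowflake-plus-Airflow combination the posting calls for, below is a hedged sketch of a daily Airflow DAG that bulk-loads staged files into Snowflake; it assumes Airflow 2.4+ and the snowflake-connector-python package, and all account, stage, and table names are placeholders:

```python
# Hedged sketch (not from the posting): a daily Airflow DAG that runs
# a Snowflake COPY INTO bulk load. Credentials and object names are
# placeholders; assumes Airflow 2.4+ and snowflake-connector-python.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def copy_into_snowflake():
    import snowflake.connector  # imported here so the DAG file parses fast

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
    )
    try:
        # COPY INTO is Snowflake's standard bulk-load command.
        conn.cursor().execute(
            "COPY INTO RAW.ORDERS FROM @ORDERS_STAGE "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
    finally:
        conn.close()

with DAG(
    dag_id="load_orders_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="copy_into_snowflake",
                   python_callable=copy_into_snowflake)
```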
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
kochi, kerala
On-site
You are a skilled Cloud Data Engineer with over 3 years of experience working with cloud data platforms such as AWS or Azure, with a particular focus on Snowflake and dbt. As a Consultant, you will develop new data platforms, create data processes, and collaborate with cross-functional teams to design, develop, and deploy high-quality solutions.

Your responsibilities will include:
- Consulting with customers to develop data-driven products in the Snowflake Cloud.
- Connecting data & analytics with specialist departments.
- Developing ELT processes using dbt (data build tool).
- Specifying requirements for future-proof cloud data architectures.
- Designing scalable data management processes.
- Analyzing data sets to derive sound findings and presenting them effectively.

To excel in this role, you must have:
- Experience in successfully implementing cloud-based data & analytics projects.
- Proficiency in DWH/data lake concepts and modeling with Data Vault 2.0.
- Extensive knowledge of Snowflake, dbt, and other cloud technologies (e.g., MS Azure, AWS, GCP).
- A strong command of SQL.
- Familiarity with data management topics such as master data management and data quality.
- A Bachelor's degree in computer science or a related field.
- Strong communication and collaboration abilities to work effectively in a team environment.

If you meet these requirements and are interested in joining our dynamic team, kindly share your CV with m.neethu@ssconsult.in.
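As an illustration of how an orchestration layer typically wraps the dbt ELT work described above, here is a minimal Python sketch that shells out to the dbt CLI; the project layout is an assumption and the commands simply mirror dbt's standard `run` and `test` steps:

```python
# Hedged sketch: driving dbt ELT builds and data-quality tests from Python.
# Project paths are illustrative; assumes the dbt CLI is installed.
import subprocess

def run_dbt(command: str, project_dir: str = ".") -> None:
    """Run a dbt CLI command and fail loudly if any model or test breaks."""
    result = subprocess.run(
        ["dbt", command, "--project-dir", project_dir],
        capture_output=True, text=True,
    )
    print(result.stdout)
    if result.returncode != 0:
        raise RuntimeError(f"dbt {command} failed:\n{result.stderr}")

if __name__ == "__main__":
    run_dbt("run")   # build the models (ELT transformations in the warehouse)
    run_dbt("test")  # enforce data-quality tests defined alongside the models
```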
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Lead QA Engineer at Kenvuepro, you will be responsible for leading automation efforts and establishing a robust website testing framework. Your role will involve ensuring that our web applications are high-performing, reliable, and thoroughly tested. The ideal candidate will have experience working with Contentful as a CMS, Storybook for design components, ReactJS, and integrating automated testing into a CI/CD pipeline. Detail-oriented, proactive, and passionate individuals who are committed to creating quality digital experiences for users will excel in this role.

You will work with a global team of developers to build capabilities on our global digital hub serving healthcare professionals. Collaborating closely with the Global Technical Owner & Scrum Master, you will ensure that products are technically sound, meet business requirements, and are delivered on time and within budget. The role requires a strong technical background, hands-on experience with Salesforce and Salesforce Marketing Cloud, and expertise in third-party integrations and data architectures. Collaboration with business stakeholders to achieve marketing objectives is also a key aspect of this role.

Key Responsibilities:
- Validate and maintain an automation framework tailored for a CMS-driven, ReactJS-based web platform.
- Develop automated tests for integration, functional, and end-to-end testing to ensure thorough coverage for web components and content-driven pages.
- Integrate automated tests into the CI/CD pipeline, optimizing for performance and reliability, and implementing solutions to prevent flaky tests.
- Collaborate with developers and product managers to develop a testing strategy aligned with project goals and timelines.
- Leverage Storybook to test individual React components in isolation and validate them within the test suite.
- Develop strategies for handling dynamic content testing with a content management system, including test data management and effective mocking.
- Develop automated workflows to validate environment upgrades.
- Implement cross-browser, responsive, and performance testing to ensure consistent user experiences across devices and regions.
- Actively monitor test results, troubleshoot issues, and improve test reliability, communicating findings to development and management teams.
- Document testing processes, framework architecture, and test results for ongoing maintainability and knowledge sharing across teams.

Qualifications:
- Bachelor's degree in a computer science or information technology discipline.
- 6+ years of experience in automated testing, focusing on framework development and test automation for web applications.
- Experience with cross-browser and responsive testing tools such as Sauce Labs.
- Familiarity with performance testing tools such as Lighthouse or WebPageTest.
- Familiarity with visual testing tools such as Applitools.
- Experience with monitoring tools to maintain automation framework health.

Technical Skills:
- Proficiency with automation tools such as Vividus, Playwright, or Lighthouse.
- Proficiency in JavaScript/TypeScript/NextJS environments.
- Solid understanding of other CMS platforms and experience with API-based testing for CMS-driven applications.
- Experience with Storybook for component testing, plus React Testing Library and Jest for unit and integration testing.
- Deep understanding of CI/CD pipeline integration and tools such as Jenkins, GitHub Actions, or GitLab CI.
- Strong experience with test design, creating scalable test architectures, and managing test data.
- Experience with a Customer Data Platform.
- Familiarity with Jira and Xray.

Other Requirements:
- Excellent communication, collaboration, and critical thinking skills with attention to detail.
- Passion for learning new technologies and enhancing existing team processes.
- Ability to build custom tooling and documentation for manual testers.

Location: Bengaluru, India

If you are a proactive, detail-oriented individual with a passion for quality digital experiences and a strong technical background, we invite you to join us at Kenvuepro. Shape our future and yours while impacting the lives of millions of people every day.
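To illustrate the kind of end-to-end check this role involves, here is a hedged sketch using Playwright's Python API; the posting's stack is JavaScript/TypeScript-based, so this is illustrative only, and the URL and selector are placeholders:

```python
# Hypothetical end-to-end check with Playwright's Python API.
# pip install playwright && playwright install chromium
from playwright.sync_api import sync_playwright, expect

def test_homepage_hero_renders():
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto("https://example.com")  # placeholder URL
        # Asserting on a stable, user-visible element (with Playwright's
        # built-in auto-waiting) is the usual way to avoid flaky tests.
        expect(page.get_by_role("heading", level=1)).to_be_visible()
        browser.close()

if __name__ == "__main__":
    test_homepage_hero_renders()
```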
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
As a Power BI Architect - Data Engineer, you will play a crucial role in designing, implementing, and managing comprehensive business intelligence solutions, with a focus on data modeling, report development, and data security and compliance. Working within high-performing, collaborative teams, you will present data migration solutions and influence key stakeholders in client groups. Your expertise will help clients drive toward strategic data architecture goals by enhancing the coherence, quality, security, and availability of the organization's data assets through the development of data migration roadmaps.

Your responsibilities will include:
- Designing and leading real-time data architectures for large volumes of information.
- Implementing integration flows with Data Lakes and Microsoft Fabric.
- Optimizing and governing tabular models in Power BI, ensuring high availability, security, and scalability.
- Coordinating data quality standards with a focus on DataOps for continuous deployments and automation.

To be successful in this role, you should have demonstrable experience in master data management and at least 7 years of experience in designing and implementing BI solutions and data architectures. You must possess advanced modeling skills, proficiency in DAX, and expertise in optimization and governance. Strong knowledge of Data Lake, Microsoft Fabric, and real-time ingestion methods is essential, as is hands-on experience with Python or R for data manipulation, transformation, and automation. You should also have proven experience in tabular modeling, DAX queries, and report optimization in Power BI.

The ability to plan, define, estimate, and manage the delivery of work packages will be crucial, as will excellent communication skills and the flexibility to respond to varied program demands. You should have a deep understanding of key technical developments in your area of expertise and be able to lead the definition of information and data models, data governance structures, and processes. Experience working in complex environments across multiple business and technology domains is preferred, along with the ability to bridge the gap between functional and non-functional teams.
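As one example of the Python-based automation this posting mentions, here is a hedged sketch that queues a Power BI dataset refresh through the Power BI REST API; it assumes an Azure AD access token with dataset permissions is already available, and the workspace and dataset IDs are placeholders:

```python
# Hedged sketch: trigger a Power BI dataset refresh via the REST API.
# Assumes a valid Azure AD bearer token; all IDs are placeholders.
import requests

def refresh_dataset(group_id: str, dataset_id: str, token: str) -> None:
    url = (
        "https://api.powerbi.com/v1.0/myorg/"
        f"groups/{group_id}/datasets/{dataset_id}/refreshes"
    )
    resp = requests.post(url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()  # HTTP 202 means the refresh was queued

if __name__ == "__main__":
    refresh_dataset("<workspace-id>", "<dataset-id>", "<aad-token>")
```

In a DataOps setup like the one described, a call of this shape would typically run at the end of an ingestion pipeline so reports reflect the freshest data.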
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Consultant with a focus on Data at AIONEERS-EFESO, an integral part of EFESO Management Consultants, you will play a crucial role in optimizing supply chains and helping businesses achieve best-in-class standards. You will work on customer and internal projects focused on Data for Supply Chain Planning and Supply Chain Analytics, with tasks including analyzing data, implementing data governance and management practices, and standardizing processes to enhance supply chain efficiency.

Throughout projects, your main focus will be on developing master data models, monitoring the effectiveness of data quality enhancements, and presenting these improvements through dashboards in Power BI. Acting as the liaison between business and IT, you will contribute to the design, implementation, and integration of data into Advanced Planning Systems (APS), sourced from ERP systems such as SAP, Oracle, or Microsoft Dynamics. Your role will also entail leading subprojects independently, guiding colleagues and stakeholders in defining and executing data architectures, and implementing data visualization tools to communicate analytical findings effectively.

To excel in this position, you should have at least 3 years of professional experience in consulting, analytics, Master Data Management, or Supply Chain Management. Strong expertise in SAP, Power BI, or similar analytics tools is essential, along with a relevant degree in a field such as Supply Chain, IT, industrial engineering, or business administration. You should be able to analyze, build, and maintain various data structures, ensuring seamless integration with Supply Chain Planning systems such as BY (Blue Yonder), o9, and Kinaxis. Proficiency in BI tools such as Power BI, QlikView, Qlik Sense, or Tableau is required for effective data visualization and reporting.

Furthermore, experience in software implementation projects and familiarity with the Scrum methodology are advantageous. Proficiency in MS Office products, particularly MS Excel and PowerPoint, is necessary. Knowledge of end-to-end Supply Chain Planning processes, with a focus on data quality, governance, and standardization, would be a significant asset. Fluency in English is a must, with additional language skills in German, French, or Italian being beneficial. A keen interest in new technologies and digital solutions, coupled with strong interpersonal and communication skills, will further enhance your suitability for this role.

At AIONEERS-EFESO, you will have the opportunity to become a thought leader in digital supply chain transformation. The company offers a supportive team culture, flexible work hours, respect for your ideas, open discussions, attractive remuneration, paid maternity and paternity leave, comprehensive insurance plans, sponsored certifications in relevant technology areas, and an office at a prime location in Mannheim. The focus is on your results rather than hours worked, giving you the chance to actively contribute to innovative business strategies on a global scale. Join us in reshaping supply chain management and crafting your own success story.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
haryana
On-site
About KPMG in India
KPMG entities in India are professional services firm(s) affiliated with KPMG International Limited, established in India in August 1993. Our professionals leverage the global network of firms and possess in-depth knowledge of local laws, regulations, markets, and competition. With offices across India in cities including Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Jaipur, Hyderabad, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara, and Vijayawada, KPMG entities in India serve national and international clients across various sectors. We aim to provide rapid, performance-based, industry-focused, and technology-enabled services that reflect our understanding of global and local industries and our experience of the Indian business environment.

Job Description:
We are looking for a highly skilled Senior Data AWS Solutions Engineer to join our dynamic team. The ideal candidate has substantial experience in designing and implementing data solutions on AWS, using cloud services to drive business outcomes, and hands-on experience implementing solutions such as a Data Lake, or involvement in technical architecture reviews and discussions. Previous knowledge of the automobile or banking sector would be advantageous.

Key Responsibilities:
- Design, develop, and implement scalable data architectures on AWS.
- Collaborate with cross-functional teams to gather requirements and translate them into technical solutions.
- Optimize data storage and processing using AWS services such as S3, Glue, RDS, and Lambda.
- Ensure data security, compliance, and best practices in data management.
- Troubleshoot and resolve data-related issues and performance bottlenecks.
- Mentor junior engineers and contribute to team knowledge sharing.
- Stay current with the latest AWS services and industry trends to recommend improvements.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 8+ years of experience in data engineering or solutions architecture, focusing on AWS.
- Proficiency in AWS data services, ETL tools, and data modeling techniques.
- Strong programming skills in Python, PySpark, Java, or similar languages.
- Experience with data warehousing and big data technologies (e.g., Hadoop, Spark).
- Excellent problem-solving, consulting, and analytical skills.
- Strong communication and collaboration abilities.
- Experience with stakeholder management and leading teams.

Preferred Qualifications:
- AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified Data Analytics).
- Experience with containerization (Docker, Kubernetes).
- Knowledge of machine learning and data science principles.

Join us to apply your expertise in AWS and data engineering to develop innovative solutions that drive our business forward.
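To ground the responsibilities above, here is a minimal PySpark sketch of the S3-based ETL pattern described; the bucket names, columns, and aggregation are illustrative assumptions, and on AWS a job of this shape would typically run on Glue or EMR with S3 credentials configured:

```python
# Hedged sketch: a PySpark job that reads raw JSON from S3, aggregates,
# and writes partitioned Parquet back. All names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

orders = spark.read.json("s3a://raw-bucket/orders/")  # assumed input layout

daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"))
)

# Partitioning by date keeps downstream scans cheap.
daily_revenue.write.mode("overwrite").partitionBy("order_date") \
    .parquet("s3a://curated-bucket/daily_revenue/")
```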
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
You are a highly skilled Snowflake Developer with extensive experience in designing, implementing, and managing Snowflake-based data solutions. Your role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.

Key Responsibilities:
- Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
- Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data.
- Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
- Manage data ingestion, transformation, and loading processes to ensure data quality and performance.
- Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals.
- Drive continuous improvement by leveraging the latest Snowflake features and industry trends.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
- 6+ years of experience in data architecture, data engineering, or a related field.
- Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
- Exposure to Airflow is a must.
- Proven track record of contributing to data projects and working in complex environments.
- Familiarity with cloud platforms (e.g., AWS, GCP) and their data services.
- Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You should have a Bachelor's degree (or above) in engineering/computer science with 5-8 years of experience in Java/Kotlin development, advanced knowledge of object-oriented design and data architectures, and the communication skills to engage with all levels of the organization. Your role will involve coordinating and participating in activities such as analysis, scoping, design, coding, code reviews, test case reviews, defect management, implementation planning and execution, and support. You should be adept at managing resources and timelines and at removing obstacles in a timely manner.

Your technical expertise should include Core Java, Kotlin, Kotlin coroutines, Kafka, Vert.x, Node.js, JavaScript, jQuery, AJAX, and asynchronous programming. Experience in object-oriented analysis and design, domain-driven design, and design patterns is essential, along with proficiency in tools such as Maven, Jenkins, and Git. You should have a strong background in data-driven applications using relational database engines and/or NoSQL databases (e.g., MySQL, Oracle, SQL Server, MongoDB, Cassandra), excellent debugging and testing skills, and the ability to conduct performance and scalability analysis. Stay current with technology and product trends, including open source developments.

Your responsibilities will include working on large-scale distributed systems, solving problems, multitasking, and collaborating effectively in a fast-paced environment. You must focus on implementing solutions that adhere to industry standards and promote reuse. Understanding system performance and scaling is key, as is ensuring the stability, interoperability, portability, security, and scalability of the Java system architecture. Your role will involve working in an agile, collaborative setup within the engineering team, emphasizing partnership and teamwork across teams. Your ability to influence others, develop consensus, and work toward common goals will be essential for success in this position.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
You are not the person who will settle for just any role. Neither are we. Because we're out to create Better Care for a Better World, and that takes a certain kind of person and teams who care about making a difference. Here, you'll bring your professional expertise, talent, and drive to building and managing our portfolio of iconic, ground-breaking brands. In this role, you'll help us deliver better care for billions of people around the world. It starts with YOU.

In this role, you will be responsible for the following key areas:

M&A Strategy & Technology Due Diligence
- Partner with corporate development, DTS, and business leaders to assess technology landscapes, risks, and synergies in M&A deals.
- Conduct IT due diligence on target companies, evaluating enterprise systems, cloud platforms, cybersecurity, and data architectures.
- Develop IT integration or separation strategies, identifying opportunities for consolidation, cost optimization, and innovation.

Post-Merger Integration & IT Harmonization
- Design and execute enterprise architecture roadmaps for post-merger IT integration or separation, covering applications, infrastructure, data, and security, ensuring alignment with business objectives and measurable progress through well-defined phases, timelines, and milestones.
- Develop blueprints for standardizing platforms, ensuring interoperability, and optimizing business capabilities.
- Establish frameworks for system rationalization, ensuring efficient technology consolidation while minimizing disruption.
- Drive cloud migration, ERP integration, and digital transformation initiatives in alignment with business objectives.
- Provide oversight and collaborate with project teams to ensure deployed solutions align with architecture standards.

Governance, Compliance & Risk Management
- Define architecture standards, policies, and best practices to ensure secure and compliant IT integrations.
- Address regulatory, data privacy, and cybersecurity requirements during M&A transitions.
- Collaborate with IT security and compliance teams to mitigate risks in technology transitions.

Kimberly-Clark is known for legendary brands such as Huggies, Kleenex, Cottonelle, and more, used by millions of people every day. At Kimberly-Clark, you'll be part of a team committed to driving innovation, growth, and impact, with a history of over 150 years of market leadership. As part of the team, you'll have the opportunity to explore new ideas and ways to achieve results while enjoying flexible work arrangements.

To succeed in this role, you must have:
- 7+ years of experience in enterprise architecture.
- 4+ years in M&A technology strategy and integration.
- Strong expertise in IT due diligence, post-merger IT integration, and technology rationalization.
- Experience with cloud transformation (AWS, Azure, GCP), ERP systems (SAP, Oracle, Workday), and modern application architectures.
- A deep understanding of data governance, cybersecurity, compliance, and risk management in M&A scenarios.
- A proven ability to drive large-scale technology harmonization efforts across complex IT environments.
- Strong leadership, stakeholder management, and communication skills, with the ability to influence executive decision-making.
- A Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Fluency in English.

Kimberly-Clark offers great benefits, including support for good health, diverse income protection insurance options, flexible savings accounts, and additional programs for education, relocation, and more. If you are interested in this role, click the Apply button and complete the online application process. We look forward to reviewing your application and considering you for this exciting opportunity at Kimberly-Clark.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Data Architect at Diageo, you will play a crucial role in transforming our business capabilities through data and digital technology. You will analyze the overall IT landscape and its various technologies to ensure seamless integration. Your expertise in data modeling, schema design, and data architectures will be essential in driving our enterprise data management, data warehouse, and business intelligence initiatives.

You will review data models for completeness, quality, and adherence to established architecture standards. Your ability to compare and recommend tools and technologies will be instrumental in enhancing our data management processes, and your proficiency in metadata maintenance and data catalog management will contribute to the overall efficiency of our data systems.

Preferred qualifications for this role include experience with the Databricks Lakehouse architecture, expertise in working with file formats such as Parquet, ORC, Avro, Delta, and Hudi, exposure to CI/CD tools such as Azure DevOps, and knowledge of and experience with Azure data offerings.

If you are passionate about leveraging data and technology to drive business growth and innovation, and if you thrive in a dynamic and collaborative environment, we invite you to join our team at Diageo. Your contributions will play a key role in shaping the future of our digital and technology initiatives.
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
hyderabad, telangana
On-site
As a Solution Architect with 10 to 14 years of experience, you will collaborate with sales, presales, and COE teams to provide technical expertise and support throughout the new business acquisition process. Your role is crucial in understanding customer requirements, presenting solutions, and demonstrating product value. You excel in high-pressure environments, maintain a positive outlook, and recognize that career growth requires strategic choices. Strong written and verbal communication skills allow you to convey complex technical concepts clearly. As a team player who is customer-focused, self-motivated, and responsible, you can work under pressure with a positive attitude. Experience in managing RFPs/RFIs, client demos, and presentations, and in converting opportunities into winning bids, is essential. A strong work ethic, enthusiasm for new challenges, and the ability to multitask and prioritize are key. You can work independently with minimal supervision and bring a process-oriented, quality-first approach. Your performance as a Solution Architect will be measured by your ability to convert clients' business challenges into winning proposals through excellent technical solutions.

In this role, you will:
- Develop high-level architecture designs for scalable, secure, and robust solutions.
- Select appropriate technologies, frameworks, and platforms for business needs.
- Design cloud-native, hybrid, or on-premises solutions using AWS, Azure, or GCP.
- Ensure seamless integration between enterprise applications, APIs, and third-party services.
- Design and develop scalable, secure, and performant data architectures on cloud platforms.
- Translate business needs into technical solutions by designing secure, scalable, and performant data architectures.
- Recommend and implement data models, data services, and data governance practices.
- Design and implement data pipelines for efficient data extraction, transformation, and loading processes.

Requirements:
- 10+ years of experience in data analytics and AI technologies.
- Certifications in data engineering, analytics, cloud, or AI are advantageous.
- A Bachelor's degree in engineering/technology or an MCA from a reputed college is required.
- Prior experience as a solution architect during the presales cycle is beneficial.

Location: Hyderabad, Ahmedabad, Indore
Experience: 10 to 14 years
Joining Time: Maximum 30 days
Work Schedule: All days, work from office
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
haryana
On-site
You will be part of Boston Consulting Group, collaborating with leaders in business and society to address critical challenges and seize opportunities. BCG, a pioneer in business strategy since 1963, now specializes in total transformation: facilitating complex change, fostering organizational growth, establishing competitive advantage, and driving bottom-line impact. Success requires a combination of digital and human capabilities, and our diverse global teams offer deep industry knowledge and varied perspectives to instigate change. BCG delivers solutions through cutting-edge management consulting, technology and design, corporate and digital ventures, and business purpose. Our collaborative model spans the entire organization, yielding results that empower our clients to prosper.

In this role, you will design solution architecture for AI- and Gen AI-enabled applications across a wide range of functions. This includes architecting solutions for production-grade AI applications such as RAG, call analytics, conversational bots (text and voice), AI agents, and AI-based search. You will evaluate Agentic AI platforms from hyperscalers, AI-native players, and SaaS providers, integrating them with systems such as dialer/telephony platforms, chat platforms, mobile/digital apps, and CRM applications.

Your responsibilities will also involve designing data platforms for Gen AI that can handle unstructured and streaming data in real time across multiple modalities (text, voice, video, images, etc.). Using Gen AI across the SDLC lifecycle to enhance Tech function effectiveness is a key aspect of this role. Additionally, you will design cloud-based AI applications and data architectures for AI applications, leveraging vector databases, graph databases, RDS, and NoSQL databases depending on the use case. Experience in event-driven architectures, microservices, and digital platforms is essential, along with a minimum of 10 years of overall experience and specific experience in Gen AI.

Boston Consulting Group is an equal opportunity employer, committed to fostering diversity and inclusion in the workplace.
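To make the RAG piece concrete, here is a minimal sketch of the retrieval step using plain numpy cosine similarity; the embed() function is a stand-in for whatever embedding model a real system would call, and the passages are invented:

```python
# Hedged sketch of RAG retrieval: embed the query, score it against
# pre-computed document embeddings with cosine similarity, and return
# the top-k passages that would ground the LLM prompt.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder: a real system would call an embedding model/API here.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(8)

def top_k(query: str, docs: list[str], k: int = 2) -> list[str]:
    doc_vecs = np.stack([embed(d) for d in docs])
    q = embed(query)
    # Cosine similarity: dot product over the product of vector norms.
    scores = doc_vecs @ q / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q)
    )
    return [docs[i] for i in np.argsort(scores)[::-1][:k]]

if __name__ == "__main__":
    passages = ["refund policy ...", "shipping times ...", "warranty terms ..."]
    print(top_k("how long does delivery take?", passages))
```

Production systems would swap the brute-force scan for a vector database, which is exactly the Vector DB evaluation work the posting describes.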
Posted 2 weeks ago
8.0 - 10.0 years
0 Lacs
mumbai, maharashtra, india
On-site
Staff Data Engineer

Are you excited about the opportunity to contribute to an industry leader in energy technology? As a Staff Data Engineer, you will play a pivotal role in ensuring the highest standards of quality in our digital transformation. Our Digital Technology Team provides top-tier products and services, and your expertise will be crucial in solving complex challenges and innovating for the future.

Partner with the best
As a Staff Data Engineer, you will leverage your expertise in data engineering to deliver high-quality data solutions. Your role will involve designing, developing, and maintaining data pipelines, adhering to best practices, and driving continuous improvement to achieve exceptional results. The Data Engineering team helps solve our customers' toughest challenges, making flights safer, power cheaper, and oil & gas production safer for people and the environment, by leveraging data and analytics. You will work with the team to create state-of-the-art data and analytics driven solutions, working across Baker Hughes to take business analytics to a new level of predictive analytics while leveraging big data tools and technologies.

As a Staff Data Engineer, you will be responsible for:
- Designing data architectures: creating and implementing scalable data architectures that meet the organization's needs.
- Developing data pipelines: building and maintaining robust data pipelines to ensure the efficient flow of data from various sources to end users.
- Ensuring data quality: implementing data validation, cleansing, and transformation processes to maintain high data quality.
- Data governance: establishing and enforcing data governance policies to ensure data integrity, security, and compliance with regulations.
- Collaborating with stakeholders: working closely with data scientists, analysts, and other stakeholders to understand their requirements and translate them into data solutions.
- Optimizing data storage: managing and optimizing data storage solutions to ensure efficient data retrieval and processing.
- Monitoring and troubleshooting: continuously monitoring data systems for performance issues and troubleshooting any problems that arise.

Fuel your passion
To be successful in this role you will require:
- A Bachelor's degree in computer science or a STEM major (science, technology, engineering, or math).
- A minimum of 8 years of professional experience in software engineering or data engineering, with Python exposure.
- Technical proficiency: a strong understanding of data engineering tools and methodologies.
- Analytical skills: excellent analytical and problem-solving abilities.
- Attention to detail: meticulous attention to detail to ensure data quality and consistency.
- Communication: strong communication skills to collaborate effectively with development teams and stakeholders.
- Automation: proficiency in developing and executing automated data pipelines and scripts.
- Documentation: the ability to create and maintain comprehensive data engineering documentation.

Desired characteristics:
- Expertise in database management systems, programming languages (such as Python, Java, or Scala), and data processing frameworks.
- Ability to tackle complex data challenges and develop innovative solutions for efficient data processing and storage.
- Experience in designing and implementing scalable data architectures that meet organizational needs.
- Knowledge of data governance models, data quality standards, and best practices to ensure data integrity and compliance.
- Strong communication skills to work effectively with data scientists, analysts, and other stakeholders to understand requirements and translate them into data solutions.
- Ability to analyze large datasets and extract meaningful insights to support business decisions.
- Willingness to stay current with the latest technologies and trends in data engineering to continuously improve systems and processes.

Work in a way that works for you
We recognize that everyone is different and that the way in which people want to work and deliver at their best is different for everyone too. In this role, we can offer the following flexible working patterns (where applicable):
- Working flexible hours - flexing the times when you work in the day to help you fit everything in and work when you are the most productive.

Working with us
Our people are at the heart of what we do. We know we are better when all of our people are developed, engaged, and able to bring their completely authentic selves to work. We invest in the health and well-being of our workforce, train and reward talent, and develop leaders at all levels to bring out the best in each other.

Working for you
Our inventions have revolutionized energy for over a century. But to keep going forward tomorrow, we know we must push the boundaries today. We prioritize rewarding those who embrace challenges with a package that reflects how much we value their input. Join us, and you can expect:
- Contemporary work-life balance policies and wellbeing activities.
- Comprehensive private medical care options.
- The safety net of life insurance and disability programs.
- Additional elected or voluntary benefits.

About Us:
We are an energy technology company that provides solutions to energy and industrial customers worldwide. Built on a century of experience and conducting business in over 120 countries, our innovative technologies and services are taking energy forward - making it safer, cleaner, and more efficient for people and the planet.

Join Us:
Are you seeking an opportunity to make a real difference in a company that values innovation and progress? Join us and become part of a team of people who will challenge and inspire you! Let's come together and take energy forward.

Baker Hughes Company is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law.
Posted 2 weeks ago
12.0 - 16.0 years
0 Lacs
hyderabad, telangana
On-site
About Inspire Brands:
Inspire Brands is looking for a Data Architect - a self-driven senior technical professional who will collaborate with our business, product, and development teams to help qualify, identify, and build Data & Analytics solutions. As part of the Enterprise Data Team, this position is responsible for actively defining robust, efficient, scalable, and innovative solutions that are custom fit to our needs.

Job Summary:
- Play a key role working with product and business teams to understand their strategic, business, and technical needs.
- Define technical architecture roadmaps that help realize business value in both tactical and strategic ways.
- Define and continuously update architecture principles in line with business and technology needs.
- Design solutions involving data architectures based on established principles; architectures include but are not limited to data lakes, data fabric, dimensional marts, MDM, etc.
- Create data flow diagrams and details to establish data lineage and help educate team members in a clear and articulate manner.
- Partner with technical leads to identify solution components and the latest data-related technologies that best meet business requirements.
- Identify gaps in the environment, processes, and skills, with the intent to address and educate them and bring teams together to drive results.
- Become a thought leader by educating and sharing best practices and architecture patterns, building deep relationships with technical leaders, and contributing to publications, white papers, conferences, etc.
- Provide support to enterprise data infrastructure management and analysts to ensure the development of efficient data systems utilizing established standards, procedures, and methodologies.
- Create cloud data models and database architectures using database deployment and monitoring automation code, ensuring the long-term technical viability of cloud master data management integrated repositories.
- Suggest ideas to improve system performance and reduce costs.
- Communicate data architecture to stakeholders and collaborate and coordinate with other technical staff.
- Ensure adherence to enterprise data movement, quality, and accountability standards in technology.
- Conduct data-driven analyses of the usage of detailed data elements across the business domain to provide optimal data provisioning patterns across the application space.
- Possess excellent quantitative and analytic skills, with the ability to influence strategic direction and develop tactical plans.
- Improve and streamline processes regarding data flow and data quality to improve data accuracy, viability, and value.

Education Requirements:
- Minimum 4-year/Bachelor's degree; BS in Computer Science preferred.

Experience Qualification:
- Minimum 12+ years of experience.

Must have:
- Strength in the areas worked on, especially the Data Engineering (DE) and Data Governance (DG) space.
- Strong grounding in modern warehouses and experience in a similar environment.
- A strong candidate to support PODS based in HSC, though support will be needed in handling and initiating projects.
- Experience with a Product-Oriented Delivery Structure (Agile PODS).
- Excellent communication skills.

Required Knowledge, Skills, or Abilities:
- Strong experience with Azure services, including but not limited to ADLS, ADF, Event Hubs, and Functions.
- Extensive experience with Databricks, Snowflake, Apache Airflow, etc.
- Exceptional interpersonal skills, with the ability to communicate verbally, in writing, and through presentations at all levels and across multiple functions, driving participation and collaboration.
- Demonstrated ability to work with key executives and stakeholders to solve problems and drive business outcomes.
- Deep understanding of processes related to the data solution SDLC, with a clear ability to improve processes where necessary.
- Strong analytical and quantitative skills.
- Enjoys seeing the impact of work in the organization; highly motivated and comfortable with ambiguity.
- A maverick approach to work, with a passion for technology and ways-of-working trends.
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
The primary purpose of this role is to lead and manage engineering teams: providing technical guidance, mentorship, and support to ensure the delivery of high-quality software solutions. As a leader, you will drive technical excellence, foster a culture of innovation, and collaborate with cross-functional teams to align technical decisions with business objectives.

Your key responsibilities will include leading engineering teams effectively and fostering a collaborative, high-performance culture to achieve project goals and organizational objectives. You will oversee timelines, team allocation, risk management, and task prioritization to ensure the successful delivery of solutions within scope, time, and budget. You will also mentor and support team members' professional growth, conduct performance reviews, provide actionable feedback, and identify opportunities for improvement. Furthermore, you will evaluate and enhance engineering processes, tools, and methodologies to increase efficiency, streamline workflows, and optimize team productivity. Collaboration with business partners, product managers, designers, and other stakeholders is crucial to translate business requirements into technical solutions and ensure a cohesive approach to product development. Enforcing technology standards, facilitating peer reviews, and implementing robust testing practices are essential to delivering high-quality solutions.

As part of the role's Director expectations, you will provide expert advice to senior functional management and committees, influence decisions made outside your function, and offer significant input to function-wide strategic initiatives. You will manage, coordinate, and enable resourcing, budgeting, and policy creation for a significant sub-function. It is critical to escalate breaches of policies or procedures appropriately and to foster compliance, ensuring relevant regulations are observed and that processes are in place to facilitate adherence. You are also expected to focus on the external environment, regulators, and advocacy groups, monitoring and influencing on behalf of the organization when necessary. Demonstrating extensive knowledge of how the function integrates with the business division to achieve overall business objectives is essential, as is comprehensive knowledge of industry theories and practices within your discipline, up-to-date sector-specific knowledge, and insight into external market developments and initiatives. You will use interpretative thinking and advanced analytical skills to solve problems and design solutions in complex or sensitive situations.

The second purpose of the role is to build and maintain systems that collect, store, process, and analyze data, ensuring accuracy, accessibility, and security. Your responsibilities will include constructing and maintaining data architecture pipelines that transfer and process durable, complete, and consistent data; designing and implementing data warehouses and data lakes that manage appropriate data volumes and velocity and adhere to required security measures; developing processing and analysis algorithms suited to the intended data complexity and volumes; and collaborating with data scientists to build and deploy machine learning models.

Finally, as a leader in this role, you are expected to exhibit a clear set of leadership behaviors that create an environment for colleagues to thrive and deliver to a consistently excellent standard. The LEAD behaviors - Listen and be authentic, Energize and inspire, Align across the enterprise, and Develop others - are fundamental to guiding your team toward success. In addition, demonstrating the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, alongside the Barclays Mindset of Empower, Challenge, and Drive, will be essential in upholding the organization's moral compass and behavioral expectations.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a Data Engineer III at JPMorgan Chase within Consumer and Community Banking, you will be an integral part of a dynamic team that designs and delivers reliable data solutions. Your role will involve developing, testing, and maintaining essential data pipelines and architectures to support the firm's business objectives, working in an agile environment across diverse technical areas and business functions.

Your responsibilities will include supporting the review of controls to ensure the protection of enterprise data, making custom configuration changes in tools to meet business or customer requests, updating data models based on new use cases, and utilizing SQL and NoSQL databases for data analysis. You will also contribute to fostering a team culture centered on diversity, opportunity, inclusion, and respect.

To excel in this role, you are required to have formal training or certification in software engineering concepts along with a minimum of 3 years of applied experience. You should possess expertise in SQL, including joins and aggregations, a working understanding of NoSQL databases, proficiency in statistical data analysis with the ability to select appropriate tools, and experience customizing changes in tools to generate products.
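Since the posting singles out SQL joins and aggregations, here is a small, self-contained illustration using Python's built-in sqlite3 module; the schema and data are invented for the example:

```python
# Illustrative only: a LEFT JOIN plus GROUP BY aggregation, runnable
# anywhere with Python's standard library. Schema and rows are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, owner TEXT);
    CREATE TABLE transactions (account_id INTEGER, amount REAL);
    INSERT INTO accounts VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO transactions VALUES (1, 120.0), (1, -30.0), (2, 55.0);
""")

# Join the two tables, then aggregate per owner.
rows = conn.execute("""
    SELECT a.owner, COUNT(t.amount) AS n_txns, SUM(t.amount) AS balance
    FROM accounts a
    LEFT JOIN transactions t ON t.account_id = a.id
    GROUP BY a.owner
    ORDER BY balance DESC
""").fetchall()

for owner, n_txns, balance in rows:
    print(f"{owner}: {n_txns} txns, balance {balance}")
```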
Posted 3 weeks ago
7.0 - 9.0 years
0 Lacs
india
On-site
DESCRIPTION
Come build the next generation of technology and driver experiences for Amazon's Last Mile delivery business. We build the services and apps that drivers use to deliver most Amazon customer packages, reaching millions of customers and packages. The Driver Experience team is responsible for how drivers know where to go, what packages to deliver, where and how to pick up packages, and how to communicate with customers. This TPM role not only helps define the driver experience, it also defines the technology platform used to build Amazon's delivery app.

We're seeking a Senior Technical Program Manager to manage large, cross-functional programs and projects that support Last Mile delivery. In this role you will be responsible for delivering complex projects end-to-end, keeping Amazon's scale and quality in mind. You will help deliver mobile apps on multiple platforms, and the backend services needed to provide delivery and task guidance to on-road drivers. Amazon Last Mile is a large, service-oriented architecture with thousands of components that work together to provide a single app experience. As a Senior TPM, you will drive the scalable architecture of our next-generation platform and design for technical operational excellence and package-delivery cost efficiency. You will also optimize for driver productivity and quality. You will anticipate project bottlenecks, provide escalation management, anticipate and make tradeoffs, and balance business needs against technical constraints. There are very few opportunities to design a system that operates at Amazon scale and throughput, so this is a highly visible position that will have a material impact on Amazon transportation for decades to come.

About the team
Amazon's Last Mile Driver Experience team is directly responsible for the driver experience. We work with technologies ranging from mobile devices, to in-dash screens, to backend supporting services, to hardware devices in warehouse pickup points, enabling deliveries worldwide and at scale, with region-specific support. We think not only about mobile apps; we also think about driver safety, how drivers get paid, how delivery partners support their drivers, and more. We regularly partner with Amazon business teams to deliver end-to-end business programs that surface to drivers through the driver experience screens and systems. We also define and deliver the technical platform for building driver experiences, ranging from data architectures to mobile frameworks.
BASIC QUALIFICATIONS
- 7+ years of experience working directly with engineering teams
- 5+ years of technical product or program management experience
- 5+ years of technical program management experience working directly with software engineering teams
- Experience managing programs across cross-functional teams, building processes and coordinating release schedules
- Bachelor's degree in engineering, computer science, or equivalent
- Excellent oral and written communication skills, as well as the ability to think clearly, analyze quantitatively, problem-solve, scope technical requirements, and prioritize tasks

PREFERRED QUALIFICATIONS
- 5+ years of experience with project management disciplines, including scope, schedule, budget, and quality, along with risk and critical path management
- Experience managing projects across cross-functional teams, building sustainable processes, and coordinating release schedules
- Experience defining KPIs/SLAs used to drive multi-million-dollar businesses and reporting to senior leadership
- Experience leading highly technical programs involving infrastructure, complex systems, and systems engineering

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Posted 3 weeks ago
10.0 - 12.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About the Company
Brillio Technologies is a forward-thinking company dedicated to leveraging data to drive innovation and deliver impactful solutions. Our mission is to empower businesses through data-centric strategies, fostering a culture of collaboration and excellence.

About the Role
We're looking for a Senior Product & Delivery Manager with a strong background in data-driven products, agile development, and end-to-end delivery management. This is a high-impact role driving product innovation and execution across complex technology and business landscapes. If you're passionate about building data-centric solutions and love working across cross-functional teams, this is your opportunity.

Responsibilities
- Lead the full project and delivery lifecycle: requirements, scope, planning, budgeting, resourcing, and stakeholder engagement
- Drive product management activities including backlog grooming, use-case capture, and roadmap definition
- Apply Design Thinking, user journey mapping, and value stream analysis to shape new features
- Collaborate closely with tech and business teams to align development with strategy
- Champion Agile/Scrum methodologies for effective MVP and product rollout
- Define and track KPIs and customer value metrics for data-driven prioritization
- Guide execution of data initiatives: lineage, metadata management, data catalogs, data governance, and more
- Bridge business needs and technical solutions with clarity and confidence

Qualifications
- 10+ years in product and delivery management in tech-centric environments
- 6+ years in hands-on product management and data/business analysis within Agile settings
- Deep experience with data architectures and governance tools (lineage, metadata, catalogs, taxonomies)
- Strong technical acumen plus proven program management skills
- Confident communicator, equally comfortable with dev teams and senior stakeholders
- Ability to drive UX and data visualization solutions that empower users
- Creative, analytical, and proactive mindset
- Domain knowledge in Banking or Capital Markets
- Familiarity with regulations such as BCBS 239, IFRS, and CCAR
- Experience in knowledge graph development or information management

Required Skills
- Strong background in data-driven products
- Agile development experience
- End-to-end delivery management

Preferred Skills
- Experience with data architectures and governance tools
- Domain knowledge in Banking or Capital Markets
- Familiarity with regulations such as BCBS 239, IFRS, and CCAR

Full-time | Senior Level

Equal Opportunity Statement
Brillio Technologies is committed to diversity and inclusivity in the workplace. We encourage applications from individuals of all backgrounds and experiences.
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As a Product Manager at Google, you will play a crucial role in shaping products that impact millions of users worldwide. Your responsibilities will involve guiding products from ideation to launch by bridging the gap between technical aspects and business objectives, and your ability to break down complex problems into actionable steps will be key in driving product development and innovation.

You will collaborate cross-functionally with teams including engineering, UX/UI, sales, and finance to ensure the successful development and launch of data platforms that drive AI/ML innovation. Your role will involve defining and delivering product roadmaps, leading the discovery and development of new features, and translating business requirements into technical specifications. Your expertise in software development and experience working with cloud-based data platforms will be valuable assets, and your understanding of AI/ML concepts such as machine learning algorithms, data architectures, and data pipelines will enable you to contribute effectively to the development of cutting-edge technologies.

As part of the Product Management team at Google, you will conduct market research, identify trends, and collaborate with stakeholders to tailor solutions to user needs. The ability to communicate effectively with engineers, data scientists, executives, and customers will be essential in driving product success and ensuring alignment with business goals.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
coimbatore, tamil nadu
On-site
About Mindcurv (Part of Accenture Song)
Mindcurv, a part of Accenture Song, helps customers reimagine their digital business, experiences, and technology to thrive in the new digital landscape. By crafting sustainable and accountable solutions for people living in a digital world, Mindcurv addresses the market's imperative to digitalize business processes and improve customer experiences, capitalizing on cloud technologies through the adoption of DevOps and agile methodologies.

Your Role
As a Senior Data Engineer at Mindcurv, you will play a key role within our expanding Data and Analytics team. We are looking for a candidate with deep proficiency in Databricks and cloud data warehousing and a proven history of building scalable data pipelines, optimizing data architectures, and enabling robust analytics capabilities. You will collaborate with diverse teams to ensure the organization harnesses data as a strategic asset. Your responsibilities will include:
- Designing, building, and managing scalable data pipelines and ETL processes using Databricks and other contemporary tools.
- Architecting, implementing, and operating cloud-based data warehousing solutions on Databricks, following the Lakehouse Architecture model.
- Creating and maintaining optimized data lake architectures to support advanced analytics and machine learning applications.
- Engaging with stakeholders to gather requirements, design solutions, and ensure delivery of high-quality data.
- Tuning data pipelines for performance and cost efficiency.
- Enforcing best practices for data governance, access control, security, and compliance in the cloud.
- Monitoring and troubleshooting data pipelines to ensure reliability and accuracy.
- Mentoring junior engineers to cultivate a culture of continuous learning and innovation.
- Communicating clearly and collaborating effectively with clients, who are primarily based in Western Europe.

Who You Are
To excel in this role, you should have:
- A Bachelor's or Master's degree in Computer Science or a related field.
- At least 5 years of experience in data engineering roles focused on cloud platforms.
- Proficiency in Databricks, including Spark-based ETL, Delta Lake, and SQL.
- Solid experience with a major cloud platform (preferably AWS).
- Hands-on experience with Delta Lake, Unity Catalog, and Lakehouse architecture concepts.
- Strong programming skills in Python and SQL; PySpark experience is a plus.
- A sound understanding of data modeling principles, including star schema and dimensional modeling.
- Familiarity with CI/CD practices, version control systems such as Git, and data governance and security standards such as GDPR and CCPA compliance.

Nice-to-Have Qualifications
- Experience with Airflow or similar workflow orchestration tools.
- Exposure to machine learning workflows and MLOps.
- Databricks and AWS certifications.
- Familiarity with data visualization tools such as Power BI.

What Do We Offer You
At Mindcurv, we provide perks such as refreshments, team events, and attractive compensation packages, along with intellectually stimulating projects involving cutting-edge technologies, an agile and entrepreneurial atmosphere free of office politics, work-life balance, a transparent culture, and a management team that values feedback.

Our High Performers
People who excel at Mindcurv are self-starters, team players, and continuous learners who thrive in ambiguous situations. We give our employees the resources they need to succeed, encourage exploration within their domain, and offer continuous growth opportunities to enrich their careers.

Ready for Change
If you are ready for the next phase in your career - a role that allows you to be authentic and bring out the best in yourself, your colleagues, and clients - apply now and join us at Mindcurv on a journey of professional growth and fulfillment.
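To make the Lakehouse responsibilities above concrete, here is a minimal PySpark sketch of a Delta Lake upsert step of the kind such a pipeline might run. The table paths, the order_id key, and the bronze/silver layer names are illustrative assumptions, not part of the listing.

```python
# Hypothetical bronze-to-silver upsert on Databricks; paths and columns
# are placeholders.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("orders-upsert").getOrCreate()

# Incremental batch of source records landed in the bronze layer.
updates = spark.read.format("delta").load("/mnt/bronze/orders_increment")

# Merge into the curated silver table: update matching keys, insert new ones.
silver = DeltaTable.forPath(spark, "/mnt/silver/orders")
(
    silver.alias("t")
    .merge(updates.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

On Databricks the Delta extensions are preconfigured; outside Databricks the session would also need the delta-spark configuration.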
Posted 1 month ago
3.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
This is a data engineer position in which you will design, develop, implement, and maintain data flow channels and data processing systems that support the collection, storage, batch and real-time processing, and analysis of information in a scalable, repeatable, and secure manner, in coordination with the Data & Analytics team. Your main objective will be to define optimal solutions for data collection, processing, and warehousing, particularly within the banking and finance domain. Expertise in Spark Java development for big data processing, Python, and Apache Spark is required. You will design, code, and test data systems and integrate them into the internal infrastructure.

Your responsibilities will include:
- Ensuring high-quality software development with complete documentation.
- Developing and optimizing scalable Spark Java-based data pipelines.
- Designing and implementing distributed computing solutions for risk modeling, pricing, and regulatory compliance.
- Ensuring efficient data storage and retrieval using big data technologies.
- Implementing best practices for Spark performance tuning.
- Maintaining high code quality through testing, CI/CD pipelines, and version control.
- Working on batch processing frameworks for market risk analytics.
- Promoting unit/functional testing and code inspection processes.
- Collaborating with business stakeholders, business analysts, and data scientists to understand and interpret complex datasets.

Qualifications:
- 5-8 years of experience working in data ecosystems.
- 4-5 years of hands-on experience with Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix scripting, and other big data frameworks.
- 3+ years of experience with relational SQL and NoSQL databases such as Oracle, MongoDB, and HBase.
- Strong proficiency in Python and Spark Java, with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.), Scala, and SQL.
- Data integration, migration, and large-scale ETL experience.
- Data modeling experience.
- Experience building and optimizing big data pipelines, architectures, and datasets.
- Strong analytic skills and experience working with unstructured datasets.
- Experience with technologies such as Confluent Kafka, Red Hat jBPM, CI/CD build pipelines, Git, Bitbucket, Jira, external cloud platforms, container technologies, and supporting frameworks.
- Highly effective interpersonal and communication skills.
- Experience with the software development life cycle.

Education:
- Bachelor's/University degree or equivalent experience in computer science, engineering, or a similar domain.

This is a full-time position in the Data Architecture job family group within the Technology sector.
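As an illustration of the batch market-risk analytics this role describes, the following PySpark sketch aggregates hypothetical scenario-level P&L records into a simple per-desk historical-simulation VaR figure. The schema, paths, and 99% confidence level are assumptions for illustration; the listing itself emphasizes Spark Java, and the same logic maps directly onto the Java DataFrame API.

```python
# Hedged sketch: per-desk 99% historical-simulation VaR from scenario P&L rows.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("risk-batch").getOrCreate()

# Scenario-level P&L written by an upstream pricing job (hypothetical layout).
pnl = spark.read.parquet("/data/risk/pnl_vectors/cob_date=2024-01-31")

# Roll position P&L up to (desk, scenario), then take the 1st percentile of
# scenario P&L per desk as an approximate 99% VaR.
desk_scenario = pnl.groupBy("desk", "scenario_id").agg(F.sum("pnl").alias("pnl"))
desk_var = desk_scenario.groupBy("desk").agg(
    F.expr("percentile_approx(pnl, 0.01)").alias("var_99")
)

desk_var.write.mode("overwrite").parquet("/data/risk/desk_var/cob_date=2024-01-31")
```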
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Data Engineer at GlobalLogic, you will architect, build, and maintain complex ETL/ELT pipelines for batch and real-time data processing using a variety of tools and programming languages. Your key duties will include optimizing existing data pipelines for performance, cost-effectiveness, and reliability, and implementing data quality checks, monitoring, and alerting mechanisms to ensure data integrity. You will also play a crucial role in ensuring data security, privacy, and compliance with relevant regulations such as GDPR and local data laws.

To excel in this role, you should hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field. Excellent analytical, problem-solving, and critical-thinking skills with meticulous attention to detail are essential, as are strong written and verbal communication skills and the ability to collaborate effectively with cross-functional teams. Experience with Agile/Scrum development methodologies is a plus.

You will provide technical leadership by designing and implementing robust, scalable, and efficient data architectures that align with organizational strategy and future growth. You will define and enforce data engineering best practices, evaluate and recommend new technologies, and oversee the end-to-end data development lifecycle. As a leader, you will mentor and guide a team of data engineers, conduct code reviews, provide feedback, and promote a culture of engineering excellence. You will work closely with data scientists, data analysts, software engineers, and business stakeholders to understand data requirements and translate them into technical solutions, communicating complex technical concepts and data strategies effectively to both technical and non-technical audiences.

At GlobalLogic, we offer a culture of caring, continuous learning and development opportunities, interesting and meaningful work, balance and flexibility, and a high-trust environment. By joining our team, you will work on impactful projects, engage your curiosity and problem-solving skills, and help shape cutting-edge solutions that redefine industries. With a commitment to integrity and trust, GlobalLogic provides a safe, reliable, and ethical global environment where you can thrive both personally and professionally.
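Since the role highlights data quality checks, monitoring, and alerting, here is a hedged sketch of what such a check might look like in PySpark. The table, columns, thresholds, and failure handling are all assumptions for illustration.

```python
# Hypothetical data-quality gate run at the end of a pipeline stage.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("/data/curated/customers")  # placeholder path

total = df.count()
null_emails = df.filter(F.col("email").isNull()).count()
dupe_keys = total - df.select("customer_id").distinct().count()

failures = []
if total == 0:
    failures.append("table is empty")
elif null_emails / total > 0.02:  # assumed tolerance: at most 2% null emails
    failures.append(f"{null_emails} null emails")
if dupe_keys > 0:
    failures.append(f"{dupe_keys} duplicate customer_id values")

if failures:
    # A production pipeline would page an alerting channel here; for the
    # sketch, failing the job loudly is enough.
    raise ValueError("Data quality check failed: " + "; ".join(failures))
```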
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, you will be part of a team of innovative professionals working with cutting-edge technologies. Our purpose is anchored in bringing real positive changes in an increasingly virtual world, transcending generational gaps and future disruptions.

We are currently seeking SQL professionals for the role of Data Engineer with 4-6 years of experience; a strong academic background is a must. As a Data Engineer at BNY Mellon in Pune, you will design, develop, and maintain scalable data pipelines and ETL processes using Apache Spark and SQL. You will collaborate with data scientists and analysts to understand data requirements, optimize and query large datasets, ensure data quality and integrity, implement data governance and security best practices, participate in code reviews, and troubleshoot data-related issues promptly.

Qualifications for this role include:
- 4-6 years of experience in data engineering.
- Proficiency in SQL and data processing frameworks such as Apache Spark.
- Knowledge of database technologies such as SQL Server or Oracle.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with data warehousing solutions.
- Understanding of Python, Scala, or Java for data manipulation.
- Excellent analytical and problem-solving skills.
- Good communication skills and the ability to work effectively in a team environment.

Joining YASH means being empowered to shape your career in an inclusive team environment. We offer career-oriented skilling models and promote continuous learning, unlearning, and relearning at a rapid pace. Our workplace is built on four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.
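For a flavor of the Spark-plus-SQL pipeline work described above, here is an illustrative sketch of querying a large partitioned dataset with Spark SQL; the table layout and column names are hypothetical. Filtering on the partition column lets Spark prune partitions rather than scan the full table, which is the kind of query optimization the role calls for.

```python
# Hypothetical reporting query over a date-partitioned trades dataset.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("trade-report").getOrCreate()

spark.read.parquet("/warehouse/trades").createOrReplaceTempView("trades")

report = spark.sql("""
    SELECT book,
           COUNT(*)      AS n_trades,
           SUM(notional) AS total_notional
    FROM trades
    WHERE trade_date = '2024-01-31'   -- partition filter enables pruning
    GROUP BY book
    ORDER BY total_notional DESC
""")
report.show(20, truncate=False)
```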
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
You are an experienced Azure Databricks Engineer who will design, develop, and maintain scalable data pipelines and support data infrastructure in an Azure cloud environment. Your key responsibilities will include:
- Designing ETL pipelines using Azure Databricks.
- Building robust data architectures on Azure.
- Collaborating with stakeholders to define data requirements.
- Optimizing data pipelines for performance and reliability.
- Implementing data transformation and cleansing processes.
- Managing Databricks clusters and leveraging Azure services for data orchestration and storage.

You must have 5-10 years of experience in data engineering or a related field, with extensive hands-on experience in Azure Databricks and Apache Spark. Strong knowledge of Azure cloud services such as Azure Data Lake, Data Factory, Azure SQL, and Azure Synapse Analytics is required, as is experience with Python, Scala, or SQL for data manipulation, ETL frameworks, Delta Lake and Parquet formats, Azure DevOps, CI/CD pipelines, big data architecture, and distributed systems. Knowledge of data modeling, performance tuning, and optimization of big data solutions is expected, along with strong problem-solving skills and the ability to work in a collaborative environment.

Preferred qualifications include experience with real-time data streaming tools, Azure certifications, machine learning frameworks and their integration with Databricks, and data visualization tools such as Power BI. A bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field is required for this role.
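As a sketch of the Azure-specific pipeline step this listing centers on, the following assumes a Databricks notebook (where `spark` is predefined) reading raw JSON events from ADLS Gen2 and writing a cleansed Delta table. The storage account, containers, and columns are placeholders.

```python
# Hypothetical raw-to-curated step on Azure Databricks.
from pyspark.sql import functions as F

raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/events/"
events = spark.read.json(raw_path)

# Cleanse: drop duplicate events, parse timestamps, derive a partition date.
cleansed = (
    events.dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .withColumn("event_date", F.to_date("event_ts"))
    .filter(F.col("event_ts").isNotNull())
)

(
    cleansed.write.format("delta")
    .mode("append")
    .partitionBy("event_date")
    .save("abfss://curated@examplestorage.dfs.core.windows.net/events_delta/")
)
```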
Posted 1 month ago