8.0 - 12.0 years
0 Lacs
kolkata, west bengal
On-site
The role requires you to design and implement data modeling solutions using relational, dimensional, and NoSQL databases. You will closely collaborate with data architects to create customized databases utilizing a blend of conceptual, physical, and logical data models. As a data modeler, your responsibilities include designing, implementing, and documenting data architecture and modeling solutions across various database types to support enterprise information management, business intelligence, machine learning, and data science initiatives.

Your key responsibilities will involve implementing business and IT data requirements by devising new data strategies and designs for different data platforms and tools. You will engage with business and application teams to execute data strategies, establish data flows, and develop conceptual, logical, and physical data models. Moreover, you will define and enforce data modeling standards, tools, and best practices, while also identifying architecture, infrastructure, data interfaces, security considerations, analytic models, and data visualization aspects. Additionally, you will undertake hands-on tasks such as modeling, design, configuration, installation, performance tuning, and sandbox proof of concept. It is crucial to work proactively and independently to fulfill project requirements, communicate challenges effectively, and mitigate project delivery risks.

Qualifications required for this role include a BE/B.Tech degree or equivalent and a minimum of 8 years of hands-on experience in relational, dimensional, and/or analytic domains, involving RDBMS, dimensional, NoSQL platforms, and data ingestion protocols. Proficiency in data warehouse, data lake, and big data platforms within multi-data-center environments is essential. Familiarity with metadata management, data modeling tools (e.g., Erwin, ER Studio), and team management skills are also necessary.

Your primary skills should encompass developing conceptual, logical, and physical data models, implementing RDBMS, ODS, data marts, and data lakes, and ensuring optimal data query performance. You will be responsible for expanding existing data architecture while employing best practices. The ability to work both independently and collaboratively is vital for this role. Preferred skills for this position include experience with data modeling tools and methodologies, as well as strong analytical and problem-solving abilities.
Posted 1 day ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Data Engineer, you will be responsible for developing data inbound and outbound patterns using on-premises Oracle technology. Your key tasks will include ensuring data quality and integrity throughout the data lifecycle, proficiently using SQL for data extraction, transformation, and loading (ETL), monitoring and analyzing data workflows for bottlenecks and failures, optimizing data processing workflows for performance and efficiency, and integrating data into the Data Lake following architectural standards. Moreover, you will be expected to document and standardize data processes and workflows.

In addition to the mandatory skills, experience in implementing architectural best practices for data patterns and process design, providing technical support and training to internal stakeholders on data processing, and familiarity with process automation or tooling to enhance data workflows will be advantageous.

The ideal candidate for this position should have 6-9 years of experience in the field. If you are passionate about data engineering and possess the required skills and experience, we encourage you to apply for this opportunity.
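As a hedged illustration of the Oracle-based, SQL-driven ETL this posting describes, the sketch below runs a set-based upsert from a staging table using the python-oracledb driver. The connection values and the stg_orders/dim_orders tables are invented placeholders, not details from the posting.

```python
# Minimal sketch, assuming python-oracledb and hypothetical staging/target tables.
import oracledb

def load_orders(user: str, password: str, dsn: str) -> int:
    """Upsert staged rows into the target table; return rows affected."""
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            cur.execute("""
                MERGE INTO dim_orders d
                USING stg_orders s
                ON (d.order_id = s.order_id)
                WHEN MATCHED THEN
                    UPDATE SET d.status = s.status
                WHEN NOT MATCHED THEN
                    INSERT (order_id, status) VALUES (s.order_id, s.status)
            """)
            affected = cur.rowcount  # a simple hook for workflow monitoring
        conn.commit()
        return affected
```

Logging the affected row count (and run time) per execution is one lightweight way to spot the bottlenecks and failures the role asks candidates to monitor.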
Posted 1 day ago
12.0 - 16.0 years
0 Lacs
hyderabad, telangana
On-site
You are an experienced Data Architect with over 12 years of expertise in data architecture, data engineering, and enterprise-scale data solutions. Your strong background in Microsoft Fabric Data Engineering, Azure Synapse, Power BI, and Data Lake will be instrumental in driving strategic data initiatives for our organization in Hyderabad, India.

In this role, you will design and implement scalable, secure, and high-performance data architecture solutions utilizing Microsoft Fabric and related Azure services. Your responsibilities will include defining data strategies aligned with business goals, architecting data pipelines and warehouses, collaborating with stakeholders to define data requirements, and providing technical leadership in data engineering best practices.

Your qualifications include 12+ years of experience in data engineering or related roles, proven expertise in Microsoft Fabric and Azure Data Services, hands-on experience in modern data platform design, proficiency in SQL, Python, Spark, and Power BI, as well as strong problem-solving and communication skills. Preferred qualifications include Microsoft certifications, experience with DevOps and CI/CD for data projects, exposure to real-time streaming and IoT data, and prior Agile/Scrum environment experience.

If you are passionate about driving innovation in data architecture, optimizing data performance, and leading data initiatives that align with business objectives, we encourage you to apply for this Full-Time Data Architect position in Hyderabad, India.
Posted 1 day ago
2.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
As a Principal Data Engineer at Brillio, you will play a key role in leveraging your expertise in Data Modeling, particularly with tools like ER Studio and ERwin. Brillio, known for its digital technology services and partnerships with Fortune 1000 companies, is committed to transforming disruptions into competitive advantages through innovative digital solutions.

With over 10 years of IT experience and at least 2 years of hands-on experience in Snowflake, you will be responsible for building and maintaining data models that support the organization's Data Lake/ODS, ETL processes, and data warehousing needs. Your ability to collaborate closely with clients to deliver physical and logical model solutions will be critical to the success of various projects.

In this role, you will demonstrate your expertise in Data Modeling concepts at an advanced level, with a focus on modeling in large volume-based environments. Your experience with tools like ER Studio and your overall understanding of database technologies, data warehouses, and analytics will be essential in designing and implementing effective data models. Additionally, your strong skills in Entity Relationship Modeling, knowledge of database design and administration, and proficiency in SQL query development will enable you to contribute to the design and optimization of data structures, including Star Schema design.

Your leadership abilities and excellent communication skills will be instrumental in leading teams and ensuring the successful implementation of data modeling solutions. While experience with AWS ecosystems is a plus, your dedication to staying at the forefront of technological advancements and your passion for delivering exceptional client experiences will make you a valuable addition to Brillio's team of "Brillians." Join us in our mission to create innovative digital solutions and make a difference in the world of technology.
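Since this posting highlights Star Schema design on Snowflake, here is a minimal, hedged sketch of the simplest such model, created through the Snowflake Python connector. All table, column, and connection names are invented for illustration.

```python
import snowflake.connector

# One dimension and one fact table: the simplest star-schema shape.
DDL = [
    # Dimension: one row per customer, carrying descriptive attributes.
    """CREATE TABLE IF NOT EXISTS dim_customer (
           customer_key  INTEGER PRIMARY KEY,
           customer_name STRING,
           region        STRING
       )""",
    # Fact: one row per sale, keyed to its dimensions; measures live here.
    """CREATE TABLE IF NOT EXISTS fact_sales (
           sale_id      INTEGER,
           customer_key INTEGER REFERENCES dim_customer (customer_key),
           sale_date    DATE,
           amount       NUMBER(12, 2)
       )""",
]

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",
    warehouse="my_wh", database="my_db", schema="analytics",
)
try:
    cur = conn.cursor()
    for stmt in DDL:
        cur.execute(stmt)
finally:
    conn.close()
```

Queries then aggregate fact measures grouped by dimension attributes, which is what keeps star schemas fast and simple for reporting workloads.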
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As an Azure Data Engineer with expertise in Microsoft Fabric and modern data platform components, you will be responsible for designing, developing, and managing end-to-end data pipelines on Azure Cloud. Your primary focus will be on ensuring performance, scalability, and delivering business value through efficient data solutions. You will collaborate with various teams to define data requirements, and implement data ingestion, transformation, and modeling pipelines supporting structured and unstructured data. Additionally, you will work with Azure Synapse, Data Lake, Data Factory, Databricks, and Power BI for seamless data integration and reporting. Your role will involve optimizing data performance and cost through efficient architecture and coding practices, and ensuring data security, privacy, and compliance with organizational policies. Monitoring, troubleshooting, and improving data workflows for reliability and performance will also be part of your responsibilities.

To excel in this role, you should have 5 to 7 years of experience as a Data Engineer, with at least 2+ years working on the Azure Data Stack. Hands-on experience with Microsoft Fabric, Azure Synapse Analytics, Data Factory, Data Lake, SQL Server, and Power BI integration is crucial. Strong skills in data modeling, ETL/ELT design, and performance tuning are required, along with proficiency in SQL and Python/PySpark scripting. Experience with CI/CD pipelines and DevOps practices for data solutions, an understanding of data governance, security, and compliance frameworks, as well as excellent communication, problem-solving, and stakeholder management skills are essential for success in this role. A Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field is preferred. Microsoft Azure Data Engineer Certification (DP-203), experience in real-time streaming (e.g., Azure Stream Analytics or Event Hubs), and exposure to Power BI semantic models and Direct Lake mode in Microsoft Fabric would be advantageous.

Join us to work with the latest in Microsoft's modern data stack, Microsoft Fabric; collaborate with a team of passionate data professionals; work on enterprise-grade, large-scale data projects; experience a fast-paced, learning-focused work environment; and have immediate visibility and impact in key business decisions.
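To make the pipeline responsibilities above concrete, here is a small, hedged PySpark sketch of an ingest-cleanse-publish step that lands a Delta table in the lake. The storage paths and column names are assumptions, and a Delta-enabled Spark environment (e.g., Fabric or Databricks) is presumed.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingest raw, semi-structured drops from the landing zone (path is illustrative).
raw = spark.read.json("abfss://landing@lake.dfs.core.windows.net/orders/")

clean = (
    raw.dropDuplicates(["order_id"])                     # basic data quality: dedupe
       .withColumn("order_date", F.to_date("order_ts"))  # conform types
       .filter(F.col("amount") > 0)                      # reject invalid rows
)

# Publish as Delta so Synapse/Power BI consumers read a consistent, versioned table.
(clean.write.format("delta")
      .mode("overwrite")
      .save("abfss://curated@lake.dfs.core.windows.net/orders/"))
```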
Posted 2 days ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
Your Responsibilities
• Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL).
• Collaborate with solution teams and Data Architects to implement data strategies, build data flows, and develop logical/physical data models.
• Work with Data Architects to define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
• Engage in hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs.
• Proactively and independently address project requirements and articulate issues/challenges to reduce project delivery risks.

Your Profile
• Bachelor's degree in computer/data science or related technical experience.
• 7+ years of hands-on relational, dimensional, and/or analytic experience utilizing RDBMS, dimensional, and NoSQL data platform technologies, as well as ETL and data ingestion protocols.
• Demonstrated experience with data warehouse, Data Lake, and enterprise big data platforms in multi-data-center contexts.
• Proficiency in metadata management, data modeling, and related tools (e.g., Erwin, ER Studio).
• Preferred: experience with Azure/Azure Databricks services (Azure Data Factory, Azure Data Lake Storage, Azure Synapse & Azure Databricks); working knowledge of SAP Datasphere is a plus.
• Experience in team management, communication, and presentation.
• Understanding of agile delivery methodology and experience working in a scrum environment.
• Ability to translate business needs into data vault and dimensional data models supporting long-term solutions.
• Collaborate with the Application Development team to implement data strategies, and create logical and physical data models using best practices to ensure high data quality and reduced redundancy.
• Optimize and update logical and physical data models to support new and existing projects.
• Maintain logical and physical data models along with corresponding metadata.
• Develop best practices for standard naming conventions and coding practices to ensure data model consistency.
• Recommend opportunities for data model reuse in new environments.
• Perform reverse engineering of physical data models from databases and SQL scripts.
• Evaluate data models and physical databases for variances and discrepancies.
• Validate business data objects for accuracy and completeness.
• Analyze data-related system integration challenges and propose appropriate solutions.
• Develop data models according to company standards.
• Guide System Analysts, Engineers, Programmers, and others on project limitations and capabilities, performance requirements, and interfaces.
• Review modifications to existing data models to improve efficiency and performance.
• Examine new application designs and recommend corrections as needed.

#IncludingYou
Diversity, equity, inclusion, and belonging are cornerstones of ADM's efforts to continue innovating, driving growth, and delivering outstanding performance. ADM is committed to attracting and retaining a diverse workforce and creating welcoming, inclusive work environments that enable every ADM colleague to feel comfortable, make meaningful contributions, and grow their career. ADM values the unique backgrounds and experiences that each person brings to the organization, understanding that diversity of perspectives makes us stronger together. For more information regarding ADM's efforts to advance Diversity, Equity, Inclusion & Belonging, please visit the website: Diversity, Equity and Inclusion | ADM.
About ADM
At ADM, the power of nature is unlocked to provide access to nutrition worldwide. With industry-advancing innovations, a comprehensive portfolio of ingredients and solutions catering to diverse tastes, and a commitment to sustainability, ADM offers customers an edge in addressing nutritional challenges. As a global leader in human and animal nutrition and the premier agricultural origination and processing company worldwide, ADM's capabilities in insights, facilities, and logistical expertise are unparalleled. From ideation to solution, ADM enriches the quality of life globally. Learn more at www.adm.com.
Posted 2 days ago
2.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
As a Principal Data Engineer at Brillio, you will play a crucial role in leveraging your expertise in data modeling, particularly with tools like ER Studio and ERwin. Brillio specializes in transforming disruptive technologies into a competitive advantage for Fortune 1000 companies through innovative digital adoption, and prides itself on being a rapidly growing digital technology service provider that integrates cutting-edge digital skills with a client-centric approach. As a preferred employer, Brillio attracts top talent by offering opportunities to work on exclusive digital projects and engage with groundbreaking technologies.

To excel in this role, you should have over 10 years of IT experience, with a minimum of 2 years dedicated to Snowflake and hands-on modeling experience. Your proficiency in Data Lake/ODS, ETL concepts, and data warehousing principles will be critical in delivering effective solutions to clients. Collaboration with clients to drive both physical and logical model solutions will be a key aspect of your responsibilities.

Your technical skills should encompass advanced data modeling concepts, experience in modeling large volumes of data, and proficiency in tools like ER Studio. A solid grasp of database concepts, including data warehouses, reporting, and analytics, will be essential. Familiarity with platforms like SQLDBM and expertise in entity relationship modeling will further strengthen your profile.

Moreover, your communication skills should be excellent, enabling you to lead teams effectively and facilitate seamless collaboration with clients. While exposure to AWS ecosystems is a plus, your ability to design and administer databases, develop SQL queries for analysis, and implement data modeling for star schema designs will be instrumental in your success as a Principal Data Engineer at Brillio.
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Senior Data Engineering Architect at Iris Software, you will play a crucial role in leading enterprise-level data engineering projects on public cloud platforms like AWS, Azure, or GCP. Your responsibilities will include engaging with client managers to understand their business needs, conceptualizing solution options, and finalizing strategies with stakeholders. You will also be involved in team building, delivering Proof of Concepts (PoCs), and enhancing competencies within the organization.

Your role will focus on building competencies in Data & Analytics, including Data Engineering, Analytics, Data Science, AI/ML, and Data Governance. Staying updated with the latest tools, best practices, and trends in the Data and Analytics field will be essential to drive innovation and excellence in your work.

To excel in this position, you should hold a Bachelor's or Master's degree in a Software discipline and have extensive experience in Data architecture and implementing large-scale Data Lake/Data Warehousing solutions. Your background in Data Engineering should demonstrate leadership in solutioning, architecture, and successful project delivery. Strong communication skills in English, both written and verbal, are essential for effective collaboration with clients and team members. Proficiency in tools such as AWS Glue, Redshift, Azure Data Lake, Databricks, Snowflake, and databases, along with programming skills in Spark, Spark SQL, PySpark, and Python, are mandatory competencies for this role.

Joining Iris Software offers a range of perks and benefits designed to support your financial, health, and overall well-being. From comprehensive health insurance and competitive salaries to flexible work arrangements and continuous learning opportunities, we are dedicated to providing a supportive and rewarding work environment where your success and happiness are valued. If you are inspired to grow your career in Data Engineering and thrive in a culture that values talent and personal growth, Iris Software is the place for you. Be part of a dynamic team where you can be valued, inspired, and encouraged to be your best professional and personal self.
Posted 2 days ago
10.0 - 14.0 years
0 Lacs
delhi
On-site
As a Partner Solution Engineer at Snowflake, you will play a crucial role in technically onboarding and enabling partners to re-platform their Data and AI applications onto the Snowflake AI Data Cloud. Collaborating with partners to develop Snowflake solutions in customer engagements, you will work with them to create assets and demos, build hands-on POCs, and pitch Snowflake solutions. Additionally, you will assist Solution Providers/Practice Leads with the technical strategies that enable them to sell their offerings on Snowflake. Your responsibilities will include keeping partners up to date on key Snowflake product updates and future roadmaps to help them represent Snowflake to their clients about the latest technology solutions and benefits. Running technical enablement programs to provide best practices and solution design workshops to help partners create effective solutions will also be part of your role.

Success in this position will require you to drive strategic engagements by quickly grasping new concepts and articulating their business value. You will showcase the impact of Snowflake through compelling customer success stories and case studies, demonstrating a strong understanding of how partners make revenue through the industry priorities and complexities they face.

Preferred skill sets and experiences for this role include having a total of 10+ years of relevant experience, experience working with Tech Partners, ISVs, and System Integrators (SIs) in India, and developing data domain thought leadership within the partner community. You should also have presales or hands-on experience with Data Warehouse, Data Lake, or Lakehouse platforms, as well as experience with partner integration ecosystems like Alation, Fivetran, Informatica, dbt Cloud, etc. Hands-on experience and strong knowledge of Docker and how to containerize Python-based applications, knowledge of container networking and Kubernetes, and proficiency in Agile development practices and Continuous Integration/Continuous Deployment (CI/CD), including DataOps and MLOps, are desirable. Experience in the AI/ML domain is a plus.

Snowflake is rapidly expanding, and as part of the team, you will help enable and accelerate the company's growth. If you share Snowflake's values, challenge ordinary thinking, and push the pace of innovation while building a future for yourself and Snowflake, this role could be the perfect fit for you. Please visit the Snowflake Careers Site for salary and benefits information if the job is located in the United States.
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
You strive to be an essential member of a diverse team of visionaries dedicated to making a lasting impact. Don't pass up this opportunity to collaborate with some of the brightest minds in the field and deliver best-in-class solutions to the industry.

As a Senior Lead Data Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you are an integral part of a team that works to develop high-quality data architecture solutions for various software applications, platforms, and data products. Drive significant business impact and help shape the global target state architecture through your capabilities in multiple data architecture domains. You will represent the data architecture team at technical governance bodies and provide feedback on proposed improvements to data architecture governance practices; evaluate new and current technologies using existing data architecture standards and frameworks; and regularly provide technical guidance and direction to support the business and its technical teams, contractors, and vendors. You will design secure, high-quality, scalable solutions and review architecture solutions designed by others; drive data architecture decisions that impact data product & platform design, application functionality, and technical operations and processes; and serve as a function-wide subject matter expert in one or more areas of focus. You will actively contribute to the data engineering community as an advocate of firmwide data frameworks, tools, and practices in the Software Development Life Cycle; influence peers and project decision-makers to consider the use and application of leading-edge technologies; and advise junior architects and technologists.

Required qualifications, capabilities, and skills:
- Formal training or certification on software engineering concepts and 5+ years of applied experience.
- Advanced knowledge of architecture, applications, and technical processes with considerable in-depth knowledge in the data architecture discipline and solutions (e.g., data modeling, native cloud data services, business intelligence, artificial intelligence, machine learning, data domain driven design, etc.).
- Practical cloud-based data architecture and deployment experience, preferably AWS.
- Practical SQL development experience in cloud-native relational databases, e.g. Snowflake, Athena, Postgres.
- Ability to deliver various types of data models with multiple deployment targets, e.g. conceptual, logical, and physical data models deployed as operational vs. analytical data stores.
- Advanced skills in one or more data engineering disciplines, e.g. streaming, ELT, event processing.
- Ability to tackle design and functionality problems independently with little to no oversight.
- Ability to evaluate current and emerging technologies to select or recommend the best solutions for the future state data architecture.

Preferred qualifications, capabilities, and skills:
- Financial services experience, card and banking a big plus.
- Practical experience with modern data processing technologies, e.g., Kafka streaming, dbt, Spark, Airflow, etc.
- Practical experience with data mesh and/or data lakes.
- Practical experience in machine learning/AI with Python development a big plus.
- Practical experience with graph and semantic technologies, e.g. RDF, LPG, Neo4j, Gremlin.
- Knowledge of architecture assessment frameworks, e.g. Architecture Tradeoff Analysis.
Posted 3 days ago
2.0 - 9.0 years
0 Lacs
karnataka
On-site
We are seeking Data Architects (Senior and Principal levels) to join our team. In this role, you will be involved in a combination of hands-on contribution, customer engagement, and technical team management.

As a Data Architect, your responsibilities will include designing, architecting, deploying, and maintaining solutions on the MS Azure platform using various Cloud & Big Data Technologies. You will manage the full life-cycle of Data Lake / Big Data solutions, from requirement gathering and analysis to platform selection, architecture design, and deployment. It will be your responsibility to implement scalable solutions on the Cloud and collaborate with a team of business domain experts, data scientists, and application developers to develop Big Data solutions. Moreover, you will be expected to explore and learn new technologies for creative problem solving and mentor a team of Data Engineers.

The ideal candidate should possess strong hands-on experience in implementing Data Lakes with technologies such as Data Factory (ADF), ADLS, Databricks, Azure Synapse Analytics, Event Hub & Stream Analytics, Cosmos DB, and Purview. Additionally, experience with big data technologies like Hadoop (CDH or HDP), Spark, Airflow, NiFi, Kafka, Hive, HBase, MongoDB, Neo4j, Elasticsearch, Impala, Sqoop, etc., is required. Proficiency in programming and debugging in Python and Scala/Java is essential, with experience in building REST services considered beneficial.

Candidates should also have experience in supporting BI and Data Science teams in consuming data in a secure and governed manner, along with a good understanding of CI/CD with Git and Jenkins / Azure DevOps. Experience in setting up cloud-computing infrastructure solutions, hands-on experience/exposure to NoSQL databases, and data modelling in Hive are all highly valued. Applicants should have a minimum of 9 years of technical experience, with at least 5 years on MS Azure and 2 years on Hadoop (CDH/HDP).
Posted 3 days ago
8.0 - 13.0 years
20 - 35 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Hybrid
Data Warehouse Database Architect - Immediate hiring.
We are currently looking for a Data Warehouse Database Architect for our client, who are into Fintech solutions. Please let us know your interest and availability.
Experience: 10+ years
Locations: Hybrid - any Accion office in India preferred (Bangalore/Pune/Mumbai)
Notice Period: Immediate; 0-15 days joiners are preferred
Required skills - Tools & Technologies:
Cloud Platform: Azure (Databricks, DevOps, Data Factory, Azure Synapse Analytics, Azure SQL, Blob Storage, Databricks Delta Lake)
Languages: Python/PL/SQL/SQL/C/C++/Java
Databases: Snowflake/MS SQL Server/Oracle
Design Tools: Erwin & MS Visio
Data warehouse tools: SSIS, SSRS, SSAS, Power BI, DBT, Talend Stitch, PowerApps, Informatica 9, Cognos 8, OBIEE. Any cloud experience is good to have.
Let's connect for more details. Please write to me at mary.priscilina@accionlabs.com along with your CV and the best contact details to get connected for a quick discussion.
Regards,
Mary Priscilina
Posted 3 days ago
3.0 - 8.0 years
11 - 15 Lacs
Bengaluru
Work from Office
The Core AI, BI & Data Platforms Team has been established to create, operate and run the Enterprise AI, BI and Data platforms that reduce the time to market for reporting, analytics and data science teams to run experiments, train models and generate insights, as well as to evolve and run the CoCounsel application and its shared capability, the CoCounsel AI Assistant. The Enterprise Data Platform aims to provide self-service capabilities for fast and secure ingestion and consumption of data across TR. At Thomson Reuters, we are recruiting a team of motivated Cloud professionals to transform how we build, manage and leverage our data assets. The Data Platform team in Bangalore is seeking an experienced Software Engineer with a passion for engineering cloud-based data platform systems. Join our dynamic team as a Software Engineer and take a pivotal role in shaping the future of our Enterprise Data Platform. You will develop and implement data processing applications and frameworks on cloud-based infrastructure, ensuring the efficiency, scalability, and reliability of our systems.

About the Role
In this opportunity as the Software Engineer, you will:
Develop data processing applications and frameworks on cloud-based infrastructure in partnership with Data Analysts and Architects, with guidance from the Lead Software Engineer.
Innovate with new approaches to meet data management requirements.
Make recommendations about platform adoption, including technology integrations, application servers, libraries, and AWS frameworks, documentation, and usability by stakeholders.
Contribute to improving the customer experience.
Participate in code reviews to maintain a high-quality codebase.
Collaborate with cross-functional teams to define, design, and ship new features.
Work closely with product owners, designers, and other developers to understand requirements and deliver solutions.
Effectively communicate and liaise across the data platform & management teams.
Stay updated on emerging trends and technologies in cloud computing.

About You
You're a fit for the role of Software Engineer if you meet all or most of these criteria:
Bachelor's degree in Computer Science, Engineering, or a related field.
3+ years of relevant experience in implementation of data lakes and data management technologies for large-scale organizations.
Experience in building and maintaining data pipelines with excellent run-time characteristics such as low latency, fault tolerance and high availability.
Proficiency in the Python programming language.
Experience in AWS services and management, including serverless, container, queueing and monitoring services like Lambda, ECS, API Gateway, RDS, DynamoDB, Glue, S3, IAM, Step Functions, CloudWatch, SQS, SNS.
Good knowledge of consuming and building APIs.
Business Intelligence tools like Power BI.
Fluency in querying languages such as SQL.
Solid understanding of software development practices such as version control via Git, CI/CD and release management.
Agile development cadence.
Good critical thinking, communication, documentation, troubleshooting and collaborative skills.
#LI-VGA1

What's in it For You
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace.
We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 3 days ago
5.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
Design, develop, and maintain ETL processes using Pentaho Data Integration (Kettle). Extract data from various sources including databases, flat files, APIs, and cloud platforms. Transform and cleanse data to meet business and technical requirements. Load data into data warehouses, data lakes, or other target systems. Monitor and optimize ETL performance and troubleshoot issues. Collaborate with data architects, analysts, and business stakeholders to understand data requirements. Ensure data quality, integrity, and security throughout the ETL lifecycle. Document ETL processes, data flows, and technical specifications.

Grade Specific: Focus on Industrial Operations Engineering. Develops competency in own area of expertise. Shares expertise and provides guidance and support to others. Interprets clients' needs. Completes own role independently or with minimum supervision. Identifies problems and relevant issues in straightforward situations and generates solutions. Contributes to teamwork and interacts with customers.
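Pentaho (Kettle) jobs are assembled visually rather than in code, but the extract-transform-load cycle this posting describes can be sketched in plain Python for orientation. The file path, connection string, and table names below are illustrative assumptions only.

```python
# Minimal sketch, assuming pandas + SQLAlchemy and placeholder names throughout.
import pandas as pd
from sqlalchemy import create_engine

# Extract: a flat-file source (one of the source types the posting lists).
df = pd.read_csv("/data/incoming/customers.csv")

# Transform: cleanse and conform to the target schema.
df["email"] = df["email"].str.strip().str.lower()
df = df.dropna(subset=["customer_id"]).drop_duplicates(subset=["customer_id"])

# Load: append into a warehouse staging table.
engine = create_engine("postgresql://etl_user:***@warehouse:5432/dw")
df.to_sql("stg_customers", engine, if_exists="append", index=False)
```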
Posted 3 days ago
5.0 - 10.0 years
15 - 25 Lacs
Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR
Hybrid
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant, AWS DataLake!

Responsibilities
• Knowledge of Data Lakes on AWS services, with exposure to creating External Tables and Spark programming (a hedged sketch of the external-table idea follows this posting).
• Able to work with Python programming; writing effective and scalable Python code for automation, data wrangling and ETL.
• Designing and implementing robust applications and working on automations using Python code.
• Debugging applications to ensure low latency and high availability.
• Writing optimized custom SQL queries.
• Experienced in team and client handling.
• Prowess in documentation related to systems, design, and delivery.
• Integrate user-facing elements into applications.
• Knowledge of External Tables and Data Lake concepts.
• Able to do task allocation, collaborate on status exchanges and drive things to successful closure.
• Implement security and data protection solutions.
• Must be capable of writing SQL queries for validating dashboard outputs.
• Must be able to translate visual requirements into detailed technical specifications.
• Well versed in handling Excel, CSV, text, JSON and other unstructured file formats using Python.
• Expertise in at least one popular Python framework (like Django, Flask or Pyramid).
• Good understanding of and exposure to Git, Bamboo, Confluence and Jira.
• Good with DataFrames and ANSI SQL using pandas.
• Team player with a collaborative approach and excellent communication skills.

Qualifications we seek in you!
Minimum Qualifications
• BE/B.Tech/MCA
• Excellent written and verbal communication skills
• Good knowledge of Python and PySpark

Preferred Qualifications/Skills
• Strong ETL knowledge of any ETL tool is good to have.
• Good to have knowledge of AWS cloud and Snowflake.
• Knowledge of PySpark is a plus.

Why join Genpact?
• Be a transformation leader: Work at the cutting edge of AI, automation, and digital innovation.
• Make an impact: Drive change for global enterprises and solve business challenges that matter.
• Accelerate your career: Get hands-on experience, mentorship, and continuous learning opportunities.
• Work with the best: Join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
• Thrive in a values-driven culture: Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
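As referenced in the Genpact posting above, here is a minimal, hedged sketch of creating an external table over data-lake files on AWS. Athena is assumed as the query engine (the posting does not name one), and the bucket, database, and column names are placeholders.

```python
# Minimal sketch, assuming Athena as the engine; all names are placeholders.
import boto3

athena = boto3.client("athena", region_name="ap-south-1")

DDL = """
CREATE EXTERNAL TABLE IF NOT EXISTS sales_lake.orders (
    order_id   string,
    amount     double,
    order_date date
)
STORED AS PARQUET
LOCATION 's3://example-datalake/curated/orders/'
"""

# The table is pure metadata: it maps a schema onto files already in S3,
# so queries (from Athena, or Spark via the Glue catalog) need no data copy.
athena.start_query_execution(
    QueryString=DDL,
    QueryExecutionContext={"Database": "sales_lake"},
    ResultConfiguration={"OutputLocation": "s3://example-datalake/athena-results/"},
)
```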
Posted 3 days ago
7.0 - 12.0 years
15 - 27 Lacs
Bengaluru
Remote
THIS IS A FULLY REMOTE JOB WITH A 5-DAY WORK WEEK. THIS IS A ONE-YEAR CONTRACT JOB, LIKELY TO BE CONTINUED AFTER ONE YEAR.
Required Qualifications
Education: B.Tech/M.Tech in Computer Science, Data Engineering, or equivalent field.
Experience: 7-10 years in data engineering, with 2+ years in an industrial/operations-heavy environment (manufacturing, energy, supply chain, etc.)
Job Role
The Senior Data Engineer will be responsible for independently designing, developing, and deploying scalable data infrastructure to support analytics, optimization, and AI-driven use cases in a low-tech-maturity environment. You will own the data architecture end-to-end, work closely with data scientists, full stack engineers, and operations teams, and be a driving force in creating a robust Industry 4.0-ready data backbone.
Key Responsibilities
1. Data Architecture & Infrastructure
Design and implement a scalable, secure, and future-ready data architecture from scratch. Lead the selection, configuration, and deployment of data lakes, warehouses (e.g., AWS Redshift, Azure Synapse), and ETL/ELT pipelines. Establish robust data ingestion pipelines from PLCs, DCS systems, SAP, Excel files, and third-party APIs. Ensure data quality, governance, lineage, and metadata management.
2. Data Engineering & Tooling
Build and maintain modular, reusable ETL/ELT pipelines using Python, SQL, Apache Airflow, or equivalent. Set up real-time and batch processing capabilities using tools such as Kafka, Spark, or Azure Data Factory. Deploy and maintain scalable data storage solutions and optimize query performance.
Tech Stack
Strong hands-on expertise in:
Python, SQL, Spark, Pandas
ETL tools: Airflow, Azure Data Factory, or equivalent
Cloud platforms: Azure (preferred), AWS or GCP
Databases: PostgreSQL, MS SQL Server, NoSQL (MongoDB, etc.)
Data lakes/warehouses: S3, Delta Lake, Snowflake, Redshift, BigQuery
Monitoring and Logging: Prometheus, Grafana, ELK, etc.
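As a hedged sketch of the pipeline work in section 2 above, here is a two-task Airflow DAG. The DAG id, schedule, and task bodies are invented for illustration, and the schedule argument assumes Airflow 2.4+ (older versions use schedule_interval).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_plc_readings(**context):
    """Placeholder: pull readings from a PLC/DCS export or a third-party API."""
    print("extracting readings")

def load_to_warehouse(**context):
    """Placeholder: write the transformed batch into the warehouse."""
    print("loading batch")

with DAG(
    dag_id="plant_readings_daily",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_plc_readings)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    extract >> load  # run extract before load
```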
Posted 3 days ago
7.0 - 11.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Purpose of this Role
You are responsible for defining commercially aware and technically astute solutions that both align to and inform architectural direction while balancing the typical constraints evident on project delivery. Your role is embedded within the Cigna International Markets Architecture function that works collaboratively with senior stakeholders to define strategic direction, thereafter ensuring that intent is reflected in business solutions. You will be comfortable leading and defining effective business solutions within complex project environments, demonstrating the maturity to build strong working relationships across Business, IT, and 3rd Party stakeholders.
Main Duties / Responsibilities
Perform key enterprise-wide Data Architecture responsibilities within International Markets, focusing on our on-premise and cloud solution deployments. Proactively engage across Business, IT, and 3rd party stakeholders to ensure that the business investment delivers cost-effective and appropriate data-driven solutions for the organization. Assist sponsors in the creation of rounded and compelling business cases for change. Work with Solution Architects to drive the definition of the data solution design, mapping business and technical requirements to define data assets that meet both business and operational expectations. Own and manage data models and data design artefacts, and provide guidance and consultancy on best practice and standards for customer-focused data delivery and data management practices. Be an advocate for data-driven design within an agile delivery framework. Actively participate in the full project lifecycle, from early shaping of high-level estimates and delivery plans through to active governance of the solution as it is developed and built in later phases. Capture and manage risks, issues and assumptions identified through the lifecycle, articulating the financial and other impacts associated with these concerns. Take a lead role in the selection of 3rd Party solutions, developing successful partner relationships where required. Maintain an active awareness of emerging trends and developments in data design, architecture and enterprise technology that could impact or benefit our business and our customers. Provide high-level mentoring of design & development teams to embed data architecture concepts, principles and best practices.
Skills & Experience
10 years of IT experience and 5 years in a Data Architecture or Data Design role are required. Experience of leading data design delivering significant assets to an organization, e.g. Data Warehouse, Data Lake, Customer 360 Data Platform. Be able to demonstrate experience within some of the following data capabilities: data modelling, database design (operational and/or analytical use cases), data migration, data quality management, metadata management, domain-driven design, and data integration, with a preference for ETL/ELT and data streaming experience. Preferred toolsets and platforms are: AWS, SQL Server, Qlik toolsets, Collibra. A track record of working successfully in a globally dispersed team would be beneficial. Breadth of experience and technical acumen across application, infrastructure, security, service management, business process, architecture capabilities, etc. Highly collaborative, with a desire to work with a broad range of stakeholders to achieve agreement on solutions that drive benefits for our customers and businesses. Commercial awareness incorporating financial planning and budgeting.
About The Cigna Group
Cigna Healthcare, a division of The Cigna Group, is an advocate for better health through every stage of life. We guide our customers through the health care system, empowering them with the information and insight they need to make the best choices for improving their health and vitality. Join us in driving growth and improving lives.
Posted 3 days ago
8.0 - 12.0 years
14 - 20 Lacs
Bengaluru
Work from Office
Azure Data Engineer
Experience in Azure Data Factory, Databricks, Azure Data Lake and Azure SQL Server. Developed ETL/ELT processes using SSIS and/or Azure Data Factory. Build complex pipelines & dataflows using Azure Data Factory. Designing and implementing data pipelines in Azure Data Factory (ADF). Improve functionality/performance of existing data pipelines. Performance tuning of processes dealing with very large data sets. Configuration and deployment of ADF packages. Proficient in the usage of ARM Templates, Key Vault, and Integration Runtime. Adaptable to work with ETL frameworks and standards. Strong analytical and troubleshooting skills to root-cause issues and find solutions. Propose innovative, feasible and best solutions for the business requirements. Knowledge of Azure technologies/services such as Blob Storage, ADLS, Logic Apps, Azure SQL, and WebJobs. Expert in ServiceNow, incident management, and JIRA. Should have exposure to agile methodology. Expert in understanding and building Power BI reports using the latest methodologies.
Posted 3 days ago
8.0 - 13.0 years
25 - 40 Lacs
Mumbai, Hyderabad
Work from Office
Essential Services: Role & Location Fungibility
At ICICI Bank, we believe in serving our customers beyond our role definition, product boundaries, and domain limitations through our philosophy of customer 360-degree. In essence, this captures our belief in serving the entire banking needs of our customers as One Bank, One Team. To achieve this, employees at ICICI Bank are expected to be role and location-fungible, with the understanding that Banking is an essential service. The role descriptions give you an overview of the responsibilities; they are only directional and guiding in nature.
About the Role:
As a Data Warehouse Architect, you will be responsible for managing and enhancing a data warehouse that handles large volumes of customer-lifecycle data flowing in from various applications, within guardrails of risk and compliance. You will manage the day-to-day operations of the data warehouse, i.e. Vertica. In this role, you will manage a team of data warehouse engineers covering data modelling, ETL data pipeline design, issue management, upgrades, performance fine-tuning, migration, and the governance and security framework of the data warehouse. This role enables the Bank to maintain huge data sets in a structured manner that is amenable to data intelligence. The data warehouse supports numerous information systems used by various business groups to derive insights. As a natural progression, the data warehouse will be gradually migrated to a Data Lake, enabling better analytical advantage. The role holder will also be responsible for guiding the team towards this migration.
Key Responsibilities:
Data Pipeline Design: Responsible for designing and developing ETL data pipelines that help in organising large volumes of data, and for using data warehousing technologies to ensure that the data warehouse is efficient, scalable, and secure.
Issue Management: Responsible for ensuring that the data warehouse runs smoothly. Monitor system performance, diagnose and troubleshoot issues, and make necessary changes to optimize system performance.
Collaboration: Collaborate with cross-functional teams to implement upgrades, migrations and continuous improvements.
Data Integration and Processing: Responsible for processing, cleaning, and integrating large data sets from various sources to ensure that the data is accurate, complete, and consistent.
Data Modelling: Responsible for designing and implementing data modelling solutions to ensure that the organization's data is properly structured and organized for analysis.
Key Qualifications & Skills:
Education Qualification: B.E./B.Tech. in Computer Science, Information Technology or equivalent domain, with 10 to 12 years of experience and at least 5 years of relevant work experience in Data Warehousing/Mining/BI/MIS.
Experience in Data Warehousing: Knowledge of ETL and data technologies, with the ability to outline a future vision in OLTP and OLAP (Oracle / MS SQL). Data modelling, data analysis and visualization experience (analytical tools such as Power BI, SAS, QlikView, Tableau, etc.). Good to have exposure to Azure Cloud data platform services such as Cosmos DB, Azure Data Lake, Azure Synapse, and Azure Data Factory.
Synergize with the Team: Regular interaction with business/product/functional teams to create mobility solutions.
Certification: Azure certifications DP-900, PL-300, DP-203 or other data platform/data analyst certifications.
About the Business Group The Technology Group at ICICI Bank is at the forefront of our operations and offerings, which are focused on leveraging state-of-the-art technology to provide customer-centric solutions. This group plays a pivotal role in our vision of the transition from Bank to Bank Tech. Further, the group offers round-the-clock support to our entire banking ecosystem. In our persistent efforts to provide products and solutions that genuinely touch customers, unlocking the potential of technology in every single engagement would go a long way in creating customer delight. In this endeavor, we also tirelessly ensure all our processes, systems, and infrastructure are very well within the guardrails of data security, privacy, and relevant regulations.
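As a hedged illustration of the bulk loading implied by the Vertica warehouse in the posting above, the sketch below uses the open-source vertica-python driver. The connection values, staging table, and file path are placeholders, and driver support for COPY ... FROM LOCAL is assumed.

```python
# Minimal sketch, assuming vertica-python and a hypothetical staging table.
import vertica_python

conn_info = {
    "host": "vertica.internal", "port": 5433,
    "user": "etl_user", "password": "***", "database": "dw",
}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    # FROM LOCAL streams the file through the client connection; DIRECT
    # writes straight to disk storage, which suits large batch loads.
    cur.execute(
        "COPY stg.customer_events FROM LOCAL '/data/events.csv' "
        "DELIMITER ',' DIRECT"
    )
    conn.commit()
```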
Posted 3 days ago
15.0 - 19.0 years
0 Lacs
pune, maharashtra
On-site
Are you an analytic thinker who enjoys creating valuable insights with data? Do you want to play a key role in transforming our firm into an agile organization? At UBS, we re-imagine the way we work, connect with each other - our colleagues, clients, and partners - and deliver value. Being agile will make us more responsive, adaptable, and ultimately more innovative.

We are looking for a Data Engineer to transform data into valuable insights that inform business decisions, utilizing our internal data platforms and applying appropriate analytical techniques. You will be responsible for engineering reliable data pipelines for sourcing, processing, distributing, and storing data in different ways, effectively using data platform infrastructure. Additionally, you will develop, train, and apply machine-learning models to make better predictions, automate manual processes, and solve challenging business problems. Ensuring the quality, security, reliability, and compliance of our solutions by applying digital principles and implementing both functional and non-functional requirements is a key aspect of this role. You will also be involved in building observability into our solutions, monitoring production health, helping to resolve incidents, and remediating the root cause of risks and issues, while understanding, representing, and advocating for client needs.

The WMA Data Foundational Platforms & Services Crew is the fuel for the WMA CDIO, providing the foundational, disruptive, and modern platform and technologies. The mission is rooted in the value proposition of a shared, foundational platform across UBS to maximize business value.

To be successful in this role, you should have a bachelor's degree in Computer Science, Engineering, or a related field, along with 15+ years of experience and strong proficiency with Azure cloud services related to data and analytics (Azure SQL, Data Lake, Data Factory, Databricks, etc.). Experience with SQL and data modeling, as well as familiarity with NoSQL databases, is essential. Proficiency in programming languages such as Python or Scala, and knowledge of data warehousing and data lake concepts and technologies, are also required.

UBS, the world's largest and only truly global wealth manager, operates through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management, and the Investment Bank. With a presence in all major financial centers in more than 50 countries, our global reach and expertise set us apart from our competitors. At UBS, we value our people and their diverse skills, experiences, and backgrounds as the driving force behind our ongoing success. We are dedicated to our craft, passionate about putting our people first, offering new challenges, a supportive team, opportunities to grow, and flexible working options when possible. Our inclusive culture brings out the best in our employees at every stage of their career journey. Collaboration is at the heart of everything we do because together, we are more than ourselves. UBS is committed to disability inclusion, and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us. UBS is an Equal Opportunity Employer that respects and seeks to empower each individual, supporting diverse cultures, perspectives, skills, and experiences within our workforce.
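This posting pairs pipeline engineering with training machine-learning models; below is a small, hedged sketch of that second duty using scikit-learn. The feature file, columns, and churn target are invented for illustration.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Curated features produced by an upstream pipeline (path is hypothetical).
df = pd.read_parquet("/lake/curated/client_features.parquet")
X, y = df.drop(columns=["churned"]), df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = GradientBoostingClassifier().fit(X_train, y_train)

# Hold-out AUC as a simple quality gate before the model is applied.
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```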
Posted 4 days ago
10.0 - 17.0 years
40 - 55 Lacs
Hyderabad
Hybrid
Must Have
• 5-8 years as an Engineer, 3+ years as a Manager
• Should have led a data application with full stack and backend components
• 3+ years experience with API development
• 3+ years experience with Java/Golang backend development
Nice to Have
• 3+ years experience with big data technologies and data lakes/data warehouses
Posted 4 days ago
10.0 - 15.0 years
25 - 35 Lacs
Pune
Work from Office
As bp transitions to an integrated energy company, we must adapt to a changing world and maintain competitive performance. bp's customers & products (C&P) business area is setting up a business and technology centre (BTC) in Pune, India. This will support the delivery of an enhanced customer experience and drive innovation by building global capabilities at scale, maximising technology, and developing deep expertise. The BTC will be a core and connected part of our business, bringing together colleagues who report into their respective part of C&P, working together with other functions across bp. This is an exciting time to join bp and the customers & products BTC!
Shift Timing: 12.30 PM - 9.30 PM IST
About the role:
As the Data Specialist - Pricing, you will work alongside business leads for specific process areas, partnering with I&E, GBS, PUs, HUBs, Functions, and Markets to manage change, adoption, and sustainability of pricing frameworks aligned with the Business Strategy. This role will drive strategic pricing data lineage in the Data Lake, ensuring that the end state in the given process area is sustainable and operationally aligned with business leadership expectations and the Business Strategy. You will apply your expertise to the depth of the process hand-offs and the process links to the ERP transactions and related master data. The role drives strategic transformation, ensuring sustainable and operational alignment with leadership expectations. It requires fluency in agile methodology, acting and flexing as an SME, product owner, or scrum master based on project needs!
Key Responsibilities:
• Lead pricing data lineage from ERP & pricing source systems, with normalisation and harmonisation into star-schema format for Data Lake embedding.
• Approve required changes to pricing data, data structures, pricing metadata and master data.
• Ensure pricing-related processes and data are fit for purpose.
• Guide stakeholders through pricing data validation and query resolution.
• Support global ERP systems processes and pricing data management during deployment to ensure alignment with project objectives and timelines.
• Support pricing data cutover and outage phases.
• Support integration sessions with process architects to manage pricing data changes.
• Apply end-to-end strategic views to operational changes for process optimization, transactional fluidity, master data management and improved business performance.
• Create and support the execution of the Business Change Backlog to deliver incremental business change.
• Advise and support planning and deployment activities to embed and sustain change.
• Advise on and support Data Governance related to Master Data Quality Management and data performance related to transactional fluidity.
• Act as the process TAG for ERP design and setup; support Data Modelling for relevant data sources related to the Sales & Marketing Value Centres in the Castrol Data & Analytics landscape.
• Recommend improvements and capability development to the Global Data & Analytics Lead and customers.
• Collaborate with peer groups, using expertise across the subject area, and drive integration.
• Work across time zones and lead multidisciplinary teams on a project or initiative basis.
• Take a solutioning approach: think and build global, with the ability to scale to local for tactical short-term and strategic mid-term delivery/alignment.
Experience Required:
• Experience in Pricing Operations, with a minimum of 10 years operating across multiple aspects of Sales Value Chain process execution.
• Proven experience and deep domain knowledge of working with all affiliated pricing and marketing data objects.
• Proven, deep experience of efficiently delivering business transformation as part of major ERP implementations and/or major business transformation projects.
• Deep understanding of the specified data and process areas: product portfolio management, pricing waterfall, net hard floor, rebates, pricing conditions, dynamic pricing algorithms, market and pricing intelligence, pricing forecasting, and pricing elasticity, plus the integration points with other data and process areas needed for successful end-to-end delivery.
• Tenacious in getting issues resolved, with collaborative, solution-oriented thinking that balances business strategy and process frameworks.
Knowledge & Skills Required:
• Good understanding of pricing data objects and their role in end-to-end processes.
• Familiarity with source and target systems and the role of pricing data within them, e.g. SAP, JDE.
• Trained in Agile methodology.
• Ability to work across multiple levels of detail: data (master data and transactional), process design principles, operating model intent, and systems design.
• Strong influencing skills; a change leader who brings expertise and experience to shape value delivery.
• Consistent record of successful deployment in own area, across input and output success criteria measures.
You will work with:
You will be part of a 20-person global team, the Global Data & Analytics Team, operating peer to peer with global best-in-class experts in process, data, advanced analytics, and data science. The Global Data & Analytics team reports into the Castrol Digital Enablement team, which runs the digital estate for Castrol and enhances scalability and process and data integration. This D&A team is the driving force behind the Data & Analytics strategy, responsible for the Harmonized Data Lake and the business intelligence derived from it in support of the business strategy, and is a key pillar of value enablement through fast and accurate insights. You will be exposed to a wide variety of customers across all layers of the Castrol leadership and our partners in GBS and Technology. Through data governance at the Value Centre you gain excellent exposure to operations and the ability to influence and lead change through value proposition engagements. Within the team we foster an open and inclusive culture where the collective powers high-quality outcomes and speed of delivery; it is a team that stands on each other's shoulders to always be part of the solution and deliver the optimal outcome.
Posted 4 days ago
3.0 - 6.0 years
11 - 20 Lacs
Bengaluru
Work from Office
Role & responsibilities:
We are seeking a skilled Data Engineer to maintain robust data infrastructure and pipelines that support our operational analytics and business intelligence needs. The candidate will bridge the gap between data engineering and operations, ensuring reliable, scalable, and efficient data systems that enable data-driven decision-making across the organization.
- Strong proficiency in Spark SQL and hands-on experience with real-time streaming using Kafka and Flink (see the sketch following this posting)
- Databases: strong knowledge of relational databases (Oracle, MySQL) and NoSQL systems
- Proficiency with version control (Git), CI/CD practices, and collaborative development workflows
- Strong operations management and stakeholder communication skills
- Flexibility to work across time zones, with a cross-cultural communication mindset
- Experience working in cross-functional teams
- Continuous learning mindset and adaptability to new technologies
Preferred candidate profile:
- Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field
- 3+ years of experience in data engineering, software engineering, or a related role
- Proven experience building and maintaining production data pipelines
- Expertise in the Hadoop ecosystem: Spark SQL, Iceberg, Hive, etc.
- Extensive experience with Apache Kafka, Apache Flink, and other relevant streaming technologies
- Orchestration tools: Apache Airflow and UC4; proficiency in Python, Unix shell scripting, or similar
- Good understanding of SQL across Oracle, SQL Server, NoSQL, or similar systems
- Proficiency with version control (Git), CI/CD practices, and collaborative development workflows
- Immediate joiners or candidates with a notice period under 30 days preferred
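For context on the streaming stack this role names, here is a minimal sketch of reading a Kafka topic with Spark Structured Streaming. The broker address and topic are hypothetical, and a production job would sink to Iceberg/Hive tables rather than the console.

```python
# Minimal sketch: consume a Kafka topic with Spark Structured Streaming.
# Requires the spark-sql-kafka-0-10 package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "orders")                     # hypothetical topic
    .load()
    .select(F.col("value").cast("string").alias("payload"))
)

# Console sink for demonstration only; note the checkpoint for fault tolerance.
query = (
    events.writeStream.format("console")
    .option("checkpointLocation", "/tmp/ckpt")
    .start()
)
query.awaitTermination()
```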
Posted 4 days ago
10.0 - 14.0 years
15 - 25 Lacs
Noida, Bengaluru, Delhi / NCR
Hybrid
We are looking for a Senior Solution Architect with 10+ years of experience in software development and AI/ML, including 5+ years in Generative AI and Azure ML, with hands-on experience designing and deploying scalable AI solutions, driving innovation, and collaborating across teams to integrate cutting-edge technologies into business processes.
Responsibilities & Skills:
- Architect and implement AI solutions using Generative AI, ML, and Azure ML, aligned with business goals (a minimal Azure ML job-submission sketch follows this posting)
- Lead development of Generative AI models (e.g., GPT, DALL-E, BERT) for NLP, computer vision, and automation
- Expertise in Python and ML frameworks (TensorFlow, PyTorch, Scikit-learn); strong understanding of learning techniques
- Hands-on experience with Azure ML, Azure AI Services, Data Lake, Databricks, and MLOps via Azure DevOps
- Collaborate with cross-functional teams and mentor junior engineers to foster innovation and best practices
- Advise leadership on AI strategy, emerging technologies, and scalable architecture
- Define the data strategy for AI/ML, ensuring governance, security, and compliance
- Stay ahead of AI trends and prototype new solutions for continuous improvement
- Engage with stakeholders to translate business needs into technical solutions
- Experience with AWS/GCP, hybrid cloud, and CI/CD for ML; Azure AI/Architect certifications
- Familiarity with NLP, computer vision, and reinforcement learning
- Proven track record in AI/ML solution architecture and cloud-native deployments
- Strong leadership, communication, and stakeholder management skills
- Bachelor's/Master's in Computer Science or a related field; Ph.D. preferred
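As one concrete illustration of the hands-on Azure ML work mentioned above, here is a minimal sketch of submitting a training job with the Azure ML Python SDK v2 (azure-ai-ml). The subscription, workspace, environment, and compute names are all placeholders.

```python
# Minimal sketch, assuming the Azure ML Python SDK v2 (azure-ai-ml):
# submit a command job to an Azure ML workspace. All names are placeholders.
from azure.ai.ml import MLClient, command
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

job = command(
    code="./src",                          # local folder containing train.py
    command="python train.py --epochs 5",
    environment="<registered-environment>@latest",  # placeholder environment
    compute="cpu-cluster",                 # hypothetical compute target
    display_name="training-job-sketch",
)

returned_job = ml_client.jobs.create_or_update(job)
print(returned_job.studio_url)  # link to monitor the run in Azure ML studio
```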
Posted 4 days ago
3.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As a Data Modeler, you will be responsible for understanding business requirements and data mappings, creating and maintaining data models through different stages using data modeling tools, and handing over the physical design/DDL scripts to the data engineers for implementation (a minimal DDL sketch follows this posting). Your role involves creating and maintaining data models while ensuring the performance and quality of deliverables.
Experience:
- Overall IT experience: 7+ years
- Data modeling experience: 3+ years
Key Responsibilities:
- Drive discussions with client teams to understand business requirements and develop data models that fit the requirements
- Drive discovery activities and design workshops with the client, and support design discussions
- Create data modeling deliverables and obtain sign-off
- Develop the solution blueprint and scoping, and estimate the delivery project
Technical Experience:
Must-have skills:
- 7+ years of overall IT experience with 3+ years in data modeling
- Data modeling experience in dimensional modeling, 3NF modeling, and/or NoSQL DB modeling
- Experience on at least one cloud DB design engagement
- Conversant with modern data platforms
- Work experience on data transformation and analytics projects; understanding of DWH
- Instrumental in DB design through all stages of data modeling
- Experience in at least one leading data modeling tool, e.g. Erwin, ER Studio, or equivalent
Good-to-have skills:
- Any of these add-on skills: Data Vault modeling, graph database modeling, RDF, document DB modeling, ontology, semantic data modeling
- Understanding of the cloud data analytics landscape and data lake design knowledge
- Cloud data engineering, cloud data integration
- Familiarity with data architecture principles
Professional Experience:
- Strong requirement analysis and technical solutioning skills in data and analytics
- Excellent writing, communication, and presentation skills
- Eagerness to learn new skills and develop on an ongoing basis
- Good client-facing and interpersonal skills
Educational Qualification: B.E. or B.Tech. required; 15 years of full-time education
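Since this role hands physical designs and DDL scripts to data engineers, here is a minimal, hypothetical sketch of such a hand-off: Spark SQL DDL for a one-fact, one-dimension star schema. Table and column names are illustrative only.

```python
# Minimal sketch: physical-design DDL for a simple dimensional model,
# executed through Spark SQL. All names and types are illustrative.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ddl-sketch").getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS dim_customer (
        customer_key   BIGINT,
        customer_id    STRING,   -- natural key from the source system
        customer_name  STRING,
        effective_from DATE,     -- type-2 slowly changing dimension bookkeeping
        effective_to   DATE
    ) USING parquet
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS fact_sales (
        customer_key BIGINT,     -- foreign key to dim_customer
        order_date   DATE,
        quantity     INT,
        net_amount   DECIMAL(18, 2)
    ) USING parquet
""")
```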
Posted 4 days ago
The data lake job market in India is experiencing significant growth as organizations continue to invest in big data technologies to drive business insights and decision-making. Data lake professionals are in high demand across various industries, offering lucrative career opportunities for job seekers with relevant skills and experience.
The average salary range for data lake professionals in India varies based on experience levels. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.
Typically, a career in the data lake domain progresses from roles such as Data Engineer or Data Analyst to Senior Data Engineer, Data Architect, and eventually Data Science Manager or Chief Data Officer. Advancement in this field is often based on gaining experience working with large datasets, implementing data management best practices, and demonstrating strong problem-solving skills.
In addition to expertise in data lake technologies like Apache Hadoop, Apache Spark, and AWS S3, data lake professionals are often expected to have skills in data modeling, data warehousing, SQL, programming languages like Python or Java, and experience with ETL (Extract, Transform, Load) processes.
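To make the ETL expectation concrete, here is a minimal, self-contained sketch of the extract-transform-load pattern in Python using only the standard library; the file, column, and table names are illustrative.

```python
# Minimal ETL sketch: extract rows from a CSV (assumed to have "region" and
# "amount" columns), transform them, and load them into SQLite.
import csv
import sqlite3

def etl(csv_path: str, db_path: str) -> None:
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
    with open(csv_path, newline="") as f:
        rows = [
            (r["region"].strip().upper(), float(r["amount"]))  # transform step
            for r in csv.DictReader(f)                          # extract step
        ]
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)   # load step
    conn.commit()
    conn.close()

etl("sales.csv", "warehouse.db")  # illustrative input and output paths
```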
As the demand for data lake professionals continues to rise in India, job seekers should focus on honing their skills in big data technologies and data management practices to stand out in the competitive job market. Prepare thoroughly for interviews by mastering both technical and conceptual aspects of data lake architecture and be confident in showcasing your expertise to potential employers. Good luck in your job search!