Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Do you like working with data and analytics to gain insight and solve problems? Do you enjoy collaborating across teams to build and deliver products that make a difference? Join Our Inclusive Team

About The Team
Within Elsevier Operations, Platform Operations is responsible for ensuring that product content meets quality standards, is delivered on time and within budget, and is made available to end-users via Elsevier's product websites such as Knovel, Engineering Village, and Scopus.

About The Role
The Senior Production Manager is a member of the Engineering Segment and leads support for the Engineering Collection (Engineering Village, EV). The successful candidate takes ownership of end-to-end production workflows and process improvements and is responsible for key decisions related to content analysis, content production, and content delivery. Success in this role requires knowledge of publishing and bibliographic metadata standards, and the ability to correlate multiple data sets to one or more strategic priorities.

Responsibilities
- Build and maintain strong relationships with EV Product and Content teams
- Develop knowledge of the research landscape to understand EV use cases, product requirements and product vision
- Understand and coordinate development of workflows for content types currently outside of RDP (e.g., Standards, Patents, Pre-Prints, Dissertations)
- Working with suppliers (in consultation with Supplier Management), serve as the project manager for optimization of existing workflows and development of new workflows, and ensure successful delivery of content to EV
- Improve data quality with a focus on completeness and error reduction
- Identify key metrics and work with the CDA team to deliver dashboards and visualizations
- Organize and lead stakeholder meetings to review product health and align priorities
- Assist customer support to resolve customer-reported issues quickly and successfully
- Prepare budget forecasts and track spending on production and indexing by suppliers

Requirements
- Strong analytical skills and facility with analytics tools
- Ability to dive into data, frame hypotheses and arrive at logical conclusions
- Ability to create reliable data that can stand alone or be integrated with other data sets
- Strong communication skills
- Strong research skills
- Project management, business process management (BusinessOptix), stakeholder management
- Minimum one year working with a product development team
- Minimum one year of exposure to agile methodologies
- Familiarity with data analysis methods and tools for handling large data sets (e.g., Databricks)
- Familiarity with markup languages (e.g., XML), query languages (e.g., SQL) and scripting languages (e.g., Python)
- Knowledge of bibliographic metadata and publishing standards and best practices
- Project and stakeholder management
- Leading Change: Champions Change
- Focus on Results: Drives for Results
- Focus on Results: Takes Initiative
- Personal Capability: Solves Problems & Analyzes Issues
- Personal Capability: Practices Self-Development
- Interpersonal Skills: Collaboration & Teamwork
- Interpersonal Skills: Builds Relationships

Working With Us
We promote a healthy work/life balance across the organisation. We offer an appealing working prospect for our people. With numerous wellbeing initiatives, shared parental leave, study assistance and sabbaticals, we will help you meet your immediate responsibilities and your long-term goals.

Working For You
We know that your wellbeing and happiness are key to a long and successful career.
These are some of the benefits we are delighted to offer:
- Comprehensive Health Insurance: Covers you, your immediate family, and parents.
- Enhanced Health Insurance Options: Competitive rates negotiated by the company.
- Group Life Insurance: Ensuring financial security for your loved ones.
- Group Accident Insurance: Extra protection for accidental death and permanent disablement.
- Flexible Working Arrangement: Achieve a harmonious work-life balance.
- Employee Assistance Program: Access support for personal and work-related challenges.
- Medical Screening: Your well-being is a top priority.
- Modern Family Benefits: Maternity, paternity, and adoption support.
- Long-Service Awards: Recognizing dedication and commitment.
- New Baby Gift: Celebrating the joy of parenthood.
- Subsidized Meals in Chennai: Enjoy delicious meals at discounted rates.
- Various Paid Time Off: Take time off with Casual Leave, Sick Leave, Privilege Leave, Compassionate Leave, Special Sick Leave, and Gazetted Public Holidays.
- Free Transport: Pick-up and drop between home and office (applies in Chennai).

About The Business
We are a global leader in information and analytics, helping researchers and healthcare professionals advance science and improve health outcomes. Building on our publishing heritage, we combine quality information and vast data sets with analytics to support visionary science, research, health education, interactive learning, and exceptional healthcare and clinical practice. At Elsevier, your work contributes to addressing the world's grand challenges and creating a more sustainable future. We harness innovative technologies to support science and healthcare, partnering for a better world.
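The role above centres on completeness-focused data quality and familiarity with SQL and Python. As a minimal illustration of that kind of check (the record structure and required fields below are hypothetical, not Elsevier's actual schema), a field-completeness metric can be computed like this:

```python
# A minimal sketch of a field-completeness check over bibliographic records.
# The record shape and required fields are illustrative assumptions only.
from collections import Counter

REQUIRED_FIELDS = ["title", "authors", "doi", "publication_year"]  # assumed critical elements

def completeness_report(records):
    """Return the fraction of records populating each required field."""
    present = Counter()
    for rec in records:
        for field in REQUIRED_FIELDS:
            if rec.get(field):  # counts only non-empty values
                present[field] += 1
    total = len(records) or 1  # guard against an empty batch
    return {field: present[field] / total for field in REQUIRED_FIELDS}

if __name__ == "__main__":
    sample = [
        {"title": "On Wind Loads", "authors": ["A. Rao"], "doi": "10.1000/x1", "publication_year": 2024},
        {"title": "Grid Stability", "authors": [], "doi": None, "publication_year": 2023},
    ]
    print(completeness_report(sample))  # e.g. {'title': 1.0, 'authors': 0.5, ...}
```

A report of this kind is the sort of input the dashboards and visualizations mentioned above would trend over time.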
Posted 14 hours ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Vestas is a major player in wind technology and a driving force in the development of the wind power industry. Vestas' core business comprises the development, manufacture, sale, marketing, and maintenance of wind turbines. Come and join us at Vestas!

Digital Solutions & Development > Digital Architecture & Data & AI, Data Domains & AI > Data Domain - Tech Area

Responsibilities
- Create and maintain scalable data pipelines for analytics use cases, assembling large, complex data sets that meet functional and non-functional business requirements
- Develop logical and physical data models using the optimal data model structure for data warehouse and data mart designs to support analytical needs
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability
- Collaborate with technology and platform management partners to optimize data sourcing and processing rules to ensure appropriate data quality
- Hands-on role (100%): building data solutions using best practices and architecture recommendations

Qualifications
- Bachelor's/Master's in engineering (degree in Computer Science, IT, Engineering or similar)
- Work experience as a Data Engineer as part of a Data & Analytics team, with 3+ years of relevant work experience and overall experience of 6-10 years
- Data Engineering experience: advanced working SQL knowledge and experience in building and maintaining scalable ETL/EL data pipelines to support a continuing increase in data volume and complexity
- Enterprise working experience in business intelligence/analytics teams supporting design, development, and maintenance of the backend data layer for BI/ML solutions
- Deep understanding of data structures/data models to design and develop data solutions ensuring data availability, security, and accessibility

Competencies
Tools/Technologies/Frameworks:
- Expertise in working with various data warehouse solutions and constructing data products using technologies such as Snowflake, Databricks, and the Azure Data Engineering stack (storage accounts, key vaults, MS SQL, etc.) is mandatory
- Strong work experience in SQL/stored procedures and relational modeling to build the data layer for BI/analytics is mandatory
- Extensive hands-on data modelling experience in cloud data warehouses and data structures; hands-on working experience in one of the ETL/EL tools like DBT/Azure Data Factory/SSIS will be an advantage
- Proficiency in code management/version control tools such as Git and DevOps
Business/Soft Skills:
- Strong in data/software engineering fundamentals; experience in an Agile/Scrum environment preferred
- Ability to communicate with stakeholders across different geographies and collaborate with analytics and data science teams to match technical solutions with customer business requirements
- Familiar with business metrics such as KPIs, PPIs and other indicators
- Curious and passionate about building value-creating and innovative data solutions

What We Offer
An opportunity to impact climate change and the future of next generations through data, analytics, cloud and machine learning. Steep learning curve: we are building a strong team of Data Engineers with both broad and deep knowledge. That means that everyone will have somebody to learn from, just as we will invest in continuous learning, knowledge sharing and upskilling. Strong relationships.
We will strive to build an environment of mutual trust and a tightly knit team, where we can support and inspire each other to deliver great impact for Vestas. Opportunity to shape your role: we have been asked to scale and deliver data and insights products; the rest is up to us. Healthy work-life balance: commitment to fostering a diverse and inclusive workplace environment where everyone can thrive and bring their unique perspectives and skills to the team. Overall, we offer you the opportunity to make a difference and work in a multicultural international company, where you have the opportunity to improve your skills and grow professionally to reach new heights.

Additional Information
Your primary workplace will be Chennai. Please note: We do amend or withdraw our jobs and reserve the right to do so at any time, including prior to the advertised closing date. Please be advised to apply on or before 16th July 2025.

BEWARE – RECRUITMENT FRAUD
It has come to our attention that there are a number of fraudulent emails from people pretending to work for Vestas. Read more via this link: https://www.vestas.com/en/careers/our-recruitment-process

DEIB Statement
At Vestas, we recognise the value of diversity, equity, and inclusion in driving innovation and success. We strongly encourage individuals from all backgrounds to apply, particularly those who may hesitate due to their identity or feel they do not meet every criterion. As our CEO states, "Expertise and talent come in many forms, and a diverse workforce enhances our ability to think differently and solve the complex challenges of our industry". Your unique perspective is what will help us power the solution for a sustainable, green energy future.

About Vestas
Vestas is the energy industry's global partner on sustainable energy solutions. We are specialised in designing, manufacturing, installing, and servicing wind turbines, both onshore and offshore. Across the globe, we have installed more wind power than anyone else. We consider ourselves pioneers within the industry, as we continuously aim to design new solutions and technologies to create a more sustainable future for all of us. With more than 185 GW of wind power installed worldwide and 40+ years of experience in wind energy, we have an unmatched track record demonstrating our expertise within the field. With 30,000 employees globally, we are a diverse team united by a common goal: to power the solution – today, tomorrow, and far into the future. Vestas promotes a diverse workforce which embraces all social identities and is free of any discrimination. We commit to create and sustain an environment that acknowledges and harvests different experiences, skills, and perspectives. We also aim to give everyone equal access to opportunity. To learn more about our company and life at Vestas, we invite you to visit our website at www.vestas.com and follow us on our social media channels. We also encourage you to join our Talent Universe to receive notifications on new and relevant postings.
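The SQL/ETL requirements above centre on incremental pipelines into a warehouse data layer. Below is a minimal sketch of a watermark-plus-MERGE incremental load, assuming a Databricks/Delta runtime; the table names, business key, and watermark column are hypothetical:

```python
# A minimal sketch of an incremental (EL/ETL) load into a Delta table.
# Assumes a Databricks/Delta runtime and a seeded control table; all names
# are illustrative, not Vestas' actual schema.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# 1. Read only rows newer than the last processed watermark.
last_watermark = (
    spark.table("etl_control")                      # assumed control table
         .agg(F.max("processed_until"))
         .first()[0]
)
updates = (
    spark.table("staging.turbine_telemetry")        # assumed staging source
         .where(F.col("updated_at") > F.lit(last_watermark))
)

# 2. Upsert into the target with a Delta MERGE keyed on the business key.
target = DeltaTable.forName(spark, "dw.turbine_telemetry")
(
    target.alias("t")
          .merge(updates.alias("s"), "t.reading_id = s.reading_id")
          .whenMatchedUpdateAll()
          .whenNotMatchedInsertAll()
          .execute()
)
```

The same watermark pattern carries over to DBT or Azure Data Factory; only the orchestration changes.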
Posted 14 hours ago
5.0 - 10.0 years
0 Lacs
Cochin
On-site
Orion Innovation is a premier, award-winning, global business and technology services firm. Orion delivers game-changing business transformation and product development rooted in digital strategy, experience design, and engineering, with a unique combination of agility, scale, and maturity. We work with a wide range of clients across many industries including financial services, professional services, telecommunications and media, consumer products, automotive, industrial automation, professional sports and entertainment, life sciences, ecommerce, and education.

Data Engineer
Locations: Kochi/Chennai/Coimbatore/Mumbai/Pune/Hyderabad

Job Overview: We are seeking a highly skilled and experienced Senior Data Engineer to join our growing data team. The ideal candidate will have deep expertise in Azure Databricks and Python, and experience building scalable data pipelines. Familiarity with Data Fabric architectures is a plus. You'll work closely with data scientists, analysts, and business stakeholders to deliver robust data solutions that drive insights and innovation.

Key Responsibilities:
- Design, build, and maintain large-scale, distributed data pipelines using Azure Databricks and PySpark.
- Design, build, and maintain large-scale, distributed data pipelines using Azure Data Factory.
- Develop and optimize data workflows and ETL processes in Azure Cloud environments.
- Write clean, maintainable, and efficient code in Python for data engineering tasks.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Monitor and troubleshoot data pipelines for performance and reliability issues.
- Implement data quality checks and validations, and ensure data lineage and governance.
- Contribute to the design and implementation of a Data Fabric architecture (desirable).

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5–10 years of experience in data engineering or related roles.
- Expertise in Azure Databricks, Delta Lake, and Spark.
- Strong proficiency in Python, especially in a data processing context.
- Experience with Azure Data Lake, Azure Data Factory, and related Azure services.
- Hands-on experience in building data ingestion and transformation pipelines.
- Familiarity with CI/CD pipelines and version control systems (e.g., Git).

Good to Have:
- Experience or understanding of Data Fabric concepts (e.g., data virtualization, unified data access, metadata-driven architectures).
- Knowledge of modern data warehousing and lakehouse principles.
- Exposure to tools like Apache Airflow, dbt, or similar.
- Experience working in agile/scrum environments.
- DP-500 and DP-600 certifications.

What We Offer:
- Competitive salary and performance-based bonuses.
- Flexible work arrangements.
- Opportunities for continuous learning and career growth.
- A collaborative, inclusive, and innovative work culture.

www.orioninc.com

Orion is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, gender identity or expression, pregnancy, age, national origin, citizenship status, disability status, genetic information, protected veteran status, or any other characteristic protected by law.
Candidate Privacy Policy Orion Systems Integrators, LLC and its subsidiaries and its affiliates (collectively, "Orion," "we" or "us") are committed to protecting your privacy. This Candidate Privacy Policy (orioninc.com) ("Notice") explains: What information we collect during our application and recruitment process and why we collect it; How we handle that information; and How to access and update that information. Your use of Orion services is governed by any applicable terms in this notice and our general Privacy Policy.
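The Orion role above asks for Azure Databricks/PySpark pipelines with data quality checks. A minimal sketch of a transform-and-validate step follows; the ADLS paths, column names, and the 1% null threshold are illustrative assumptions, not Orion's actual pipeline:

```python
# A minimal sketch of a PySpark transform step with a fail-fast quality gate.
# Paths and columns are hypothetical; assumes an Azure Databricks runtime
# with access to the named ADLS containers.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

raw = spark.read.json("abfss://landing@account.dfs.core.windows.net/orders/")  # assumed landing path

cleaned = (
    raw.dropDuplicates(["order_id"])                      # basic dedupe on the assumed key
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("amount") > 0)                       # simple validity rule
)

# Fail fast if a required column is still sparsely populated.
null_rate = cleaned.filter(F.col("customer_id").isNull()).count() / max(cleaned.count(), 1)
if null_rate > 0.01:  # assumed threshold
    raise ValueError(f"customer_id null rate {null_rate:.2%} exceeds threshold")

cleaned.write.mode("overwrite").format("delta").save(
    "abfss://curated@account.dfs.core.windows.net/orders/"  # assumed curated path
)
```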
Posted 14 hours ago
10.0 years
4 - 8 Lacs
Hyderābād
On-site
Summary
Internal title: Assoc. Dir. DDIT US&I Data Architect
#LI-Hybrid
Location: Hyderabad, India
Relocation Support: Yes

Step into a pivotal role where your expertise in data architecture will shape the future of analytics at Novartis. As Associate Director - Data Architect, you’ll lead the design and implementation of innovative data solutions that empower business decisions and drive digital transformation. This is your opportunity to influence enterprise-wide strategies, collaborate with cross-functional teams, and bring emerging technologies to life, all while making a meaningful impact on global healthcare.

About the Role

Key Responsibilities
- Design and implement scalable data architecture solutions aligned with business strategy and innovation goals
- Lead architecture for US&I Analytics Capabilities including GenAI, MLOps, NLP, and data visualization
- Collaborate with cross-functional teams to ensure scalable, future-ready data solutions
- Define and evolve architecture governance frameworks, standards, and best practices
- Drive adoption of emerging technologies through rapid prototyping and enterprise-scale deployment
- Architect data solutions using AWS, Snowflake, Databricks, and other modern platforms
- Oversee delivery of data lake projects including acquisition, transformation, and publishing
- Ensure data security, governance, and compliance across all architecture solutions
- Promote a data product-centric approach to solution design and delivery
- Align innovation efforts with business strategy, the IT roadmap, and regulatory requirements

Essential Requirements
- Bachelor’s degree in computer science, engineering, or a related field
- Over 10 years of experience in analytical and technical frameworks for descriptive and prescriptive analytics
- Strong expertise in AWS, Databricks, and Snowflake service offerings
- Proven experience delivering data lake projects from acquisition to publishing
- Deep understanding of data security, governance policies, and enforcement mechanisms
- Agile delivery experience managing multiple concurrent delivery cycles
- Strong knowledge of MLOps and analytical data lifecycle management
- Excellent communication, problem-solving, and cross-functional collaboration skills

Desirable Requirements
- Experience working with pharmaceutical data and familiarity with global healthcare data sources
- Exposure to regulatory frameworks and compliance standards in the life sciences industry

Commitment to Diversity and Inclusion: Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve.

Accessibility and accommodation: Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message.

Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients’ lives. Ready to create a brighter future together?
https://www.novartis.com/about/strategy/people-and-culture

Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network

Benefits and Rewards: Read our handbook to learn about all the ways we’ll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards

Division: Operations
Business Unit: CTS
Location: India
Site: Hyderabad (Office)
Company / Legal Entity: IN10 (FCRS = IN010) Novartis Healthcare Private Limited
Functional Area: Technology Transformation
Job Type: Full time
Employment Type: Regular
Shift Work: No
Posted 14 hours ago
7.0 years
0 Lacs
Hyderābād
On-site
Digital Solutions Consultant I - HYD015Q
Company: Worley
Primary Location: IND-AP-Hyderabad
Job: Digital Solutions
Schedule: Full-time
Employment Type: Agency Contractor
Job Level: Experienced
Job Posting: Jun 16, 2025
Unposting Date: Jul 16, 2025
Reporting Manager Title: Senior General Manager

We deliver the world’s most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied and challenging role.

Building on our past. Ready for the future. Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we’re bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals, and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.

The Role
As a Digital Solutions Consultant with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience. We are looking for a skilled Data Engineer to join our Digital Customer Solutions team. The ideal candidate should have experience in cloud computing and big data technologies. As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data solutions that can handle large volumes of data. You will work closely with stakeholders to ensure that the data is accurate, reliable, and easily accessible.

Responsibilities:
- Design, build, and maintain scalable data pipelines that can handle large volumes of data.
- Document the design of proposed solutions, including structuring data (data modelling applying different techniques including 3NF and dimensional modelling) and optimising data for further consumption (working closely with Data Visualization Engineers, Front-end Developers, Data Scientists and ML Engineers).
- Develop and maintain ETL processes to extract data from various sources (including sensor, semi-structured and unstructured data, as well as structured data stored in traditional databases, file stores, or from SOAP and REST data interfaces).
- Develop data integration patterns for batch and streaming processes, including implementation of incremental loads.
- Build quick prototypes and proofs of concept to validate assumptions and prove the value of proposed solutions or new cloud-based services.
- Define data engineering standards and develop data ingestion/integration frameworks.
- Participate in code reviews and ensure all solutions are aligned to architectural and requirement specifications.
- Develop and maintain cloud-based infrastructure to support data processing using Azure Data Services (ADF, ADLS, Synapse, Azure SQL DB, Cosmos DB).
- Develop and maintain automated data quality pipelines.
- Collaborate with cross-functional teams to identify opportunities for process improvement.
- Manage a team of Data Engineers.

About You
To be considered for this role it is envisaged you will possess the following attributes:
- Bachelor’s degree in Computer Science or related field.
- 7+ years of experience in big data technologies such as Hadoop, Spark, Hive and Delta Lake.
- 7+ years of experience in cloud computing platforms such as Azure, AWS or GCP.
- Experience in working in cloud data platforms, including a deep understanding of scaled data solutions.
- Experience in working with different data integration patterns (batch and streaming), implementing incremental data loads.
- Proficient in scripting in Java, Windows and PowerShell.
- Proficient in at least one programming language like Python or Scala. Expert in SQL.
- Proficient in working with data services like ADLS, Azure SQL DB, Azure Synapse, Snowflake, NoSQL (e.g. Cosmos DB, MongoDB), Azure Data Factory, Databricks or similar on AWS/GCP.
- Experience in using ETL tools (like Informatica IICS Data Integration) is an advantage.
- Strong understanding of Data Quality principles and experience in implementing them.

Moving forward together
We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. We’re building a diverse, inclusive and respectful workplace. Creating a space where everyone feels they belong, can be themselves, and are heard. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there’s a path for you here. And there’s no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.

Worley takes personal data protection seriously and respects EU and local data protection laws. You can read our full Recruitment Privacy Notice here. Please note: If you are being represented by a recruitment agency you will not be considered; to be considered you will need to apply directly to Worley.
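Among the responsibilities above is extracting data from REST interfaces with incremental loads. A minimal sketch of that pattern in Python follows; the endpoint, query parameters, and response shape are hypothetical, not an actual Worley or client API:

```python
# A minimal sketch of incremental extraction from a REST data interface.
# The endpoint, paging parameters, and response shape are assumptions.
import json
from datetime import datetime, timezone

import requests

BASE_URL = "https://api.example.com/v1/sensor-readings"   # assumed endpoint

def fetch_since(watermark: str) -> list[dict]:
    """Page through records updated after the given ISO-8601 watermark."""
    records, page = [], 1
    while True:
        resp = requests.get(
            BASE_URL,
            params={"updated_after": watermark, "page": page, "page_size": 500},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json()["items"]                      # assumed response shape
        if not batch:
            return records
        records.extend(batch)
        page += 1

if __name__ == "__main__":
    rows = fetch_since("2025-06-01T00:00:00Z")
    # Land the raw batch as JSON lines for downstream structuring/modelling.
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    with open(f"landing_sensor_{stamp}.jsonl", "w") as f:
        for row in rows:
            f.write(json.dumps(row) + "\n")
```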
Posted 14 hours ago
5.0 - 8.0 years
4 - 9 Lacs
Hyderābād
On-site
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Job Description

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.

Do
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends to prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements

- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve the issues, escalate them to TA & SES in a timely manner
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers’ and clients’ business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs

- Build people capability to ensure operational excellence and maintain superior customer service levels of the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities
and maintaining personal networks

Deliver (Performance Parameter: Measure)
1. Process: No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2. Team Management: Productivity, efficiency, absenteeism
3. Capability Development: Triages completed, Technical Test performance

Mandatory Skills: Databricks - Data Engineering. Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 14 hours ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company: Kanerika Inc
Business Type: Startup
Company Type: Product & Service
Business Model: B2B
Funding Stage: Pre-seed
Industry: IT Services

Job Description

Roles and Responsibilities: The Jr. .NET Data Engineer will be responsible for designing and developing scalable backend systems using .NET Core, Web API, and Azure-based data engineering tools like Databricks, MS Fabric, or Snowflake. They will build and maintain data pipelines, optimize SQL/NoSQL databases, and ensure high-performance systems through design patterns and microservices architecture. Strong communication skills and the ability to collaborate with US counterparts in an Agile environment are essential. Experience with Azure DevOps, Angular, and MongoDB is a plus.

Technical Skills
- Strong hands-on experience with C#, SQL Server, OOPS concepts, and microservices architecture.
- At least one year of hands-on experience with .NET Core, ASP.NET Core, Web API, SQL, NoSQL, Entity Framework 6 or above, Azure, database performance tuning, applying design patterns, and Agile.
- .NET back-end development with data engineering expertise.
- Must have experience with Azure Data Engineering and Azure Databricks, with MS Fabric, Snowflake or similar tools as the data platform.
- Skill for writing reusable libraries.
- Excellent communication skills, both oral and written.
- Excellent troubleshooting skills and the ability to communicate clearly with US counterparts.

What we need
Educational Qualification: B.Tech, B.E, MCA, M.Tech.
Work Mode: Must be willing to work from the office (onsite only).

Nice To Have
- Knowledge of Angular, MongoDB, NPM and Azure DevOps build/release configuration.
- Self-starter with solid analytical and problem-solving skills.

This is an experienced-level position, and we train the qualified candidate in the required applications. Willingness to work extra hours to meet deliverables.
Posted 14 hours ago
8.0 years
2 - 8 Lacs
Gurgaon
On-site
Requisition Number: 101352
Architect I - Data
Location: This is a hybrid opportunity in the Delhi-NCR, Bangalore, Hyderabad, or Gurugram area.

Insight at a Glance
- 14,000+ engaged teammates globally, with operations in 25 countries across the globe
- Received 35+ industry and partner awards in the past year
- $9.2 billion in revenue
- #20 on Fortune’s World's Best Workplaces™ list
- #14 on Forbes World's Best Employers in IT – 2023
- #23 on Forbes Best Employers for Women in IT – 2023
- $1.4M+ total charitable contributions in 2023 by Insight globally

Now is the time to bring your expertise to Insight. We are not just a tech company; we are a people-first company. We believe that by unlocking the power of people and technology, we can accelerate transformation and achieve extraordinary results. As a Fortune 500 Solutions Integrator with deep expertise in cloud, data, AI, cybersecurity, and intelligent edge, we guide organisations through complex digital decisions.

About the role
As an Architect I, you will focus on leading our Business Intelligence (BI) and Data Warehousing (DW) initiatives. We will count on you to be involved in designing and implementing end-to-end data pipelines using cloud services and data frameworks. Along the way, you will get to:
- Architect and implement end-to-end data pipelines, data lakes, and warehouses using modern cloud services and architectural patterns.
- Develop and build analytics tools that deliver actionable insights to the business.
- Integrate and manage large, complex data sets to meet strategic business requirements.
- Optimize data processing workflows using frameworks such as PySpark.
- Establish and enforce best practices for data quality, integrity, security, and performance across the entire data ecosystem.
- Collaborate with cross-functional teams to prioritize deliverables and design solutions.
- Develop compelling business cases and return on investment (ROI) analyses to support strategic initiatives.
- Drive process improvements for enhanced data delivery speed and reliability.
- Provide technical leadership, training, and mentorship to team members, promoting a culture of excellence.

What we’re looking for
- 8+ years in Business Intelligence (BI) solution design, with 6+ years specializing in ETL processes and data warehouse architecture.
- 6+ years of hands-on experience with Azure Data services including Azure Data Factory, Azure Databricks, Azure Data Lake Gen2, Azure SQL DB, Synapse, Power BI, and MS Fabric.
- Strong Python and PySpark software engineering proficiency, coupled with a proven track record of building and optimizing big data pipelines, architectures, and datasets.
- Proficiency in transforming, processing, and extracting insights from vast, disparate datasets, and in building robust data pipelines for metadata, dependency, and workload management.
- Familiarity with software development lifecycles/methodologies, particularly Agile.
- Experience with SAP/ERP/Datasphere data modeling is a significant plus.
- Excellent presentation and collaboration skills, capable of creating formal documentation and supporting cross-functional teams in a dynamic environment.
- Strong problem-solving, time management, and organizational abilities.
- Keenness to learn new languages and technologies continually.
- Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or an equivalent field.

What you can expect
We’re legendary for taking care of you and your family, and for helping you engage with your local community.
We want you to enjoy a full, meaningful life and own your career at Insight. Some of our benefits include:
- Freedom to work from another location, even an international destination, for up to 30 consecutive calendar days per year.
- Medical insurance
- Health benefits
- Professional development: learning platform and certificate reimbursement
- Shift allowance

But what really sets us apart are our core values of Hunger, Heart, and Harmony, which guide everything we do, from building relationships with teammates, partners, and clients to making a positive impact in our communities.

Join us today, your ambITious journey starts here. When you apply, please tell us the pronouns you use and any reasonable adjustments you may need during the interview process. At Insight, we celebrate diversity of skills and experience, so even if you don’t feel like your skills are a perfect match, we still want to hear from you! Today's talent leads tomorrow's success.

Learn more about Insight: https://www.linkedin.com/company/insight/

Insight is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, sexual orientation or any other characteristic protected by law.

Insight India Location: Level 16, Tower B, Building No. 14, DLF Cyber City IT/ITES SEZ, Sector 24 & 25A, Gurugram, Haryana 122002, India
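The role above includes building analytics aggregates that feed BI tooling. As a minimal PySpark sketch of that kind of aggregate (the source table, columns, and partitioning choice are illustrative assumptions, not Insight's actual estate):

```python
# A minimal sketch of a PySpark aggregation feeding a BI layer.
# Table and column names are hypothetical; assumes a Spark/lakehouse runtime.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

sales = spark.table("lakehouse.sales_curated")            # assumed curated table

daily_kpis = (
    sales.groupBy(F.to_date("sold_at").alias("sale_date"), "region")
         .agg(
             F.sum("amount").alias("revenue"),
             F.countDistinct("customer_id").alias("active_customers"),
         )
)

# Partition by date so BI queries and incremental refreshes prune efficiently.
(
    daily_kpis.write.mode("overwrite")
              .format("delta")
              .partitionBy("sale_date")
              .saveAsTable("lakehouse.daily_sales_kpis")
)
```

Partitioning by date is one common way to keep BI queries and refreshes cheap; the right partition key depends on the dominant query patterns.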
Posted 14 hours ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role: Senior Manager
Location: Bengaluru

WHAT YOU’LL DO
We’re MiQ, a global programmatic media partner for marketers and agencies. Our people are at the heart of everything we do, so you will be too. No matter the role or the location, we’re all united in the vision to lead the programmatic industry and make it better. As part of the DnA team, you will lead a team of analysts driving the analytics delivery on digital media campaigns for a specific market or region within MiQ. You will be part of the DnA leadership, responsible for defining strategic priorities for the team that help drive revenue growth, market sustainability and account innovation. You will be responsible for team development, operational excellence, building analytics expertise in the team and sharing new learnings and analytics best practices across the business.

- Develop strong commercial awareness, identify opportunities to grow the business and actively take part in market and account objective setting.
- Focus on the growth and retention strategy: conceptualise and propose solutions to address business challenges. As part of the leadership team, enhance the analytics and DS solutions outlook of MiQ’s offering.
- Build and strengthen relationships with commercial leaders and play an influential role in sales, client services, trading, solutions and other teams.
- Lead by example! Be a hands-on leader demonstrating strong business, technical and functional knowledge.
- Work with DnA leadership to identify focus areas and build department-level short/long-term strategy, baking in micro and macro factors.
- Provide analytics and data science project leadership and oversee development, deployment, and adoption of solutions in the specific market and in DnA.
- Provide technical and analytics expertise to the team and continuously bring better ways of analytics and problem solving to the team.
- Play a key stakeholder role for Product & Tech developments and spearhead internal tool adoption within the market and team.
- Set performance standards for the team! Own the OKR creation, development planning, L&D plan, feedback and performance appraisals for the team.
- Drive operational excellence: set up processes and frameworks for effort and cost tracking, and effectively measure the impact of delivered outcomes.
- Ensure effective resource planning for the market by forecasting demand and making data-backed assumptions.
- Develop a culture of feedback and continuous learning within the team. Ensure the team develops an experimental, innovation-focused mindset and finds newer, more efficient ways of doing things.
- Have an innovative and transformation mindset to identify improvement opportunities to optimize processes, decrease costs and increase client/business value.
- Manage team wellbeing and ensure the team is engaged. Active involvement in recruitment, branding and external event participation.

Who are your stakeholders?
As a Senior Manager you are required to work with different stakeholders across the MiQ ecosystem:

Programmatic Traders: DnA collaborates with traders to optimize campaigns. By leveraging our data analysis skills and understanding of the data landscape, we provide insights on audience targeting, ad performance, and bidding strategies. This helps traders make data-driven decisions, optimize their advertising campaigns, and improve overall campaign effectiveness and ROI.

Account Managers: We work closely with account managers to leverage the power of data partnerships.
Through our analysis, we help uncover valuable insights about customer behavior, market trends, and campaign performance. This information allows account managers to create a compelling narrative, enhance engagement with advertisers, and showcase the effectiveness of MiQ's advertising solutions.

Sales Representatives: We help the sales team by creating insights based on key market trends and events. Our analysis helps identify potential opportunities and develop a gripping sales narrative. Additionally, we assist in responding to Requests for Proposals (RFPs) by providing data-driven insights and recommendations that help us increase revenue streams.

Agencies & Clients: Our expertise in data analytics and data science is invaluable for agency and advertiser clients. By providing detailed analysis reports and solutions, we empower them to make informed decisions regarding their marketing strategies. Our insights help clients optimize their advertising budgets, target the right audience, and maximize the effectiveness of their campaigns. Additionally, we promote MiQ's internal solutions and capabilities, showcasing MiQ's unique value proposition in the programmatic landscape.

In summary, as a Senior Manager, you add value by building strong partnerships with leaders in these key teams and collectively building market strategies that foster business growth. You also guide the DnA team to build data-driven insights and recommendations for traders, account managers, sales teams, and agency/advertiser clients that empower MiQ and its stakeholders to reach the right audience with the right content at the right time.

What You’ll Bring
- 10+ years’ industry experience in business analytics or analytics consulting
- Proven leadership and people management experience, including 5+ years developing the careers of 8 or more direct reports
- A Bachelor’s degree in Computer Science, mathematical or statistical sciences, or related quantitative disciplines is required
- Strong analytical acumen and problem-solving abilities to address complex client problems leveraging data
- Expertise in SQL, Excel and PowerPoint
- A high degree of comfort with either R or Python
- Good understanding of statistical concepts
- Knowledge of big data processing tools/frameworks like Qubole/Databricks/Spark, AWS
- Excellent storytelling and visualization skills
- Programmatic media/ad-tech/digital advertising domain knowledge
- Knowledge of Tableau/PowerBI/Google Data Studio
- Ability to thrive in an unstructured environment, working autonomously on a strong team to find opportunity and deliver business impact

We’ve highlighted some key skills, experience and requirements for this role. But please don’t worry if you don’t meet every single one. Our talent team strives to find the best people. They might see something in your background that’s a fit for this role, or another opportunity at MiQ. If you have a passion for the role, please still apply.

What impact will you create?
As a Senior Manager, your role will create value for MiQ in the following ways:

Driving client stickiness: With your analytics expertise you will help our stakeholders make informed, data-driven decisions. By providing accurate and actionable insights, you contribute to improving campaign performance and identifying new opportunities, thereby improving customer stickiness.
Driving profitability: By leveraging the power of data, you are expected to identify areas where we can optimize costs and thereby maintain a competitive edge.

MiQ growth: Staying on top of market trends and developments to suggest strategic measures that can help support MiQ's business and tap into new revenue streams to drive growth.

Supporting key decision making: Your expertise in data analysis and reporting provides decision-makers with the necessary information to make informed choices. You will help guide agencies, advertisers and internal stakeholders in making strategic and tactical decisions that align with MiQ's or the client's objectives.

Analytics best practices: As a Senior Manager for Analytics Excellence, you are expected to introduce analytics and data best practices within the team, helping set up structures and quality frameworks for the team and internal stakeholders.

Developing custom analytics solutions: Leveraging your experience in data science and advanced analytics, you will be expected to provide recommendations on MiQ products and assist in enhancing their consumption within the target market.

What’s in it for you?
Our Center of Excellence is the very heart of MiQ, and it’s where the magic happens. It means everything you do and everything you create will have a huge impact across our entire global business. MiQ is incredibly proud to foster a welcoming culture. We do everything possible to make sure everyone feels valued for what they bring. With global teams committed to diversity, equity, and inclusion, we’re always moving towards becoming an even better place to work.

Values
Our values are so much more than statements. They unite MiQers in every corner of the world. They shape the way we work and the decisions we make. And they inspire us to stay true to ourselves and to aim for better. Our values are there to be embraced by everyone, so that we naturally live and breathe them. Just like inclusivity, our values flow through everything we do, no matter how big or small.
- We do what we love - Passion
- We figure it out - Determination
- We anticipate the unexpected - Agility
- We always unite - Unite
- We dare to be unconventional - Courage

Benefits
Every region and office have specific perks and benefits, but every person joining MiQ can expect:
- A hybrid work environment
- New-hire orientation with job-specific onboarding and training
- Internal and global mobility opportunities
- Competitive healthcare benefits
- Bonus and performance incentives
- Generous annual PTO and paid parental leave, with two additional paid days to acknowledge holidays, cultural events, or inclusion initiatives
- Employee resource groups designed to connect people across all MiQ regions, drive action, and support our communities

Apply today!
Equal Opportunity Employer
Posted 14 hours ago
0 years
5 - 7 Lacs
Pune
Remote
Entity: Finance
Job Family Group: Business Support Group

Job Description:
We are a global energy business involved in every aspect of the energy system. We are working towards delivering light, heat and mobility to millions of people, every day. In India, we operate bp’s FBT, which is a coordinated part of bp. Our people want to play their part in solving the big, sophisticated challenges facing our world today and, guided by our bp values, are working to help meet the world’s need for more energy while lowering carbon emissions. In our offices at Pune, we work in customer service, finance, accounting, procurement, HR services and other enabling functions, providing solutions across all of bp. Would you like to discover how our diverse, hardworking people are owning the way in making energy cleaner and better, and how you can play your part in our outstanding team? Join our team, and develop your career in an encouraging, forward-thinking environment!

Key Accountabilities

Data Quality/Modelling/Design thinking:
- Drawing on SAP MDG/ECC experience, investigate and perform root cause analysis for assigned use cases. Work with Azure Data Lake (via Databricks) using SQL/Python. Identify and build the data model (conceptual and physical) that provides an automated mechanism to supervise ongoing DQ issues. Multiple workshops may be needed to work through the options and identify the one that is most efficient and effective.
- Work with the business (Data Owners/Data Stewards) to profile data for exposing patterns indicating data quality issues, and identify the impact on specific CDEs deemed relevant for each individual business.
- Identify the financial impact of a data quality issue, as well as the business benefit (quantitative/qualitative) from a remediation standpoint, along with leading implementation timelines.
- Schedule regular working groups with businesses that have identified DQ issues and ensure progression towards RCA/remediation or presentation in DGFs.
- Identify business DQ rules on the basis of which critical metrics/measures are stood up and fed into the dashboarding/workflows for BAU monitoring; red flags are raised and investigated.
- An understanding of the Data Quality value chain, starting with Critical Data Element concepts, Data Quality issues, and Data Quality metrics/measures, is needed, along with experience owning and completing Data Quality issue assessments to aid improvements to operational processes and BAU initiatives.
- Highlight risks and hidden DQ issues to the Lead/Manager for further guidance/escalation. Interpersonal skills are significant in this role as it is outward-facing, and the focus has to be on clearly articulating messages.

Dashboarding & Workflow:
- Build and maintain effective analytics and escalation mechanisms which detect poor data and help business lines drive resolution
- Support crafting, building and deployment of data quality dashboards via PowerBI
- Resolve critical issue paths and construct workflows and alerts which advise process and data owners of unresolved data quality issues
- Collaborate with IT and analytics teams to drive innovation (AI, ML, cognitive science etc.)
DQ Improvement Plans:
- Create, embed and drive business ownership of DQ improvement plans
- Work with business functions and projects to create data quality improvement plans
- Set targets for data improvements. Monitor and intervene when sufficient progress is not being made
- Support initiatives which are driving data clean-up of the existing data landscape

Project Delivery:
- Oversee and advise Data Quality Analysts and participate in delivery of data quality activities including profiling, establishing conversion criteria and resolving technical and business DQ issues
- Own and develop relevant data quality work products as part of the DAS data change methodology
- Ensure data quality aspects are delivered as part of Gold and Silver data-related change projects
- Support the creation of cases with insight into the cost of poor data

Crucial Experience and Job Requirements:
- 11-15 total years of experience in Oil & Gas or a Financial Services/Banking industry within the Data Management space
- Experience of working with data models/structures and investigating to design and fine-tune them
- Experience of Data Quality Management, i.e. governance, DQI management (root cause analysis, remediation/solution identification), governance forums (papers production, quorum maintenance, minutes publication), CDE identification, and data lineage (identification of authoritative data sources) preferred; an understanding of important metrics/measures is needed as well
- Experience of having worked with senior partners in multiple data domains/business areas, CDO and Technology; ability to operate in global teams within multiple time zones
- Ability to operate in a multifaceted and changing setup, identify priorities, and operate independently without much direction

Desirable criteria
- SAP MDG/SAP ECC experience (T-codes, table structures etc.)
- Azure Data Lake/AWS/Databricks
- Crafting dashboards and workflows (PowerBI, QlikView or Tableau etc.)
- Crafting analytics and insight in a DQ setting (PowerBI/Power Query)
- Profiling and analysis skills (SAP DI, Informatica or Collibra)
- Persuading, influencing and communicating at a senior management level
- Certification in Data Management, Data Science, Python/R desirable

Travel Requirement: No travel is expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
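The accountabilities above revolve around profiling critical data elements and trending DQ measures on dashboards. Below is a minimal sketch of a null-rate profile, assuming a Databricks/Spark session; the SAP-sourced table name and CDE list are hypothetical:

```python
# A minimal sketch of column profiling for data-quality monitoring:
# null rates per critical data element, persisted for dashboard trending.
# Table, column, and target names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.table("mdg.material_master")                   # assumed SAP-sourced table
cdes = ["material_number", "material_type", "base_unit"]  # assumed CDEs

# One global aggregation producing a null-rate column per CDE.
profile = df.select(
    [
        (F.sum(F.col(c).isNull().cast("int")) / F.count(F.lit(1))).alias(f"{c}_null_rate")
        for c in cdes
    ]
)

# Append a timestamped row so a PowerBI dashboard can trend the DQ measures.
profile.withColumn("profiled_at", F.current_timestamp()) \
       .write.mode("append").saveAsTable("dq.material_master_profile")
```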
Posted 14 hours ago
0 years
5 - 7 Lacs
Pune
Remote
Entity: Finance Job Family Group: Business Support Group Job Description: We are a global energy business involved in every aspect of the energy system. We are working towards delivering light, heat and mobility to millions of people, every day. In India, we operate bp’s FBT, which is a coordinated part of bp. Our people want to play their part in solving the big, sophisticated challenges facing our world today and, guided by our bp values, are working to help meet the world’s need for more energy while lowering carbon emissions. In our offices at Pune, we work in customer service, finance, accounting, procurement, HR services and other enabling functions – providing solutions across all bp. Would you like to discover how our diverse, hardworking people are owning the way in making energy cleaner and better – and how you can play your part in our outstanding team? Join our team, and develop your career in an encouraging, forward-thinking environment! Key Accountabilities Data Quality/Modelling/Design thinking: Demonstrating SAP MDG/ECCs experience the candidate is able to investigate to do root cause analysis for assigned use cases. Also able to work with Azure data lake (via Databricks) using SQL/Python. Data Model (Conceptual and Physical) will be needed to be identified and built that provides automated mechanism to supervise on going DQ issues. Multiple workshops may also be needed to work through various options and identifying the one that is most efficient and effective Works with business (Data Owners/Data Stewards) to profile data for exposing patterns indicating data quality issues. Also is able to identify impact to specific CDEs deemed relevant for each individual business. Identifies financial impacts of Data Quality Issue. Also is able to identify business benefit (quantitative/qualitative) from a remediation standpoint along with leading implementation timelines. Schedules regular working groups with business that have identified DQ issues and ensures progression for RCA/Remediation or for presenting in DGFs Identifies business DQ rules basis which critical metrics/Measures are stood up that champion into the dashboarding/workflows for BAU monitoring. Red flags are raised and investigated Understanding of Data Quality value chain, starting with Critical Data Element concepts, Data Quality Issues, Data Quality important metrics/Measures is needed. Also has experience owing and completing Data Quality Issue assessments to aid improvements to operational process and BAU initiatives Highlights risk/hidden DQ issues to Lead/Manager for further mentorship/customer concern. Interpersonal skills are significant in this role as this is outward facing and focus has to be on clearly articulation messages. Dashboarding & Workflow: Builds and maintains effective analytics and partner concern mechanisms which detect poor data and help business lines drive resolution Support crafting, building and deployment of data quality dashboards via PowerBI Resolves critical issue paths and constructs workflow and alerts which advise process and data owners of unresolved data quality issues Collaborates with IT & analytics teams to drive innovation (AI, ML, cognitive science etc.) 
DQ Improvement Plans: Creates, embeds and drives business ownership of DQ improvement plans. Works with business functions and projects to create data quality improvement plans, sets targets for data improvements, and monitors and intervenes when sufficient progress is not being made. Supports initiatives which are driving data clean-up of the existing data landscape.
Project Delivery: Oversees and advises Data Quality Analysts and participates in delivery of data quality activities, including profiling, establishing conversion criteria and resolving technical and business DQ issues. Owns and develops relevant data quality work products as part of the DAS data change methodology. Ensures data quality aspects are delivered as part of Gold and Silver data-related change projects. Supports the creation of business cases with insight into the cost of poor data.
Essential Experience and Job Requirements:
11-15 years of total experience in the Data Management space within the Oil & Gas or Financial Services/Banking industry
Experience working with data models/structures and investigating them in order to design and fine-tune them
Experience of Data Quality Management preferred, i.e. governance, DQI management (root cause analysis, remediation/solution identification), governance forums (paper production, quorum maintenance, minutes publication), CDE identification and data lineage (identification of authoritative data sources); an understanding of the relevant metrics/measures is also needed
Experience of having worked with senior stakeholders across multiple data domains/business areas, the CDO and Technology
Ability to operate in global teams across multiple time zones
Ability to operate in a multifaceted and changing setup, identify priorities, and operate independently without much direction
Desirable criteria:
SAP MDG/SAP ECC experience (T-codes, table structures, etc.)
Azure Data Lake/AWS/Databricks
Crafting dashboards and workflows (Power BI, QlikView, Tableau, etc.)
Crafting analytics and insight in a DQ setting (Power BI/Power Query)
Profiling and analysis skills (SAP DI, Informatica or Collibra)
Persuading, influencing and communicating at a senior management level
Certification in Data Management, Data Science or Python/R desirable
Travel Requirement: No travel is expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.
Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
Posted 14 hours ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We deliver the world’s most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied and challenging role. Building on our past. Ready for the future.
Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we’re bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals, and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.
The Role
As a Digital Solutions Consultant with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience. We are looking for a skilled Data Engineer to join our Digital Customer Solutions team. The ideal candidate should have experience in cloud computing and big data technologies. As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data solutions that can handle large volumes of data. You will work closely with stakeholders to ensure that the data is accurate, reliable, and easily accessible.
Responsibilities
Design, build, and maintain scalable data pipelines that can handle large volumes of data.
Document the design of proposed solutions, including structuring data (data modelling applying different techniques, including 3NF and dimensional modelling) and optimising data for further consumption (working closely with Data Visualization Engineers, Front-end Developers, Data Scientists and ML Engineers).
Develop and maintain ETL processes to extract data from various sources (including sensor, semi-structured and unstructured data, as well as structured data stored in traditional databases, file stores, or SOAP and REST data interfaces).
Develop data integration patterns for batch and streaming processes, including implementation of incremental loads (a PySpark sketch appears at the end of this posting).
Build quick prototypes and proofs-of-concept to validate assumptions and prove the value of proposed solutions or new cloud-based services.
Define data engineering standards and develop data ingestion/integration frameworks.
Participate in code reviews and ensure all solutions are aligned to architectural and requirement specifications.
Develop and maintain cloud-based infrastructure to support data processing using Azure Data Services (ADF, ADLS, Synapse, Azure SQL DB, Cosmos DB).
Develop and maintain automated data quality pipelines.
Collaborate with cross-functional teams to identify opportunities for process improvement.
Manage a team of Data Engineers.
About You
To be considered for this role it is envisaged you will possess the following attributes:
Bachelor’s degree in Computer Science or a related field.
7+ years of experience in big data technologies such as Hadoop, Spark, Hive and Delta Lake.
7+ years of experience in cloud computing platforms such as Azure, AWS or GCP.
Experience working in cloud data platforms, including a deep understanding of scaled data solutions.
Experience working with different data integration patterns (batch and streaming) and implementing incremental data loads.
Proficient in scripting in Java and Windows PowerShell.
Proficient in at least one programming language like Python or Scala. Expert in SQL.
Proficient in working with data services like ADLS, Azure SQL DB, Azure Synapse, Snowflake, NoSQL stores (e.g. Cosmos DB, MongoDB), Azure Data Factory, Databricks or similar on AWS/GCP.
Experience in using ETL tools (like Informatica IICS Data Integration) is an advantage.
Strong understanding of Data Quality principles and experience in implementing them.
Moving forward together
We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. We’re building a diverse, inclusive and respectful workplace. Creating a space where everyone feels they belong, can be themselves, and are heard. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there’s a path for you here. And there’s no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.
Worley takes personal data protection seriously and respects EU and local data protection laws. You can read our full Recruitment Privacy Notice here. Please note: if you are being represented by a recruitment agency you will not be considered; to be considered you will need to apply directly to Worley.
Company: Worley
Primary Location: IND-AP-Hyderabad
Job: Digital Solutions
Schedule: Full-time
Employment Type: Agency Contractor
Job Level: Experienced
Job Posting: Jun 16, 2025
Unposting Date: Jul 16, 2025
Reporting Manager Title: Senior General Manager
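The incremental-load pattern called out in the responsibilities might look roughly like this in PySpark; the control table, JDBC source and lake path are invented for illustration:

```python
# Hypothetical watermark-based incremental load (PySpark); all names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# 1. Read the last successful watermark from a small control table.
wm = (spark.table("etl_control.watermarks")
      .filter("source = 'orders'")
      .agg(F.max("last_loaded_ts"))
      .first()[0])

# 2. Pull only rows changed since the watermark from the source system.
incoming = (spark.read
            .format("jdbc")
            .option("url", "jdbc:sqlserver://source-db;databaseName=sales")  # placeholder
            .option("dbtable", "dbo.orders")
            .option("user", "etl_user").option("password", "***")
            .load()
            .filter(F.col("modified_ts") > F.lit(wm)))

# 3. Append to the lake zone; a MERGE would be used where updates must be applied.
incoming.write.mode("append").partitionBy("order_date").parquet("abfss://lake/bronze/orders")

# 4. Advance the watermark in the control table only after a successful write.
```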
Posted 14 hours ago
3.0 - 4.0 years
0 - 0 Lacs
Chennai
On-site
Company Description
Founded in 2014 and headquartered in Perungudi, Chennai, SkyRich Tech Solutions is a technology-driven company empowering manufacturing ecosystems. We assist enterprises across various industries, including Automotive, Manufacturing, Energy, Logistics, and Supply Chain, in transitioning to Industry 4.0. Our expertise spans AI-powered automation, MES, IIoT, Edge Computing, and real-time analytics, providing end-to-end visibility, actionable insights, and process optimization. As an official partner of Databricks, we deliver data-driven solutions that enhance operational efficiency and intelligence.
Role Description
This is a full-time on-site role for an Odoo Developer located in Chennai. The Odoo Developer will be responsible for developing, customizing, and deploying Odoo ERP systems. Day-to-day tasks include back-end development, module development, integrating Odoo with other systems, and optimizing performance. The role involves collaborating with cross-functional teams to gather requirements and implement solutions that align with business goals. (A minimal example of a custom Odoo model appears below.)
Generic Requirements
Develop and customize Odoo ERP modules to meet the specific needs of the company
Write clean, maintainable, and efficient code
Collaborate with other developers, project managers, and stakeholders to identify and develop new features
Troubleshoot and debug issues in existing modules
Work with the team to continuously improve software development processes and practices
Write technical documentation for the modules developed
Development Requirements
3-4 years of experience in Odoo development and customization
Strong knowledge of the Python programming language and related frameworks and libraries
Experience with web development (JavaScript, HTML, CSS)
Experience with PostgreSQL
Experience with Linux server administration
Strong analytical and problem-solving skills
Excellent communication skills and ability to work in a team environment
Experience with Git and version control: push and pull, and branch maintenance
Experience with front-end development using JavaScript frameworks such as React, Vue.js or Angular
Should know the data migration methods in Odoo
Should be well aware of the latest Odoo 18 JSON “action” changes and automations
Server knowledge of odoo.sh and Odoo installation; Odoo installation on other servers such as AWS is an added advantage
Project Competencies
Knowledge of agile software development methodologies
Familiarity with other ERP systems and frameworks; ERPNext knowledge will be considered a plus
Experience with Odoo 15 through the latest version
Knowledge of business processes such as accounting, inventory, and sales; on the technical side, an understanding of the model overview
Understanding of business processes and workflows
Ability to work in a fast-paced environment and manage multiple tasks
Job Types: Full-time, Permanent
Pay: ₹18,000.00 - ₹50,000.00 per month
Location Type: In-person
Schedule: Monday to Friday
Work Location: In person
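A minimal sketch of the custom Odoo module development this role centres on; the model and field names are invented, assuming a standard Odoo addon layout:

```python
# Hypothetical minimal Odoo model (e.g., models/machine_order.py in a custom addon).
from odoo import api, fields, models


class MachineOrder(models.Model):
    _name = "skyrich.machine.order"   # illustrative model name, not from the posting
    _description = "Machine Order"

    name = fields.Char(required=True)
    quantity = fields.Integer(default=1)
    unit_price = fields.Float()
    total = fields.Float(compute="_compute_total", store=True)

    @api.depends("quantity", "unit_price")
    def _compute_total(self):
        # Odoo compute methods iterate over the recordset, one record at a time.
        for order in self:
            order.total = order.quantity * order.unit_price
```

The model would still need the usual addon plumbing (`__manifest__.py`, security rules, views) before it could be installed.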
Posted 14 hours ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Our Client: Distinguished Founders | Team Culture: Transparent, "No-Heroes", Shared Ownership, Continuous Learning | Unique Serial Entrepreneurs | Multiple High-Value, World-Famous Exits
The Role: You will lead the development of a high-scale AI Prediction Platform powering critical decisions. You will lead engineering for a data-intensive product, owning architecture, team growth, and platform scalability.
What You’ll Own
End-to-end development of the AI Prediction Platform: architecture, code quality, performance, and system integration.
Direct management of a 10–15 member engineering team (scaling to ~20). Set direction, grow leaders, and foster a high-performance culture rooted in shared ownership and transparency.
Translate business priorities into robust technical execution across product, design, and data functions (North America + India).
Serve as the technical face of engineering internally and externally, owning escalations, technical positioning, and stakeholder trust.
Technical Scope
Tech Stack: React (TypeScript), FastAPI, Python, Databricks, Dagster, Terraform, AWS, dltHub, Nixtla, LangChain/LangGraph.
Tools & Standards: Jest, Playwright, Pytest, Azure DevOps, Docker, Checkov, SonarCloud.
Deep experience with full-stack engineering, distributed systems, and scalable data pipelines is essential. Hands-on background with modern SaaS architecture, TDD, and infrastructure as code.
What We’re Looking For
10+ years of engineering experience with 5+ years leading engineering teams or teams-of-teams.
Proven success building complex B2B or enterprise SaaS products at scale.
Strong recent hands-on experience (Python, SQL, React, etc.) with architectural and production ownership.
Experience managing and growing distributed engineering teams.
Deep understanding of system design, DevOps culture, and AI/ML-enabled platforms.
Strong cross-functional leadership with clarity in communication and execution alignment.
Write to sanish@careerxperts.com to get connected!
Posted 15 hours ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About American Airlines: To Care for People on Life's Journey®. Together with our American Eagle regional partners, we offer thousands of flights daily to more than 350 destinations in more than 60 countries. American Airlines is transforming the way it delivers technology to its customers and team members worldwide. American’s Tech Hub in Hyderabad, India, is our latest technology office location and home to team members who drive technical innovation and engineer unrivalled digital products to best serve American’s customers and team members. With U.S. tech hubs in Dallas-Fort Worth, Texas and Phoenix, Arizona, our new team in Hyderabad, India enables better support of our 24/7 operation and positions American to deliver industry-leading technology solutions that create a world-class customer experience.
Cloud Engineering
What you'll do: This list is intended to reflect the current job, but there may be additional essential functions (and certainly non-essential job functions) that are not referenced. Management will modify the job or require other tasks be performed whenever it is deemed appropriate to do so, observing, of course, any legal obligations, including any collective bargaining obligations.
Be a part of the Business Intelligence Platform team and ensure all our systems are up and running and performing optimally: Cognos, Power BI, Tableau, Alteryx and Grafana.
Support automation of platform infrastructure processes using PowerShell, Python and other tools to help platform stability and scalability (a small health-check sketch appears below).
Perform troubleshooting of platform-related issues and other complex issues with cloud BI solutions: Windows and Linux servers, IIS, application gateways, firewalls and networks, complex SQL, etc.
Perform multiple aspects of the development lifecycle: design, cloud engineering (infrastructure, network, security and administration), data modeling, testing, performance tuning, deployments, consumption, BI, alerting and production support.
Provide technical leadership and collaborate within a team environment as well as work independently.
Be a part of a DevOps team that completely owns and supports their product.
Lead development of coding standards, best practices, and privacy and security guidelines.
Ensure the systems are security-compliant and patched per Cybersecurity guidelines.
All you'll need for success:
Minimum Qualifications - Education & Prior Job Experience:
Bachelor’s degree in Computer Science, Computer Engineering, Technology, Information Systems (CIS/MIS), Engineering or a related technical discipline, or equivalent experience/training.
3 years of business intelligence development using agile and DevOps, operating in a product model that includes designing, developing, and implementing large-scale applications or data engineering solutions.
3 years of data analytics experience using SQL.
2 years of cloud development and data lake experience (Microsoft Azure preferred), including Azure EventHub, Azure Data Factory, Azure Databricks, Azure DevOps, Azure Blob Storage, Azure Data Lake, Azure Power Apps and Power BI.
A combination of development, administration and support experience in several of the following tools/platforms is required:
Scripting: Python, SQL, PowerShell
Basic Azure infrastructure experience: servers, networking, firewalls, storage accounts, app gateways, etc.
CI/CD: GitHub, Azure DevOps, Terraform
BI analytics tool administration on any one of the platforms: Cognos, Tableau, Power BI, Alteryx
Preferred Qualifications - Education & Prior Job Experience:
3+ years of data analytics experience, specifically in business intelligence development, requirements gathering and training end users.
3+ years administering data platforms (Tableau, Cognos or Power BI) at scale.
3+ years of analytics solution development using agile and DevOps in a product model that includes designing, developing, and implementing large-scale applications or data engineering solutions.
Airline industry experience.
Skills, Licenses & Certifications:
Certification in administration of any of the BI tools.
Expertise with the Azure technology stack for data management: data ingestion, capture, processing, curation and creating consumption layers.
Expertise in providing practical direction within Azure-native cloud services.
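A small, hypothetical example of the platform-automation scripting mentioned above: a Python availability sweep over BI service endpoints. The URLs are placeholders, and a real check would feed Grafana or an alerting channel rather than print:

```python
# Hypothetical availability check for BI platform endpoints; URLs are placeholders.
import requests

ENDPOINTS = {
    "Power BI gateway": "https://pbi-gw.example.internal/health",
    "Tableau": "https://tableau.example.internal/api/3.19/serverinfo",
    "Grafana": "https://grafana.example.internal/api/health",
}

def check(name: str, url: str) -> None:
    try:
        resp = requests.get(url, timeout=10)
        status = "OK" if resp.ok else f"HTTP {resp.status_code}"
    except requests.RequestException as exc:
        status = f"UNREACHABLE ({exc.__class__.__name__})"
    print(f"{name:18} {status}")

if __name__ == "__main__":
    for name, url in ENDPOINTS.items():
        check(name, url)
```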
Posted 15 hours ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
The SAS to Databricks Migration Developer will be responsible for migrating existing SAS code, data processes, and workflows to the Databricks platform. This role requires expertise in both SAS and Databricks, with a focus on converting SAS logic into scalable PySpark and Python code (a small translation sketch appears below). The developer will design, implement, and optimize data pipelines, ensuring seamless integration and functionality within the Databricks environment. Collaboration with various teams is essential to understand data requirements and deliver solutions that meet business needs.
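To make the migration task concrete, here is a hedged sketch of how a simple SAS DATA step might translate to PySpark; the dataset and business logic are invented for illustration:

```python
# SAS original (illustrative):
#   data work.high_value;
#     set raw.claims;
#     where amount > 1000;
#     payout = amount * 0.8;
#   run;
#
# Rough PySpark equivalent:
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

high_value = (spark.table("raw.claims")                        # SET raw.claims
              .filter(F.col("amount") > 1000)                  # WHERE clause
              .withColumn("payout", F.col("amount") * 0.8))    # assignment statement

# Assumes a "work" schema exists in the target metastore.
high_value.write.mode("overwrite").saveAsTable("work.high_value")
```

Real migrations also have to handle SAS macros, BY-group processing and PROC steps, which rarely map one-to-one like this.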
Posted 15 hours ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role: ETL Test Analyst with Databricks
Job Location: Bangalore, Bhubaneswar
Experience Requirement: 4+ years
Technical Skill Set: ETL Testing, Strong SQL, Databricks
Must have:
Good experience in developing SQL queries for testing complex ETL transformations (a reconciliation sketch appears below)
Proficiency in ETL testing, data validation and data warehouse testing
Proficiency in writing and executing SQL queries for data validation and testing
Good knowledge of data warehousing QA concepts
Hands-on experience with Databricks
Thorough understanding of Databricks-based ETL workflows, including data ingestion, transformation, and validation
Knowledge of Test Strategy, Test Plan and Test Case creation, STLC and the bug life cycle
Experience with Agile and QA tools
Secondary skills (good to have):
Knowledge of PySpark
Basic understanding of DAX code in Power BI
Proficient knowledge of Azure DevOps, SDLC, STLC, DevOps and CI/CD processes
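One common shape of the SQL-based ETL validation described above is source-versus-target reconciliation. A minimal sketch, assuming a Databricks notebook and hypothetical staging/warehouse tables:

```python
# Hypothetical source-vs-target reconciliation run from a Databricks notebook.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

checks = {
    "row_count": """
        SELECT (SELECT COUNT(*) FROM stg.orders)      AS source_val,
               (SELECT COUNT(*) FROM dw.fact_orders)  AS target_val
    """,
    "amount_total": """
        SELECT (SELECT ROUND(SUM(amount), 2)       FROM stg.orders)      AS source_val,
               (SELECT ROUND(SUM(order_amount), 2) FROM dw.fact_orders)  AS target_val
    """,
}

for name, sql in checks.items():
    row = spark.sql(sql).first()
    verdict = "PASS" if row["source_val"] == row["target_val"] else "FAIL"
    print(f"{name:14} {verdict}  {row['source_val']} vs {row['target_val']}")
```

In practice each check would map to a test case and the results would be logged to the QA tool rather than printed.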
Posted 15 hours ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Overview: C5i
C5i is a pure-play AI & Analytics provider that combines the power of human perspective with AI technology to deliver trustworthy intelligence. The company drives value through a comprehensive solution set, integrating multifunctional teams that have technical and business domain expertise with a robust suite of products, solutions, and accelerators tailored for various horizontal and industry-specific use cases. At the core, C5i’s focus is to deliver business impact at speed and scale by driving adoption of AI-assisted decision-making. C5i caters to some of the world’s largest enterprises, including many Fortune 500 companies. The company’s clients span Technology, Media, and Telecom (TMT), Pharma & Lifesciences, CPG, Retail, Banking, and other sectors. C5i has been recognized by leading industry analysts like Gartner and Forrester for its Analytics and AI capabilities and proprietary AI-based platforms.
Global offices: United States | Canada | United Kingdom | United Arab Emirates | India
Job Summary
We are looking for experienced Data Modelers to support large-scale data engineering and analytics initiatives. The role involves developing logical and physical data models, working closely with business and engineering teams to define data requirements, and ensuring alignment with enterprise standards.
• Independently complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, Spark, Databricks Delta Lakehouse or other cloud data warehousing technologies.
• Govern data design/modelling: documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
• Develop a deep understanding of business domains like Customer, Sales, Finance and Supplier, and of the enterprise technology inventory, to craft a solution roadmap that achieves business objectives and maximizes reuse.
• Drive collaborative reviews of data model design, code, data and security features to drive data product development.
• Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; SAP data models.
• Develop reusable data models based on cloud-centric, code-first approaches to data management and data mapping.
• Partner with the data stewards team for data discovery and action by business customers and stakeholders.
• Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
• Assist with data planning, sourcing, collection, profiling, and transformation.
• Support data lineage and mapping of source system data to canonical data stores.
• Create Source-to-Target Mappings (STTM) for ETL and BI developers.
Skills needed:
• Expertise in data modelling tools (ER/Studio, Erwin, IDM/ARDM models) and in CPG/Manufacturing/Sales/Finance/Supplier/Customer domains.
• Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake.
• Experience with version control systems like GitHub and deployment & CI tools.
• Experience with metadata management, data lineage, and data glossaries is a plus.
• Working knowledge of agile development, including DevOps and DataOps concepts.
• Working knowledge of SAP data models, particularly in the context of HANA and S/4HANA, and retail data such as IRI and Nielsen.
C5i is proud to be an equal opportunity employer.
We are committed to equal employment opportunity regardless of race, color, religion, sex, sexual orientation, age, marital status, disability, gender identity, etc. If you have a disability or special need that requires accommodation, please let us know at any stage of the hiring process so that we can make the necessary accommodations.
Posted 15 hours ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are looking for a Senior DevOps Engineer to join our Life Sciences & Healthcare DevOps team. This is an exciting opportunity to work on cutting-edge Life Sciences and Healthcare products in a DevOps environment. If you love coding in Python or any scripting language, have experience with Linux, and ideally have worked in a cloud environment, we’d love to hear from you! We specialize in container orchestration, Terraform, Datadog, Jenkins, Databricks, and various AWS services. If you have experience in these areas, we’d be eager to connect with you.
About You – Experience, Education, Skills, And Accomplishments
At least 7+ years of professional software development experience and 5+ years as a DevOps Engineer or a similar skill set, with experience on various CI/CD and configuration management tools, e.g., Jenkins, Maven, Gradle, Spinnaker, Docker, Packer, Ansible, CloudFormation, Terraform, or similar CI/CD orchestrator tools.
At least 3+ years of AWS experience managing resources in some subset of the following services: S3, ECS, RDS, EC2, IAM, OpenSearch Service, Route53, VPC, CloudFront, Glue and Lambda.
5+ years of experience with Bash/Python scripting.
Broad knowledge of operating system administration, programming languages, cloud platform deployment, and networking protocols.
Be on-call as needed for critical production issues.
Good understanding of SDLC, patching, releases, and basic systems administration activities.
It would be great if you also had AWS Solutions Architect certification and Python programming experience.
What will you be doing in this role?
Design, develop and maintain the product's cloud infrastructure architecture, including microservices, as well as developing infrastructure-as-code and automated scripts meant for building or deploying workloads in various environments through CI/CD pipelines.
Collaborate with the rest of the Technology engineering team, the cloud operations team and application teams to provide end-to-end infrastructure setup.
Design and deploy secure, resilient, and scalable Infrastructure as Code per our developer requirements while upholding the InfoSec and Infrastructure guardrails through code.
Keep up with industry best practices, trends, and standards; identify automation opportunities and design and develop automation solutions that improve operations, efficiency, security, and visibility (a small ECS health-sweep sketch appears after this posting).
Take ownership of and accountability for the performance, availability, security, and reliability of the product(s) running across public cloud and multiple regions worldwide.
Document solutions and maintain technical specifications.
Product you will be developing
The products rely on container orchestration (AWS ECS, EKS), Jenkins, various AWS services (such as OpenSearch, S3, IAM, EC2, RDS, VPC, Route53, Lambda, CloudFront), Databricks, Datadog and Terraform, and you will be working to support the development team building them.
About The Team
The Life Sciences & HealthCare Content DevOps team mainly focuses on DevOps operations on production infrastructure related to Life Sciences & HealthCare Content products. Our team consists of five members and reports to the DevOps Manager. As a team, we provide DevOps support for 40+ different application products internal to Clarivate, which are sources for customer-facing products. The team is also responsible for the change process on the production environment, incident management and monitoring, and handles customer-raised/internal user service requests.
Hours of Work
Shift timing: 12 PM to 9 PM.
On-call support during non-business hours must be provided weekly, based on team bandwidth. At Clarivate, we are committed to providing equal employment opportunities for all qualified persons with respect to hiring, compensation, promotion, training, and other terms, conditions, and privileges of employment. We comply with applicable laws and regulations governing non-discrimination in all locations.
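As a flavour of the Python-plus-AWS work this team does, here is a small boto3 sketch that flags ECS services whose running task count has drifted below the desired count; the region is a placeholder and pagination is simplified for brevity:

```python
# Hypothetical ECS health sweep (boto3); cluster/service names come from the account.
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")  # placeholder region

for cluster_arn in ecs.list_clusters()["clusterArns"]:
    service_arns = ecs.list_services(cluster=cluster_arn)["serviceArns"]
    if not service_arns:
        continue
    # describe_services accepts at most 10 services per call.
    described = ecs.describe_services(cluster=cluster_arn, services=service_arns[:10])
    for svc in described["services"]:
        if svc["runningCount"] < svc["desiredCount"]:
            print(f"DEGRADED {svc['serviceName']}: "
                  f"{svc['runningCount']}/{svc['desiredCount']} tasks running")
```

A production version would paginate fully and push findings to Datadog or an alerting channel instead of printing.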
Posted 15 hours ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
Your Role And Responsibilities
As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark Framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.
Responsibilities
Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases.
Process data with Spark, Python, PySpark and Hive, and HBase or other NoSQL databases, on the Azure Cloud Data Platform or HDFS.
Experienced in developing efficient software code for multiple use cases leveraging the Spark Framework with Python or Scala and Big Data technologies, for various use cases built on the platform.
Experience in developing streaming pipelines (a Structured Streaming sketch appears below).
Experience working with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka and cloud computing services.
Preferred Education
Master's Degree
Required Technical And Professional Expertise
A total of 6-7+ years of experience in Data Management (DW, DL, Data Platform, Lakehouse) and Data Engineering skills.
Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala.
Minimum 3 years of experience on cloud data platforms on Azure.
Experience with Databricks, Azure HDInsight, Azure Data Factory, Synapse and SQL Server DB.
Good to excellent SQL skills.
Preferred Technical And Professional Experience
Certification in Azure and Databricks, or Cloudera Spark Certified developers.
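A hedged sketch of the streaming-pipeline work mentioned above, using Spark Structured Streaming to land Kafka events in a raw zone; the broker, topic and paths are placeholders:

```python
# Hypothetical Kafka -> lake streaming ingestion with Spark Structured Streaming.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
          .option("subscribe", "sensor-events")               # placeholder topic
          .load()
          .select(F.col("key").cast("string"),
                  F.col("value").cast("string"),
                  "timestamp"))

query = (events.writeStream
         .format("parquet")
         .option("path", "/mnt/datalake/raw/sensor_events")
         .option("checkpointLocation", "/mnt/datalake/_chk/sensor_events")
         .outputMode("append")
         .start())
```

The checkpoint location is what gives the pipeline exactly-once file output and restart safety.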
Posted 15 hours ago
4.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
Your Role And Responsibilities
As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark Framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.
Responsibilities
Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases.
Process data with Spark, Python, PySpark and Hive, and HBase or other NoSQL databases, on the Azure Cloud Data Platform or HDFS.
Experienced in developing efficient software code for multiple use cases leveraging the Spark Framework with Python or Scala and Big Data technologies, for various use cases built on the platform.
Experience in developing streaming pipelines.
Experience working with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka and cloud computing services.
Preferred Education
Master's Degree
Required Technical And Professional Expertise
Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala.
Minimum 3 years of experience on cloud data platforms on Azure.
Experience with Databricks, Azure HDInsight, Azure Data Factory, Synapse and SQL Server DB.
Good to excellent SQL skills.
Exposure to streaming solutions and message brokers such as Kafka.
Preferred Technical And Professional Experience
Certification in Azure and Databricks, or Cloudera Spark Certified developers.
Posted 15 hours ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Requisition Number: 101352
Architect I - Data
Location: This is a hybrid opportunity in the Delhi-NCR, Bangalore, Hyderabad or Gurugram area.
Insight at a Glance
14,000+ engaged teammates globally, with operations in 25 countries across the globe.
Received 35+ industry and partner awards in the past year.
$9.2 billion in revenue.
#20 on Fortune’s World's Best Workplaces™ list.
#14 on Forbes World's Best Employers in IT (2023).
#23 on Forbes Best Employers for Women in IT (2023).
$1.4M+ total charitable contributions in 2023 by Insight globally.
Now is the time to bring your expertise to Insight. We are not just a tech company; we are a people-first company. We believe that by unlocking the power of people and technology, we can accelerate transformation and achieve extraordinary results. As a Fortune 500 Solutions Integrator with deep expertise in cloud, data, AI, cybersecurity, and intelligent edge, we guide organisations through complex digital decisions.
About The Role
As an Architect I, you will focus on leading our Business Intelligence (BI) and Data Warehousing (DW) initiatives. We will count on you to be involved in designing and implementing end-to-end data pipelines using cloud services and data frameworks. Along the way, you will get to:
Architect and implement end-to-end data pipelines, data lakes, and warehouses using modern cloud services and architectural patterns.
Develop and build analytics tools that deliver actionable insights to the business.
Integrate and manage large, complex data sets to meet strategic business requirements.
Optimize data processing workflows using frameworks such as PySpark.
Establish and enforce best practices for data quality, integrity, security, and performance across the entire data ecosystem.
Collaborate with cross-functional teams to prioritize deliverables and design solutions.
Develop compelling business cases and return on investment (ROI) analyses to support strategic initiatives.
Drive process improvements for enhanced data delivery speed and reliability.
Provide technical leadership, training, and mentorship to team members, promoting a culture of excellence.
What We’re Looking For
8+ years in Business Intelligence (BI) solution design, with 6+ years specializing in ETL processes and data warehouse architecture.
6+ years of hands-on experience with Azure Data services, including Azure Data Factory, Azure Databricks, Azure Data Lake Gen2, Azure SQL DB, Synapse, Power BI, and MS Fabric.
Strong Python and PySpark software engineering proficiency, coupled with a proven track record of building and optimizing big data pipelines, architectures, and datasets (a short Delta Lake sketch appears after this posting).
Proficient in transforming, processing, and extracting insights from vast, disparate datasets, and building robust data pipelines for metadata, dependency, and workload management.
Familiarity with software development lifecycles/methodologies, particularly Agile.
Experience with SAP/ERP/Datasphere data modeling is a significant plus.
Excellent presentation and collaboration skills, capable of creating formal documentation and supporting cross-functional teams in a dynamic environment.
Strong problem-solving, time management, and organizational abilities.
Keen to learn new languages and technologies continually.
Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or an equivalent field.
What You Can Expect
We’re legendary for taking care of you and your family, and for helping you engage with your local community.
We want you to enjoy a full, meaningful life and own your career at Insight. Some of our benefits include:
Freedom to work from another location (even an international destination) for up to 30 consecutive calendar days per year.
Medical insurance and health benefits.
Professional development: learning platform and certificate reimbursement.
Shift allowance.
But what really sets us apart are our core values of Hunger, Heart, and Harmony, which guide everything we do, from building relationships with teammates, partners, and clients to making a positive impact in our communities.
Join us today, your ambITious journey starts here.
When you apply, please tell us the pronouns you use and any reasonable adjustments you may need during the interview process.
At Insight, we celebrate diversity of skills and experience, so even if you don’t feel like your skills are a perfect match, we still want to hear from you!
Today's Talent Leads Tomorrow's Success.
Learn More About Insight: https://www.linkedin.com/company/insight/
Insight is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, sexual orientation or any other characteristic protected by law.
Insight India Location: Level 16, Tower B, Building No. 14, DLF Cyber City IT/ITES SEZ, Sector 24 & 25A, Gurugram, Haryana 122002, India
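For flavour, a minimal bronze-to-silver refinement step of the kind this pipeline work involves on Databricks with Delta Lake; the paths and column names are invented:

```python
# Hypothetical bronze -> silver refinement on Databricks (Delta Lake); paths are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

bronze = spark.read.format("delta").load("/mnt/lake/bronze/sales")

silver = (bronze
          .dropDuplicates(["order_id"])                     # de-dupe on the business key
          .withColumn("order_date", F.to_date("order_ts"))  # derive the partition column
          .filter(F.col("amount").isNotNull()))             # basic quality gate

(silver.write
 .format("delta")
 .mode("overwrite")
 .partitionBy("order_date")
 .save("/mnt/lake/silver/sales"))
```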
Posted 15 hours ago
0 years
0 Lacs
Raipur, Chhattisgarh, India
On-site
Role Summary
We are seeking a highly motivated and skilled Data Engineer to join our data and analytics team. This role is ideal for someone with strong experience in building scalable data pipelines, working with modern lakehouse architectures, and deploying data solutions on Microsoft Azure. You’ll be instrumental in developing, orchestrating, and maintaining our real-time and batch data infrastructure using tools like Apache Spark, Apache Kafka, Apache Airflow, Azure Data Services, and modern DevOps practices.
Key Responsibilities
Design and implement ETL/ELT data pipelines for structured and unstructured data using Azure Data Factory, Databricks, or Apache Spark.
Work with Azure Blob Storage, Data Lake, and Synapse Analytics to build scalable data lakes and warehouses.
Develop real-time data ingestion pipelines using Apache Kafka, Apache Flink, or Apache Beam.
Build and schedule jobs using orchestration tools like Apache Airflow or Dagster (a minimal DAG sketch appears after this posting).
Perform data modeling using the Kimball methodology for building dimensional models in Snowflake or other data warehouses.
Implement data versioning and transformation using dbt and Apache Iceberg or Delta Lake.
Manage data cataloging and lineage using tools like Marquez or Collibra.
Collaborate with DevOps teams to containerize solutions using Docker, manage infrastructure with Terraform, and deploy on Kubernetes.
Set up and maintain monitoring and alerting systems using Prometheus and Grafana for performance and reliability.
Required Skills & Qualifications
Programming & Scripting: Proficiency in Python, with strong knowledge of OOP and data structures & algorithms. Comfortable working in Linux environments for development and deployment.
Database Technologies: Strong command of SQL and understanding of relational (DBMS) and NoSQL databases.
Big Data & Real-Time Processing: Solid experience with Apache Spark (PySpark/Scala). Familiarity with real-time processing tools like Kafka, Flink, or Beam.
Orchestration & Scheduling: Hands-on experience with Airflow, Dagster, or similar orchestration tools.
Cloud Platform: Deep experience with Microsoft Azure, especially Azure Data Factory, Blob Storage, Synapse, Azure Functions, etc. AZ-900 or other Azure certifications are a plus.
Lakehouse & Warehousing: Knowledge of dimensional modeling, Snowflake, Apache Iceberg, and Delta Lake. Understanding of modern lakehouse architecture and related best practices.
Data Cataloging & Governance: Familiarity with Marquez, Collibra, or other cataloging tools.
DevOps & CI/CD: Experience with Terraform, Docker, Kubernetes, and Jenkins or equivalent CI/CD tools.
Monitoring & Logging: Proficiency in setting up dashboards and alerts with Prometheus and Grafana.
Note: Immediate joiners will be preferred.
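A minimal Airflow DAG sketch for the orchestration work listed above; the task bodies are stubs, the DAG id is illustrative, and the `schedule` parameter assumes Airflow 2.4+ (older versions use `schedule_interval`):

```python
# Hypothetical daily ETL orchestration DAG (Airflow 2.x); task bodies are stubs.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull increments from source systems")

def transform():
    print("run the Spark transformation job")

def load():
    print("publish curated tables to the warehouse")


with DAG(
    dag_id="daily_sales_pipeline",       # illustrative name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3                       # linear dependency chain
```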
Posted 16 hours ago
10.0 years
0 Lacs
India
On-site
We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.
About the Role
The candidate will be responsible for leading data modeling initiatives and ensuring compliance with healthcare regulations while collaborating with various stakeholders to translate business requirements into technical solutions.
Responsibilities:
Data Architecture & Modeling
Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management (a simplified claims-fact sketch appears after this posting).
Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment).
Create and maintain data lineage documentation and data dictionaries for healthcare datasets.
Establish data modeling standards and best practices across the organization.
Technical Leadership
Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica.
Architect scalable data solutions that handle large volumes of healthcare transactional data.
Collaborate with data engineers to optimize data pipelines and ensure data quality.
Healthcare Domain Expertise
Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI).
Design data models that support analytical, reporting and AI/ML needs.
Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations.
Partner with business stakeholders to translate healthcare business requirements into technical data solutions.
Data Governance & Quality
Implement data governance frameworks specific to healthcare data privacy and security requirements.
Establish data quality monitoring and validation processes for critical health plan metrics.
Lead efforts to standardize healthcare data definitions across multiple systems and data sources.
Required Qualifications:
Technical Skills
10+ years of experience in data modeling with at least 4 years focused on healthcare/health plan data.
Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches.
Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing.
Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks).
Proficiency with data modeling tools (Hackolade, ERwin, or similar).
Healthcare Industry Knowledge
Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data.
Experience with healthcare data standards and medical coding systems.
Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment).
Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI).
Leadership & Communication
Proven track record of leading data modeling projects in complex healthcare environments.
Strong analytical and problem-solving skills with the ability to work with ambiguous requirements.
Excellent communication skills with the ability to explain technical concepts to business stakeholders.
Experience mentoring team members and establishing technical standards.
Preferred Qualifications
Experience with Medicare Advantage, Medicaid, or Commercial health plan operations.
Cloud platform certifications (AWS, Azure, or GCP).
Experience with real-time data streaming and modern data lake architectures.
Knowledge of machine learning applications in healthcare analytics.
Previous experience in a lead or architect role within healthcare organizations.
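To ground the modeling scope, here is a deliberately simplified claims-fact fragment expressed as a PySpark schema; the fields are illustrative only, not a complete or compliant health plan model:

```python
# Simplified, illustrative claims-fact schema (not a production health plan model).
from pyspark.sql.types import (StructType, StructField, StringType,
                               DateType, DecimalType)

claim_fact = StructType([
    StructField("claim_id",      StringType(),       nullable=False),
    StructField("member_key",    StringType(),       nullable=False),  # FK to member dim
    StructField("provider_key",  StringType(),       nullable=False),  # FK to provider dim
    StructField("service_date",  DateType(),         nullable=False),
    StructField("icd10_primary", StringType(),       nullable=True),   # diagnosis code
    StructField("cpt_code",      StringType(),       nullable=True),   # procedure code
    StructField("billed_amount", DecimalType(12, 2), nullable=True),
    StructField("paid_amount",   DecimalType(12, 2), nullable=True),
])
```

A real model would add conformed date/plan dimensions, claim-line grain, adjustment handling and PHI-aware security classifications on top of this.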
Posted 16 hours ago
7.0 years
0 Lacs
India
Remote
About Lemongrass
Lemongrass is a software-enabled services provider, synonymous with SAP on Cloud, focused on delivering superior, highly automated Managed Services to Enterprise customers. Our customers span multiple verticals and geographies across the Americas, EMEA and APAC. We partner with AWS, SAP, Microsoft and other global technology leaders.
We are seeking an experienced Cloud Data Engineer with a strong background in AWS, Azure, and GCP. The ideal candidate will have extensive experience with cloud-native ETL tools such as AWS DMS, AWS Glue, Kafka, Azure Data Factory and GCP Dataflow, and other ETL tools like Informatica, SAP Data Intelligence, etc. You will be responsible for designing, implementing, and maintaining robust data pipelines and building scalable data lakes. Experience with various data platforms like Redshift, Snowflake, Databricks, Synapse and others is essential. Familiarity with data extraction from SAP or ERP systems is a plus (a JDBC extraction sketch appears after this posting).
Key Responsibilities:
Design and Development: Design, develop, and maintain scalable ETL pipelines using cloud-native tools (AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc.). Architect and implement data lakes and data warehouses on cloud platforms (AWS, Azure, GCP). Develop and optimize data ingestion, transformation, and loading processes using Databricks, Snowflake, Redshift, BigQuery and Azure Synapse. Implement ETL processes using tools like Informatica, SAP Data Intelligence, and others. Develop and optimize data processing jobs using Spark (Scala).
Data Integration and Management: Integrate various data sources, including relational databases, APIs, unstructured data, and ERP systems, into the data lake. Ensure data quality and integrity through rigorous testing and validation. Perform data extraction from SAP or ERP systems when necessary.
Performance Optimization: Monitor and optimize the performance of data pipelines and ETL processes. Implement best practices for data management, including data governance, security, and compliance.
Collaboration and Communication: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions. Collaborate with cross-functional teams to design and implement data solutions that meet business needs.
Documentation and Maintenance: Document technical solutions, processes, and workflows. Maintain and troubleshoot existing ETL pipelines and data integrations.
Qualifications
Education: Bachelor’s degree in Computer Science, Information Technology, or a related field. Advanced degrees are a plus.
Experience:
7+ years of experience as a Data Engineer or in a similar role.
Proven experience with cloud platforms: AWS, Azure, and GCP.
Hands-on experience with cloud-native ETL tools such as AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc.
Experience with other ETL tools like Informatica, SAP Data Intelligence, etc.
Experience in building and managing data lakes and data warehouses.
Proficiency with data platforms like Redshift, Snowflake, BigQuery, Databricks, and Azure Synapse.
Experience with data extraction from SAP or ERP systems is a plus.
Strong experience with Spark and Scala for data processing.
Skills:
Strong programming skills in Python, Java, or Scala.
Proficient in SQL and query optimization techniques.
Familiarity with data modeling, ETL/ELT processes, and data warehousing concepts.
Knowledge of data governance, security, and compliance best practices.
Excellent problem-solving and analytical skills.
Strong communication and collaboration skills.
Preferred Qualifications:
Experience with other data tools and technologies such as Apache Spark or Hadoop.
Certifications in cloud platforms (AWS Certified Data Analytics – Specialty, Google Professional Data Engineer, Microsoft Certified: Azure Data Engineer Associate).
Experience with CI/CD pipelines and DevOps practices for data engineering.
Selected applicants will be subject to a background investigation, which will be conducted, and the results of which will be used, in compliance with applicable law.
What we offer in return:
Remote Working: Lemongrass always has offered, and always will offer, 100% remote work.
Flexibility: Work where and when you like, most of the time.
Training: A subscription to A Cloud Guru and a generous budget for taking certifications and other resources you’ll find helpful.
State-of-the-art tech: An opportunity to learn and run the latest industry-standard tools.
Team: Colleagues who will challenge you, giving you the chance to learn from them and them from you.
Lemongrass Consulting is proud to be an Equal Opportunity and Affirmative Action employer. We do not discriminate on the basis of race, religion, color, national origin, religious creed, gender, sexual orientation, gender identity, gender expression, age, genetic information, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.
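On the SAP/ERP extraction point, here is a hedged sketch of pulling a single SAP table over JDBC with Spark; the host, credentials and bucket are placeholders, and production SAP extraction typically goes through dedicated connectors (SLT, ODP, DMS) rather than raw JDBC:

```python
# Hypothetical JDBC pull of an ERP table into the lake; connection details are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

vbak = (spark.read.format("jdbc")
        .option("url", "jdbc:sap://erp-host:30015")   # placeholder SAP HANA JDBC URL
        .option("driver", "com.sap.db.jdbc.Driver")
        .option("dbtable", "SAPSR3.VBAK")             # sales order headers
        .option("user", "extract_user").option("password", "***")
        .option("fetchsize", "10000")                 # larger fetches cut round trips
        .load())

(vbak.write.mode("overwrite")
     .parquet("s3://company-lake/raw/sap/vbak/"))     # placeholder bucket
```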
Posted 16 hours ago