Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Position Description Job Title: Data EngineerExperience Level: 5+ YearsLocation: Hyderabad Job Summary We are looking for a seasoned and innovative Senior Data Engineer to join our dynamic data team. This role is ideal for professionals with a strong foundation in data engineering, coupled with hands-on experience in machine learning workflows, statistical analysis, and big data technologies. You will play a critical role in building scalable data pipelines, enabling advanced analytics, and supporting data science initiatives. Proficiency in Python is essential, and experience with PySpark is a strong plus. Key Responsibilities Data Pipeline Development: Design and implement scalable, high-performance ETL/ELT pipelines using Python and PySpark. ML & Statistical Integration: Collaborate with data scientists to integrate machine learning models and statistical analysis into data workflows. Data Modeling: Create and optimize data models (relational, dimensional, and columnar) to support analytics and ML use cases. Big Data Infrastructure: Manage and optimize data platforms such as Snowflake, Redshift, BigQuery, and Databricks. Performance Tuning: Monitor and enhance the performance of data pipelines and queries. Data Governance: Ensure data quality, integrity, and compliance through robust governance practices. Cross-functional Collaboration: Partner with analysts, scientists, and product teams to translate business needs into technical solutions. Automation & Monitoring: Automate data workflows and implement monitoring and alerting systems. Mentorship: Guide junior engineers and promote best practices in data engineering and ML integration. Innovation: Stay current with emerging technologies in data engineering, ML, and analytics. Required Qualifications Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field. 5+ years of experience in data engineering with a strong focus on Python and big data tools. Solid understanding of machine learning concepts and statistical analysis techniques. Proficiency in SQL and Python; experience with PySpark is highly desirable. Experience with cloud platforms (AWS, Azure, or GCP) and data tools (e.g., Glue, Data Factory, Dataflow). Familiarity with data warehousing and lakehouse architectures. Knowledge of data modeling techniques (e.g., star schema, snowflake schema). Experience with version control systems like Git. Strong problem-solving skills and ability to work in a fast-paced environment. Excellent communication and collaboration skills. Your future duties and responsibilities Required Qualifications To Be Successful In This Role Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. 
You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
Posted 2 days ago
0 years
0 Lacs
India
Remote
REQUIREMENT Position: Azure consultant with Data Warehouse and ETL experience Location: REMOTE (India) Duration: 1 - 2 Months (Short-Term) Skills & JD Azure, ETL, DWH, MS SQL experience writing stored procedures, creating facts and dimension tables etc. --------- Deepaks@vedasoftinc.com
Posted 2 days ago
8.0 - 10.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
P2-C2-STS JD Strong SQL skills to perform database queries, data validations, and data integrity checks. Familiarity with relational databases and data management concepts. Working experience with cloud-based data warehouse platforms like Snowflake and AWS. Experience in creating and implementing ETL testing strategy Experience in data integrity, data accuracy and completeness testing Proficient in source to target mapping validation test cases Proficient in Test planning, Test design, Test execution, Test management, preferably in healthcare payor domain Lead ETL testing and data migration projects from QA perspective, ensuring accuracy and completeness. Validated data pipelines in order to maintaining data integrity. Performed BI report validation on Power BI for a Enterprise level Sales and Assets dashboard which has number of important KPIs, ensuring insights are accurate and actionable. Executed automation framework for data validation and reconciliation. Interact with business stakeholders and give UAT support to them during UAT cycle.Write complex SQL queries on Snowflake in order to maintain data quality. Maintain test cases on JIRA and Zephyr. Attend all the scrum ceremonies like Sprint review meetings, daily standups. Mandatory Skills 8 to10 years of ETL Testing experience Snowflake and AWS. Business intelligence and Data warehouse testing SQL queries and testing data flow across the data layers testing data quality, data integrity, data reconciliation understanding on Data warehouse working with Agile teams ETL testing strategy
Posted 2 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for Source to Target and implementing solutions that tackle the clients needs. Your Primary Responsibilities Include Design, build, optimize and support new and existing data models and ETL processes based on our clients business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data driven organization. Coordinate data access and security to enable data scientists and analysts to easily access to data whenever they need too Preferred Education Master's Degree Required Technical And Professional Expertise Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems. Implement data quality and validation processes within Ab Initio.. Data Modeling and Analysis:. Collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes.. Analyze and model data to ensure optimal ETL design and performance.. Ab Initio Components:. . Utilize Ab Initio components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions.. Implement best practices for reusable Ab Initio components Preferred Technical And Professional Experience Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization. Conduct performance tuning and troubleshooting as needed. Collaboration: Work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes. Participate in design reviews and provide technical expertise to enhance overall solution quality Documentation
Posted 2 days ago
0 years
0 Lacs
Delhi, India
On-site
Description Skills Required: Bash/Shell scripting Git Hub ETL Apache Spark Data validation strategies Docker & Kubernetes (for containerized deployments) Monitoring tools: Prometheus, Grafana Strong in python Grafana-Prometheus, PowerBI/Tableau (important) Requirements Extensive hands-on experience implementing data migration and data processing Strong Experience implementing ETL/ELT processes and building data pipelines including workflow management, job scheduling and monitoring Experience with building and implementing Big Data platforms On-Prem or On Cloud, covering ingestion (Batch and Real-time), processing (Batch and real-time), Polyglot Storage, Data Access Good understanding of Data Warehouse, Data Governance, Data Security, Data Compliance, Data Quality, Meta Data Management, Master Data Management, Data Catalog Proven understanding and demonstrable implementation experience of big data platform technologies on the cloud (AWS and Azure) including surrounding services like IAM, SSO, Cluster monitoring, Log Analytics, etc. Experience with source code management tools such as TFS or Git Knowledge of DevOps with CICD pipeline setup and automate Building and integrating systems to meet the business needs Defining features, phases, and solution requirements and providing specifications accordingly Experience building stream-processing systems, using solutions such as Azure Even Hub/ Kafka etc. Strong experience with data modeling and schema design Strong knowledge in SQL and no-sql Database and/or BI/DW. Excellent interpersonal and teamwork skills Experience With Leading And Mentorship Of Other Team Members Good knowledge of Agile Scrum Good communication skills Strong analytical, logic and quantitative ability. Takes ownership of a task. Values accountability and responsibility. Quick learner Job responsibilities ETL/ELT processes, data pipelines, Big Data platforms (On-Prem/Cloud), data ingestion (Batch/Real-time), data processing, Polyglot Storage, Data Governance, Cloud (AWS/Azure), IAM, SSO, Cluster monitoring, Log Analytics, source code management (Git/TFS), DevOps, CICD automation, stream processing (Kafka, Azure Event Hub), data modeling, schema design, SQL/NoSQL, BI/DW, Agile Scrum, team leadership, communication, analytical skills, ownership, quick learner What we offer Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally. Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. 
In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today. Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way! High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do. About GlobalLogic GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 2 days ago
7.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
About This Role At BlackRock, we are looking for a Senior Data Engineer who enjoys building and supporting high impact data pipelines to solve complex challenges while working closely with your colleagues throughout the business. We recognize that strength comes from diversity, and will embrace your outstanding skills, curiosity, drive, and passion while giving you the opportunity to grow technically while learning from hands-on leaders in technology and finance. With over USD $11 trillion of assets we have an outstanding responsibility: our technology empowers millions of investors to save for retirement, pay for college, buy a home and improve their financial wellbeing. Being a financial technologist at BlackRock means you get the best of both worlds: working for one of the most successful financial companies and also working in a software development team responsible for next generation technology and solutions. We are seeking a high-reaching individual to help drive financial data engineering projects, initially focusing on our Index Fixed Income Group for the BGM DnA ("Data and Analytics") team in India. We are a community of highly qualified Data Engineers, Content & DevOps Specialists who have a passion for working on data solutions that help drive the agenda for our business partners Our team is based in San Francisco, London & Hungary, and we will complete the global circle with a new engineering team in Mumbai. About BlackRock Global Markets BlackRock Global Markets (“BGM”) functions are at the core of BlackRock’s markets and investments platform, including ETF and Index Investments (“Engine”), Global Trading, Securities Lending, Fixed Income, Liquidity and Financing. BGM is passionate about advancing the investment processes and platform architecture in these areas and on ensuring we engage with other market participants in a collaborative, strategic way. You should be Someone who is passionate about solving sophisticated business problems through data! Capable of overseeing the design, implementation, and optimization of data pipelines, ETL processes, and data storage solutions Able to work closely with multi-functional teams (e.g., Data Science, Product, Analytics, and Citizen Developer teams) to ensure the data infrastructure meets business needs. Enthusiastic about establishing and maintaining standard methodologies for data engineering, focusing on data quality, security, and scalability Key Requirements 7+ years Data Engineering experience preferably in the financial sector Familiarity with various aspects of Fixed Income Index and Market Data including ICE, Bloomberg, JP Morgan, FTSE/Russell, and IBOXX. Liquidity, Venue, and Direct Broker Dealer Market Maker Axe Data. Pricing Data from sources like S&P Global Live Bond Pricing or Bloombergs IBVAL. Understand Portfolio Management Fundamentals: Asset Management and Fixed Income Trading. A passion for Financial and Capital Markets. Proven experience working in an agile development team. Strong problem solving skills. Strong SQL and Python skills with a proven track record optimizing SQL queries. Curiosity of financial markets. We value if you have Bachelor’s degree in Computer Science, Engineering, Finance, Economics, or a related field. A Master’s degree or equivalent experience is a plus. Knowledge of Linux and scripting languages such as Bash Experience with MySQL, PostgreSQL, Greenplum, Snowflake or similar databases. Strong experience with ETL/ELT tools like DBT, Pentaho, Informatica or similar technologies. 
Experience with DevOps and tools like Azure DevOps Our Benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about. Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
Posted 2 days ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Description Skills Required: Bash/Shell scripting Git Hub ETL Apache Spark Data validation strategies Docker & Kubernetes (for containerized deployments) Monitoring tools: Prometheus, Grafana Strong in python Grafana-Prometheus, PowerBI/Tableau (important) Requirements Extensive hands-on experience implementing data migration and data processing Strong Experience implementing ETL/ELT processes and building data pipelines including workflow management, job scheduling and monitoring Experience with building and implementing Big Data platforms On-Prem or On Cloud, covering ingestion (Batch and Real-time), processing (Batch and real-time), Polyglot Storage, Data Access Good understanding of Data Warehouse, Data Governance, Data Security, Data Compliance, Data Quality, Meta Data Management, Master Data Management, Data Catalog Proven understanding and demonstrable implementation experience of big data platform technologies on the cloud (AWS and Azure) including surrounding services like IAM, SSO, Cluster monitoring, Log Analytics, etc. Experience with source code management tools such as TFS or Git Knowledge of DevOps with CICD pipeline setup and automate Building and integrating systems to meet the business needs Defining features, phases, and solution requirements and providing specifications accordingly Experience building stream-processing systems, using solutions such as Azure Even Hub/ Kafka etc. Strong experience with data modeling and schema design Strong knowledge in SQL and no-sql Database and/or BI/DW. Excellent interpersonal and teamwork skills Experience With Leading And Mentorship Of Other Team Members Good knowledge of Agile Scrum Good communication skills Strong analytical, logic and quantitative ability. Takes ownership of a task. Values accountability and responsibility. Quick learner Job responsibilities ETL/ELT processes, data pipelines, Big Data platforms (On-Prem/Cloud), data ingestion (Batch/Real-time), data processing, Polyglot Storage, Data Governance, Cloud (AWS/Azure), IAM, SSO, Cluster monitoring, Log Analytics, source code management (Git/TFS), DevOps, CICD automation, stream processing (Kafka, Azure Event Hub), data modeling, schema design, SQL/NoSQL, BI/DW, Agile Scrum, team leadership, communication, analytical skills, ownership, quick learner What we offer Culture of caring. At GlobalLogic, we prioritize a culture of caring. Across every region and department, at every level, we consistently put people first. From day one, you’ll experience an inclusive culture of acceptance and belonging, where you’ll have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Learning and development. We are committed to your continuous learning and development. You’ll learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic. With our Career Navigator tool as just one example, GlobalLogic offers a rich array of programs, training curricula, and hands-on opportunities to grow personally and professionally. Interesting & meaningful work. GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you’ll have the chance to work on projects that matter. Each is a unique opportunity to engage your curiosity and creative problem-solving skills as you help clients reimagine what’s possible and bring new solutions to market. 
In the process, you’ll have the privilege of working on some of the most cutting-edge and impactful solutions shaping the world today. Balance and flexibility. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. Your life extends beyond the office, and we always do our best to help you integrate and balance the best of work and life, having fun along the way! High-trust organization. We are a high-trust organization where integrity is key. By joining GlobalLogic, you’re placing your trust in a safe, reliable, and ethical global company. Integrity and trust are a cornerstone of our value proposition to our employees and clients. You will find truthfulness, candor, and integrity in everything we do. About GlobalLogic GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world’s largest and most forward-thinking companies. Since 2000, we’ve been at the forefront of the digital revolution – helping create some of the most innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 2 days ago
9.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description Job Summary: We are looking for an experienced and technically strong Cloud Infrastructure Automation Engineer to join our team. The ideal candidate will have 9+ years of overall cloud experience , including 5+ years of automation experience , and will be responsible for building, automating, and maintaining robust infrastructure on Oracle Cloud Infrastructure (OCI) . The role includes end-to-end automation using Terraform , scripting, CI/CD integration, and operational excellence using modern DevOps practices. Exposure to other cloud platforms (AWS, Azure), container orchestration (Kubernetes/OKE), open-source monitoring, and security frameworks is highly desirable. Key Responsibilities: Design, automate, and manage OCI infrastructure using Terraform and Infrastructure as Code principles. Develop and integrate CI/CD pipelines using tools like Jenkins, Git, GitHub Actions, or GitLab CI/CD. Deploy and manage containerized applications using Kubernetes, preferably Oracle Kubernetes Engine (OKE). Implement monitoring solutions using Prometheus, Grafana, and other open-source observability tools. Automate infrastructure provisioning and system configuration using Bash, Python, or Shell scripting. Architect and implement secure cloud environments, ensuring best practices in networking, identity and access management, and data protection. Design and support cloud security frameworks, applying zero-trust principles and governance models. Collaborate in cross-functional teams to provide guidance on cloud architecture, automation patterns, and security controls. Troubleshoot and resolve infrastructure and deployment issues efficiently in production and non-production environments. Participate in planning and architecture discussions to deliver robust and scalable infrastructure solutions. Required Qualifications: 9+ years of overall cloud experience, with 5+ years in cloud automation. Proven hands-on experience with Oracle Cloud Infrastructure (OCI). Strong expertise in Terraform for provisioning OCI resources. High proficiency in scripting and programming languages (e.g., Bash, Python, Shell). Solid experience deploying and managing workloads on Kubernetes, ideally on OKE. Experience building monitoring dashboards and alerts using Prometheus and Grafana. Strong understanding of cloud networking, security, and IAM models. Hands-on experience in designing cloud architecture and developing secure infrastructure frameworks. Familiarity with modern CI/CD and DevOps tools and methodologies. Strong analytical, troubleshooting, and communication skills. Preferred Skills (Good to Have): Experience with AWS or Azure cloud platforms. Familiarity with ETL workflows and container lifecycle management (e.g., Docker). Exposure to secrets management, policy enforcement, and compliance automation. Knowledge of service mesh, ingress controllers, and advanced Kubernetes patterns. Certifications (Preferred): OCI Architect/Infrastructure Certification HashiCorp Terraform Associate DevOps/CI-CD certifications (e.g., CKA/CKAD) Security-related certifications (e.g., CCSP, OCI Security, CISSP) Career Level - IC3 Responsibilities Operates independently to provide quality work products to an engagement. Performs varied and complex duties and tasks that need independent judgment, in order to implement Oracle products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices. 
Demonstrates expertise to deliver functional and technical solutions on moderately complex customer engagements. May act as the team lead on projects. Effectively consults with management of customer organizations. Participates in business development activities. Develops and configures detailed solutions for moderately complex projects. Qualifications Career Level - IC3 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 2 days ago
3.0 - 5.0 years
0 Lacs
Haryana, India
Remote
We’re AtkinsRéalis, a world class Engineering Services and Nuclear organization. We connect people, data and technology to transform the world's infrastructure and energy systems. Together, with our industry partners and clients, and our global team of consultants, designers, engineers and project managers, we can change the world. Created by the integration of long-standing organizations dating back to 1911, we are a world-leading professional services company dedicated to engineering a better future for our planet and its people. We deploy global capabilities locally to our clients and deliver unique end-to-end services across the whole life cycle of an asset including consulting, advisory & environmental services, intelligent networks & cybersecurity, design & engineering, procurement, project & construction management, operations & maintenance, decommissioning and capital. The breadth and depth of our capabilities are delivered to clients in key strategic sectors. News and information are available at www.atkinsrealis.com or follow us on LinkedIn. Our teams take great pride in delivering some of the world’s most prestigious projects. This success is driven by our talented people, whose diverse perspectives, expertise, and knowledge set us apart. Join us and you'll be part of our genuinely collaborative environment, where everyone is supported to make the most of their talents and expertise. When it comes to work-life balance, AtkinsRéalis is a great place to be. So, let's discuss how our flexible and remote working policies can support your priorities. We're passionate about are work while valuing each other equally. So, ask us about some of our recent pledges for Women's Equality and being a 'Disability Confident' and 'Inclusive Employer’. Job Summary Passionate problem solver who works with customers and team alike to keep systems available, you have exceptional communication skills, and penchant to identify and resolve issues in a timely manner. Key Experiences And Requirements Should have 3-5 years of experience Experienced working on cloud services Experienced in database operation Experienced working in production support, ops support & maintenance projects Experienced in ticket management in a ITSM / helpdesk system Analyse and resolve production & non-production issues/incidents within set SLAs. Identify the severity level of the customer-impacting issue & react accordingly. Investigate system logs to resolve production issues and restore services. Provide RCA for production issues to avoid recurrence. Regularly communicate incident/request status to impact teams/customers Deploy, monitor & troubleshoot new software versions in production & non-Production environments. System debugging experience and skills Work closely with development and QA team to address software bugs Identify trends in reported issues, report them and suggest possible solutions. Experienced in DevOps Identify/Design procedures to eliminate manual processes in the team. Excellent analytical and problem-solving skills Understands software architecture Knowledge of Windows, Networks, Firewalls, File Shares, DNS, TCP/IP Knowledge of IT infrastructure Good reporting skills to report and explain the issues in clear and efficient way Good documentation skills to write structured technical documents with diagrams and detailed explanations. 
System (server, network and application) monitoring experience Good interpersonal communication skills with proficiency in written and spoken English Good Team player If need be, should be ready to develop software enhancements and/or carry out bugfixes Willingness to learn new technologies Mandatory Key Skills: Microsoft .Net, C#, ASP.Net Core and WebAPI ReactJS, HTML SQL Server and Postgres Cloud Services (Preferably Microsoft Azure) Familiarity with networking systems and protocols. Git PowerShell, and Bash Docker / Azure containers Desired MySQL and Mongo Experienced working in production support, ops support & maintenance projects GitHub Testing approaches such as TDD and BDD (Nunit, Jest) Good knowledge of Python and Django Experienced with Azure DevOps Exposure to an Agile Methodologies GIS system ITIL process familiarity ETL Qualification And Certification B.E CS, Graduate with relevant experience IT industry certifications such as Microsoft Azure would be added advantage What We Can Offer You Varied, interesting and meaningful work. A hybrid working environment with flexibility and great opportunities. Opportunities for training and, as the team grows, career progression or sideways moves. An opportunity to work within a large global multi-disciplinary consultancy on a mission to change the ways we approach business as usual. Why work for AtkinsRéalis? We at AtkinsRéalis are committed to developing its people both personally and professionally. Our colleagues have the advantage of access to a high ranging training portfolio and development activities designed to help make the best of individual’s abilities and talents. We also actively support staff in achieving corporate membership of relevant institutions. Meeting Your Needs To help you get the most out of life in and outside of work, we offer employees ‘Total Reward’. Making sure you're supported is important to us. So, if you identify as having a disability, tell us ahead of your interview, and we’ll discuss any adjustments you might need. Additional Information We are an equal opportunity, drug-free employer committed to promoting a diverse and inclusive community - a place where we can all be ourselves, thrive and develop. To help embed inclusion for all, from day one, we offer a range of family friendly, inclusive employment policies, flexible working arrangements and employee networks to support staff from different backgrounds. As an Equal Opportunities Employer, we value applications from all backgrounds, cultures and ability. We care about your privacy and are committed to protecting your privacy. Please consult our Privacy Notice on our Careers site to know more about how we collect, use and transfer your Personal Data. Link: Equality, diversity & inclusion | Atkins India (atkinsrealis.com)
Posted 2 days ago
9.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description Job Summary: We are looking for an experienced and technically strong Cloud Infrastructure Automation Engineer to join our team. The ideal candidate will have 9+ years of overall cloud experience , including 5+ years of automation experience , and will be responsible for building, automating, and maintaining robust infrastructure on Oracle Cloud Infrastructure (OCI) . The role includes end-to-end automation using Terraform , scripting, CI/CD integration, and operational excellence using modern DevOps practices. Exposure to other cloud platforms (AWS, Azure), container orchestration (Kubernetes/OKE), open-source monitoring, and security frameworks is highly desirable. Key Responsibilities: Design, automate, and manage OCI infrastructure using Terraform and Infrastructure as Code principles. Develop and integrate CI/CD pipelines using tools like Jenkins, Git, GitHub Actions, or GitLab CI/CD. Deploy and manage containerized applications using Kubernetes, preferably Oracle Kubernetes Engine (OKE). Implement monitoring solutions using Prometheus, Grafana, and other open-source observability tools. Automate infrastructure provisioning and system configuration using Bash, Python, or Shell scripting. Architect and implement secure cloud environments, ensuring best practices in networking, identity and access management, and data protection. Design and support cloud security frameworks, applying zero-trust principles and governance models. Collaborate in cross-functional teams to provide guidance on cloud architecture, automation patterns, and security controls. Troubleshoot and resolve infrastructure and deployment issues efficiently in production and non-production environments. Participate in planning and architecture discussions to deliver robust and scalable infrastructure solutions. Required Qualifications: 9+ years of overall cloud experience, with 5+ years in cloud automation. Proven hands-on experience with Oracle Cloud Infrastructure (OCI). Strong expertise in Terraform for provisioning OCI resources. High proficiency in scripting and programming languages (e.g., Bash, Python, Shell). Solid experience deploying and managing workloads on Kubernetes, ideally on OKE. Experience building monitoring dashboards and alerts using Prometheus and Grafana. Strong understanding of cloud networking, security, and IAM models. Hands-on experience in designing cloud architecture and developing secure infrastructure frameworks. Familiarity with modern CI/CD and DevOps tools and methodologies. Strong analytical, troubleshooting, and communication skills. Preferred Skills (Good to Have): Experience with AWS or Azure cloud platforms. Familiarity with ETL workflows and container lifecycle management (e.g., Docker). Exposure to secrets management, policy enforcement, and compliance automation. Knowledge of service mesh, ingress controllers, and advanced Kubernetes patterns. Certifications (Preferred): OCI Architect/Infrastructure Certification HashiCorp Terraform Associate DevOps/CI-CD certifications (e.g., CKA/CKAD) Security-related certifications (e.g., CCSP, OCI Security, CISSP) Career Level - IC3 Responsibilities Operates independently to provide quality work products to an engagement. Performs varied and complex duties and tasks that need independent judgment, in order to implement Oracle products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices. 
Demonstrates expertise to deliver functional and technical solutions on moderately complex customer engagements. May act as the team lead on projects. Effectively consults with management of customer organizations. Participates in business development activities. Develops and configures detailed solutions for moderately complex projects. Qualifications Career Level - IC3 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 2 days ago
10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
OneMagnify is a global performance marketing organization working at the intersection of brand marketing, technology, and analytics. The Company’s core offerings accelerate business, amplify real-time results, and help set their clients apart from their competitors. OneMagnify partners with clients to design, implement and manage marketing and brand strategies using analytical and predictive data models that provide valuable customer insights to drive higher levels of sales conversion. OneMagnify’s commitment to employee growth and development extends far beyond typical approaches. We take great pride in fostering an environment where each of our 700+ colleagues can thrive and achieve their personal best. OneMagnify has been recognized as a Top Workplace, Best Workplace and Cool Workplace in the United States for 10 consecutive years and recently was recognized as a Top Workplace in India. OneMagnify needs a Senior Snowflake Data Engineer to join our team. We blend brand marketing, technology, and analytics to boost business and client competitiveness. At OneMagnify, we are dedicated to encouraging an environment where every individual can thrive and achieve their personal best. We have been consistently recognized as a Top Workplace, Best Workplace, and Cool Workplace in the United States for the past 10 years. Join our team and be a part of our outstanding culture! Location: Chennai, India Join OneMagnify and be part of a world-class team that competes at the highest level. Apply now and take your career to new heights! Responsibilities: Design and develop scalable and efficient data solutions leveraging both on-premise SQL Server and Azure Data Services. Collaborate with cross-functional teams to define, implement, and optimize hybrid data architectures. Ensure data performance, security, and integrity across SQL Server, Azure Synapse Analytics, Azure Data Factory, and Azure SQL Database. Implement data migration and integration projects between on-premise SQL Server and Azure cloud environments. Develop and maintain ETL pipelines using Azure Data Factory, SQL Server Integration Services (SSIS), Databricks, or Synapse Pipelines. Provide technical mentorship and support to junior developers in both on-prem and cloud data environments. Work closely with analytics and business intelligence teams to ensure effective data modeling, reporting, and governance across platforms. Requirements: Bachelor’s degree in Information Technology, Computer Science, or a related field, or equivalent experience. Proven experience in a Senior Data Engineer or similar role with expertise in both on-premise SQL Server and Azure cloud technologies. Strong proficiency in SQL, database administration, and performance tuning across SQL Server and Azure databases. Experience with Azure Synapse Analytics, Azure Data Factory, Azure SQL Database, and SQL Server Integration Services (SSIS). Familiarity with data lakes, structured and unstructured data processing, and hybrid data architectures. Knowledge of Azure security standard methodologies and data governance principles. Strong problem-solving skills and attention to detail. Ability to work in a fast-paced, collaborative environment. Strong written and verbal communication skills. Benefits We offer a comprehensive benefits package including Medical Insurance, PF, Gratuity, paid holidays, and more. About Us Whether it’s awareness, advocacy, engagement, or efficacy, we move brands forward with work that connects with audiences and delivers results. 
Through meaningful analytics, engaging communications and innovative technology solutions, we help clients tackle their most ambitious projects and overcome their biggest challenges. We are an equal opportunity employer We believe that Innovative ideas and solutions start with unique perspectives. That’s why we’re committed to providing every employee a workplace that’s free of discrimination and intolerance. We’re proud to be an equal opportunity employer and actively search for like-minded people to join our team
Posted 2 days ago
2.0 - 4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
At Citi we’re not just building technology, we’re building the future of banking. Encompassing a broad range of specialties, roles, and cultures, our teams are creating innovations used across the globe. Citi is constantly growing and progressing through our technology, with laser focused on evolving the ways of doing things. As one of the world’s most global banks we’re changing how the world does business Shape your Career with Citi We’re currently looking for a high caliber professional to join our team as 25883567 Officer- ETL Automation tester -QA - C10 -Hybrid- PUNE based in Pune/Chennai, India. Being part of our team means that we’ll provide you with the resources to meet your unique needs, empower you to make healthy decision and manage your financial well-being to help plan for your future. For instance: We provide programs and services for your physical and mental well-being including access to telehealth options, health advocates, confidential counseling and more. Coverage varies by country. We empower our employees to manage their financial well-being and help them plan for the future. We provide access to an array of learning and development resources to help broaden and deepen your skills and knowledge as your career progresses. The Testing Analyst is a developing professional role. Applies specialty area knowledge in monitoring, assessing, analyzing and/or evaluating processes and data. Identifies policy gaps and formulates policies. Interprets data and makes recommendations. Researches and interprets factual information. Identifies inconsistencies in data or results, defines business issues and formulates recommendations on policies, procedures or practices. Integrates established disciplinary knowledge within own specialty area with basic understanding of related industry practices. Good understanding of how the team interacts with others in accomplishing the objectives of the area. Develops working knowledge of industry practices and standards. Limited but direct impact on the business through the quality of the tasks/services provided. Impact of the job holder is restricted to own team. Candidate is expected to Build Data Pipelines: Extract data from various sources (like databases and data lakes), clean and transform it, and load it into target systems Testing and Validation: Develop automated tests to ensure the data pipelines are working correctly and the data is accurate. This is like quality control, making sure everything meets the bank's standards Work with Hive, HDFS, and Oracle data sources to extract, transform, and load large-scale datasets Leverage AWS services such as S3, Lambda, and Airflow for data ingestion, event-driven processing, and orchestration Create reusable frameworks, libraries, and templates to accelerate automation and testing of ETL jobs Participate in code reviews, CI/CD pipelines , and maintain best practices in Spark and cloud-native development Ensures tooling can be run in CICD providing real-time on demand test execution shortening the feedback loop to fully support Handsfree execution Regression , Integration, Sanity testing, Regression automated suites, reports issues – provide solutions and ensures timely completion Own and drive automation in Data and Analytics Team to achieve 90% automation in Data, ETL space. Design and develop integrated portal to consolidate utilities and cater to user needs. Supports initiatives related to automation on Data & Analytics testing requirements for process and product rollout into production. 
Specialists who can work with technology team to design and implement appropriate automation scripts/plans for an application testing, meeting required KPI and automation effectiveness. Ensures new utilities are documented and transitioned to testers for execution and supports for troubleshooting in case required. Monitors and reviews code check-ins from peers and helps maintain project repository. Ability to work independently as well as collaborate within groups on various projects assigned. Ability to work in a fast-paced, dynamic environment and manage multiple priorities effectively. Experience and understanding of Wealth domain specifically in private bank(banking) , lending services and related Tech applications.Supports and contributes to automated test data generation and sufficiency. Successful candidate ideally would have following skills and exposure: 2 - 4 years of experience on automation testing across UI Experience in Automation ETL Testing , testing by using SQL queries. Hands on experience on Selenium BDD Cucumber using Java, Python Extensive knowledge on developing and maintaining automation frameworks, AI/ ML related solutions. Experience on automating BI reports e.g., Tableau dashboards and views validation. Data analytics and BI reports in the Financial Service industry Hands on experience in Python for developing utilities for Data Analysis using Pandas, NumPy etc. Exposure and some experience on AI related solutions, ML which can help automate faster. Experience with mobile testing using perfecto, API Testing-SoapUI, Postman/Rest Assured will be added advantage. Detailed knowledge data flows in relational database and Bigdata systems Strong knowledge of Oracle SQL and HiveQL and understanding of ETL/Data Testing. Experience with CI/CD tools like Jenkins. Proficiency in working on Cloudera Hadoop ecosystem (HDFS, Hive, YARN) Hands-on experience with ETL automation and validation framework. Solid understanding of AWS services like S3, Lambda, EKS, Airflow, and Strong problem-solving and debugging skills Excellent communication and collaboration abilities to lead and mentor a large techno-functional team across different geographical locations Strong Acumen and presentation skills. Able to work in an Agile environment and deliver results independently Education: Bachelor’s/University degree or equivalent experience ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Technology Quality ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 2 days ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About The Role Uber for Business is one of the fast growing businesses in Uber with a strong growth trajectory. The candidate will drive the tech strategy and execution for building experiences for small and medium sized businesses to become and sustain as Uber for Business customers . What You Will Do ---- Architect, design, and develop robust backend services and scalable APIs. Ensure the scalability, performance, and reliability of software applications. Conduct code reviews, design discussions, and technical mentorship. Collaborate with cross-functional teams to deliver comprehensive, end-to-end solutions. Stay current with emerging technologies and industry trends to drive innovation. Troubleshoot and resolve critical issues in production and development environments. Provide strategic technical leadership to influence the direction of Uber's technology stack. Develop and maintain comprehensive documentation for software projects and processes. What You Will Need ---- 10+ years of experience in full-stack software development. Expert proficiency in backend technologies such as Java, Python, Node.js, or Go. Deep understanding of database technologies, including SQL and NoSQL databases. Strong knowledge of data engineering principles and ETL processes Experience with designing and developing RESTful APIs. Mastery of version control systems such as Git. Exceptional problem-solving skills and ability to lead collaborative teams. Excellent communication skills, both verbal and written. Preferred Qualifications Ideal candidate is someone who has worked around AI integration into chatbots. Someone who understands LLM and how RAG can be leveraged to provide best in class customer experience.
Posted 2 days ago
100.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About H.E. Services: At H.E. Services vibrant tech Center in Hyderabad, you will have the opportunity to contribute to technology innovation for Holman Automotive, a leading American fleet management and automotive services company. Our goal is to continue investing in people, processes, and facilities to ensure expansion in a way that allows us to support our customers and develop new tech solutions. Holman has come a long way during its first 100 years in business. The automotive markets Holman serves include fleet management and leasing; vehicle fabrication and up fitting; component manufacturing and productivity solutions; powertrain distribution and logistics services; commercial and personal insurance and risk management; and retail automotive sales as one of the largest privately owned dealership groups in the United States. Join us and be part of a team that's transforming the way Holman operates, creating a more efficient, data-driven, and customer-centric future. Roles & Responsibilities: Design, develop, and maintain data pipelines using Databricks , Spark , and other Azure cloud technologies. Optimize data pipelines for performance, scalability, and reliability, ensuring high speed and availability of data warehouse performance. Develop and maintain ETL processes using Databricks and Azure Data Factory for real-time or trigger-based data replication. Ensure data quality and integrity throughout the data lifecycle, implementing new data validation methods and analysis tools. Collaborate with data scientists, analysts, and stakeholders to understand and meet their data needs. Troubleshoot and resolve data-related issues, providing root cause analysis and recommendations. Manage a centralized data warehouse in Azure SQL to create a single source of truth for organizational data, ensuring compliance with data governance and security policies. Document data pipeline specifications, requirements, and enhancements, effectively communicating with the team and management. Leverage AI/ML capabilities to create innovative data science products. Champion and maintain testing suites, code reviews, and CI/CD processes. Must Have: Strong knowledge of Databricks architecture and tools. Proficient in SQL , Python , and PySpark for querying databases and data processing. Experience with Azure Data Lake Storage (ADLS) , Blob Storage , and Azure SQL . Deep understanding of distributed computing and Spark for data processing. Experience with data integration and ETL tools, including Azure Data Factory. Advanced-level knowledge and practice of: Data warehouse and data lake concepts and architectures. Optimizing performance of databases and servers. Managing infrastructure for storage and compute resources. Writing unit tests and scripts. Git, GitHub, and CI/CD practices. Good to Have: Experience with big data technologies, such as Kafka , Hadoop , and Hive . Familiarity with Azure Databricks Medallion Architecture with DLT and Iceberg. Experience with semantic layers and reporting tools like Power BI . Relevant Work Experience: 5+ years of experience as a Data Engineer, ETL Developer, or similar role, with a focus on Databricks and Spark. Experience working on internal, business-facing teams. Familiarity with agile development environments. Education and Training: Bachelor's degree in computer science, Engineering, or a related field, or equivalent work experience.
Posted 2 days ago
3.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
A legacy of excellence, driving innovation and personalized service to create exceptional customer experiences.

About H.E. Services: At H.E. Services' vibrant tech center in Hyderabad, you'll have the opportunity to contribute to technology innovation for Holman Automotive, a leading American fleet management and automotive services company. Our goal is to continue investing in people, processes, and facilities to ensure expansion in a way that allows us to support our customers and develop new tech solutions. Holman has come a long way during its first 100 years in business. The automotive markets Holman serves include fleet management and leasing; vehicle fabrication and upfitting; component manufacturing and productivity solutions; powertrain distribution and logistics services; commercial and personal insurance and risk management; and retail automotive sales as one of the largest privately owned dealership groups in the United States. Join us and be part of a team that's transforming the way Holman operates, creating a more efficient, data-driven, and customer-centric future.

The Business Intelligence Developer II will be responsible for designing, developing, and maintaining advanced data solutions. This role involves creating pipelines in Databricks for Silver (curated) and Gold (aggregated, high-value) layers of data, developing insightful dashboards in Power BI, and applying Machine Learning (ML) and Artificial Intelligence (AI) techniques to solve complex business problems.

Roles & Responsibilities: Develop and maintain data pipelines in Databricks for Silver and Gold layers, ensuring data quality and reliability. Optimize data workflows to handle large volumes of structured and unstructured data efficiently. Design and optimize Power BI semantic models, including creating star schemas, managing table relationships, and defining DAX measures to support robust reporting solutions. Create, enhance, and maintain interactive dashboards and reports in Power BI to provide actionable insights to stakeholders. Collaborate with business units to gather requirements and ensure dashboards meet user needs. Use Databricks and other platforms to build and operationalize ML/AI models to enhance decision-making. Work closely with data engineers, analysts, and business stakeholders to deliver scalable and innovative data solutions. Participate in code reviews, ensure best practices, and contribute to a culture of continuous improvement.

Relevant Work Experience: 3-5 years of experience in business intelligence, data engineering, or a related role. Proficiency in Databricks (Spark, PySpark) for data processing and transformation. Strong expertise in Power BI for semantic model management, dashboarding, and visualization. Experience building and deploying ML/AI models in Databricks or similar platforms.

Must-Have Technical Skills: Proficiency in SQL and Python. Solid understanding of ETL/ELT pipelines and data warehousing concepts. Familiarity with cloud platforms (e.g., Azure, AWS) and tools like Delta Lake. Git, GitHub, and CI/CD practices. Excellent problem-solving and analytical skills. Strong communication skills, with the ability to translate complex technical concepts into business-friendly language. Proven ability to work both independently and collaboratively in a fast-paced environment.

Preferred Qualifications: Certifications in Power BI, Databricks, or cloud platforms. Experience with advanced analytics tools (e.g., TensorFlow, Scikit-learn, AutoML).
Exposure to Agile methodologies and DevOps practices.
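To make the Silver/Gold layering concrete, here is a small, assumption-laden PySpark sketch that rolls a curated Silver table up into a Gold table that a Power BI semantic model could read; the table and column names are invented for illustration.

```python
# Silver -> Gold aggregation sketch (Databricks). All names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("silver_to_gold").getOrCreate()

silver = spark.read.table("silver.vehicle_service_events")  # assumed table

gold = (
    silver
    .withColumn("service_month", F.date_trunc("month", F.col("service_ts")))
    .groupBy("service_month", "region")
    .agg(
        F.count("*").alias("service_count"),
        F.sum("service_cost").alias("total_service_cost"),
        F.avg("service_cost").alias("avg_service_cost"),
    )
)

# Overwrite the Gold table each run; reporting tools read this curated layer.
gold.write.format("delta").mode("overwrite").saveAsTable("gold.monthly_service_summary")
```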
Posted 2 days ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Us

At ExxonMobil, our vision is to lead in energy innovations that advance modern living and a net-zero future. As one of the world's largest publicly traded energy and chemical companies, we are powered by a unique and diverse workforce fueled by pride in what we do and what we stand for. The success of our Upstream, Product Solutions and Low Carbon Solutions businesses is the result of the talent, curiosity and drive of our people. They bring solutions every day to optimize our strategy in energy, chemicals, lubricants and lower-emissions technologies. We invite you to bring your ideas to ExxonMobil to help create sustainable solutions that improve quality of life and meet society's evolving needs. Learn more about our What and our Why and how we can work together.

ExxonMobil's affiliates in India: ExxonMobil's affiliates have offices in India in Bengaluru, Mumbai and the National Capital Region. ExxonMobil's affiliates in India supporting the Product Solutions business engage in the marketing, sales and distribution of performance as well as specialty products across chemicals and lubricants businesses. The India planning teams are also embedded with global business units for business planning and analytics. ExxonMobil's LNG affiliate in India supporting the upstream business provides consultant services for other ExxonMobil upstream affiliates and conducts LNG market-development activities. The Global Business Center - Technology Center provides a range of technical and business support services for ExxonMobil's operations around the globe. ExxonMobil strives to make a positive contribution to the communities where we operate, and its affiliates support a range of education, health and community-building programs in India. Read more about our Corporate Responsibility Framework. To know more about ExxonMobil in India, visit ExxonMobil India and the Energy Factor India.

What role you will play in our team: The UDO DPF Data Engineer will lean in and own the work, connect with others, be resourceful, engage in data communities, pursue technical growth, and bring enthusiasm and commitment. They will be an essential part of a data squad: a small group of data gurus specifically assigned to a capability, developing domain knowledge and an understanding of the data within business workflows so that each and every data product is done right and delights the customer.
City: Bengaluru, Karnataka

What you will do: Perform ETL and ELT operations and administration using modern tools, programming languages, and systems, securely and in accordance with enterprise data standards. Assemble, model, and transform large, complex sets of data that meet non-functional and functional business requirements into a format that can be analyzed. Automate the processing of data from multiple data sources. Develop, deploy, and version-control code for data consumption and reuse via APIs. Employ machine learning techniques to create and sustain data structures. Perform root cause analysis on external and internal processes and data to identify opportunities for improvement and resolve data quality issues. Lead data-related workshops with stakeholders to capture data requirements and acceptance criteria.

About You — Skills and Qualifications: Minimum bachelor's degree in Data Science, Business Intelligence, Statistics, Computer Engineering, or a related field, or the equivalent combination of education, professional training, and work experience. Minimum 2 years' experience performing duties related to data engineering. Advanced English level. Expert proficiency in at least one of these programming languages: Python, NoSQL, SQL, R, and competence in source code management. Build processes supporting data transformation, data structures, metadata, dependency, and workload management. Create data validation methods and data analysis tools.

Preferred Qualifications / Experience: Excellent problem-solving skills and the ability to learn through scattered resources. Automate routine tasks via scripts and code. Capacity to successfully manage a pipeline of duties with minimal supervision. Experience supporting and working with cross-functional teams in a dynamic environment. Modify existing reports, extracts, dashboards, and cubes as necessary. Commitment to operations integrity and the ability to hold self and others accountable for results. Data governance skills: Data Quality Management, Metadata Management, Data Lineage & Provenance, Master Data Management (MDM), Data Cataloging (experience with tools like Collibra, Alation, Azure Purview, Informatica, or Google Data Catalog), Data Classification & Tagging.

Your benefits: An ExxonMobil career is one designed to last. Our commitment to you runs deep: our employees grow personally and professionally, with benefits built on our core categories of health, security, finance and life. We offer you: Competitive compensation. Medical plans, maternity leave and benefits, life, accidental death and dismemberment benefits. Retirement benefits. Global networking and cross-functional opportunities. Annual vacations and holidays. Day care assistance program. Training and development program. Tuition assistance program. Workplace flexibility policy. Relocation program. Transportation facility. Please note benefits may change from time to time without notice, subject to applicable laws. The benefits programs are based on the Company's eligibility guidelines.

Stay connected with us: Learn more about ExxonMobil in India by visiting ExxonMobil India and Energy Factor India. Follow us on LinkedIn and Instagram. Like us on Facebook. Subscribe to our channel on YouTube.

EEO Statement: ExxonMobil is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin or disability status.
Business solicitation and recruiting scams: ExxonMobil does not use recruiting or placement agencies that charge candidates an advance fee of any kind (e.g., placement fees, immigration processing fees, etc.). Follow the LINK to understand more about recruitment scams in the name of ExxonMobil.

Nothing herein is intended to override the corporate separateness of local entities. Working relationships discussed herein do not necessarily represent a reporting connection, but may reflect a functional guidance, stewardship, or service relationship. Exxon Mobil Corporation has numerous affiliates, many with names that include ExxonMobil, Exxon, Esso and Mobil. For convenience and simplicity, those terms and terms like corporation, company, our, we and its are sometimes used as abbreviated references to specific affiliates or affiliate groups. Abbreviated references describing global or regional operational organizations and global or regional business lines are also sometimes used for convenience and simplicity. Similarly, ExxonMobil has business relationships with thousands of customers, suppliers, governments, and others. For convenience and simplicity, words like venture, joint venture, partnership, co-venturer, and partner are used to indicate business relationships involving common activities and interests, and those words may not indicate precise legal relationships.
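Among the duties listed above is creating data validation methods and data analysis tools. Purely as a hedged sketch of what such a helper might look like in Python, here is a tiny pandas validator; the column names and rules (well_id, pressure_psi) are invented for illustration and are not ExxonMobil's actual standards.

```python
# Tiny data-validation sketch in pandas. The rules and column names
# are hypothetical examples, not an enterprise data standard.
import pandas as pd

def validate_frame(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality issues."""
    issues = []
    if df["well_id"].isna().any():
        issues.append("well_id contains nulls")
    if df.duplicated(subset=["well_id", "reading_date"]).any():
        issues.append("duplicate (well_id, reading_date) rows")
    if (df["pressure_psi"] < 0).any():
        issues.append("negative pressure_psi values")
    return issues

if __name__ == "__main__":
    sample = pd.DataFrame({
        "well_id": ["W1", "W1", None],
        "reading_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
        "pressure_psi": [1200.0, 1200.0, -5.0],
    })
    for issue in validate_frame(sample):
        print("FAIL:", issue)
```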
Posted 2 days ago
5.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Title: Data Engineer. Location: Hyderabad, India (Onsite). Full-time.

Job Description: We are seeking an experienced Data Engineer with 5-8 years of professional experience to design, build, and optimize robust and scalable data pipelines for our SmartFM platform. The ideal candidate will be instrumental in ingesting, transforming, and managing vast amounts of operational data from various building devices, ensuring high data quality and availability for analytics and AI/ML applications. This role is critical in enabling our platform to generate actionable insights, alerts, and recommendations for optimizing facility operations.

ROLES AND RESPONSIBILITIES
• Design, develop, and maintain scalable and efficient data ingestion pipelines from diverse sources (e.g., IoT devices, sensors, existing systems) using technologies like IBM StreamSets, Azure Data Factory, Apache Spark, Talend, Apache Flink, and Kafka.
• Implement robust data transformation and processing logic to clean, enrich, and structure raw data into formats suitable for analysis and machine learning models.
• Manage and optimize data storage solutions, primarily within MongoDB, ensuring efficient schema design, data indexing, and query performance for large datasets.
• Collaborate closely with Data Scientists to understand their data needs, provide high-quality, reliable datasets, and assist in deploying data-driven solutions.
• Ensure data quality, consistency, and integrity across all data pipelines and storage systems, implementing monitoring and alerting mechanisms for data anomalies.
• Work with cross-functional teams (Software Engineers, Data Scientists, Product Managers) to integrate data solutions with the React frontend and Node.js backend applications.
• Contribute to the continuous improvement of data architecture, tooling, and best practices, advocating for scalable and maintainable data solutions.
• Troubleshoot and resolve complex data-related issues, optimizing pipeline performance and ensuring data availability.
• Stay updated with emerging data engineering technologies and trends, evaluating and recommending new tools and approaches to enhance our data capabilities.

REQUIRED TECHNICAL SKILLS AND EXPERIENCE
• 5-8 years of professional experience in Data Engineering or a related field.
• Proven hands-on experience with data pipeline tools such as IBM StreamSets, Azure Data Factory, Apache Spark, Talend, Apache Flink, and Apache Kafka.
• Strong expertise in database management, particularly with MongoDB, including schema design, data ingestion pipelines, and data aggregation.
• Proficiency in at least one programming language commonly used in data engineering, such as Python or Java/Scala.
• Experience with big data technologies and distributed processing frameworks (e.g., Apache Spark, Hadoop) is highly desirable.
• Familiarity with cloud platforms (Azure, AWS, or GCP) and their data services.
• Solid understanding of data warehousing concepts, ETL/ELT processes, and data modeling.
• Experience with DevOps practices for data pipelines (CI/CD, monitoring, logging).
• Knowledge of Node.js and React environments to facilitate seamless integration with existing applications.

ADDITIONAL QUALIFICATIONS
• Demonstrated expertise in written and verbal communication, adept at simplifying complex technical concepts for both technical and non-technical audiences.
• Strong problem-solving and analytical skills with a meticulous approach to data quality.
• Experienced in collaborating and communicating seamlessly with diverse technology roles, including development, support, and product management. • Highly motivated to acquire new skills, explore emerging technologies, and stay updated on the latest trends in data engineering and business needs. • Experience in the facility management domain or IoT data is a plus. EDUCATION REQUIREMENTS / EXPERIENCE • Bachelor’s (BE / BTech) / Master’s degree (MS/MTech) in Computer Science, Information Systems, Mathematics, Statistics, or a related quantitative field.
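Since the role centers on streaming building-device data into MongoDB, here is a deliberately small sketch of that ingest-and-upsert pattern using kafka-python and pymongo. The topic name, connection strings, and document shape are placeholders; a production pipeline would add batching, error handling, and schema validation.

```python
# Kafka -> MongoDB ingest sketch. Topic names, URIs, and fields are
# hypothetical placeholders for illustration only.
import json
from kafka import KafkaConsumer      # pip install kafka-python
from pymongo import MongoClient      # pip install pymongo

consumer = KafkaConsumer(
    "building.device.readings",                      # assumed topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

mongo = MongoClient("mongodb://localhost:27017")
readings = mongo["smartfm"]["device_readings"]       # assumed db/collection

for message in consumer:
    doc = message.value
    # Upsert on the device/timestamp natural key so replays stay idempotent.
    readings.update_one(
        {"device_id": doc["device_id"], "ts": doc["ts"]},
        {"$set": doc},
        upsert=True,
    )
```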
Posted 2 days ago
5.0 years
0 Lacs
India
Remote
Welcome to Veradigm! Our Mission is to be the most trusted provider of innovative solutions that empower all stakeholders across the healthcare continuum to deliver world-class outcomes. Our Vision is a Connected Community of Health that spans continents and borders. With the largest community of clients in healthcare, Veradigm is able to deliver an integrated platform of clinical, financial, connectivity and information solutions to facilitate enhanced collaboration and exchange of critical patient information.

Veradigm: Veradigm is here to transform health, insightfully. Veradigm delivers a unique combination of point-of-care clinical and financial solutions, a commitment to open interoperability, a large and diverse healthcare provider footprint, along with industry-proven expert insights. We are dedicated to simplifying the complicated healthcare system with next-generation technology and solutions, transforming healthcare from the point of patient care to everyday life. For more information, please explore www.veradigm.com.

Job Summary: We are seeking a detail-oriented and experienced Database and Backend Test Engineer with 5+ years of experience in testing large-scale data platforms, including Snowflake, Azure Data Services, and backend services. The ideal candidate will be responsible for validating data pipelines, backend logic, stored procedures, and integrations, ensuring the accuracy, performance, and quality of enterprise data systems.

Key Responsibilities: Design and implement test strategies for backend systems and data pipelines across Snowflake and Azure environments. Write and execute complex SQL queries to validate transformations, stored procedures, and data quality. Perform ETL testing, data reconciliation, schema validation, and metadata checks. Collaborate with data engineers and developers to verify pipeline performance, reliability, and scalability. Build and maintain automated test scripts using tools like pytest, dbt, or custom SQL-based frameworks. Integrate database tests into CI/CD pipelines using tools such as Azure DevOps, GitHub Actions, or Jenkins. Perform root cause analysis on data issues and communicate findings to relevant teams. Monitor and validate data processing jobs and scheduled validations using Azure Data Factory, Synapse, or Databricks. Document test scenarios, data sets, and validation logs in a structured manner.

An Ideal Candidate Will Have (Required Skills & Qualifications): 5+ years of experience in database and backend testing. Strong hands-on experience with Snowflake, including data modeling, querying, and security roles. Experience with Azure data tools such as Azure SQL, Data Factory, Synapse Analytics, or Data Lake. Advanced proficiency in SQL and performance tuning. Experience with ETL/ELT testing and validation of data migration or transformation logic. Familiarity with Python or shell scripting for data test automation. Knowledge of CI/CD integration for test automation. Strong understanding of data quality frameworks, data governance, and test reporting.

Preferred Qualifications: Experience with dbt, Great Expectations, or other data validation tools. Exposure to cloud storage validation (Azure Blob, ADLS). Experience in testing APIs for data services or backend integrations. Knowledge of data privacy and compliance frameworks (e.g., GDPR, HIPAA).

Benefits: Veradigm believes in empowering our associates with the tools and flexibility to bring the best version of themselves to work.
Through our generous benefits package with an emphasis on work/life balance, we give our employees the opportunity to allow their careers to flourish. Quarterly Company-Wide Recharge Days Flexible Work Environment (Remote/Hybrid Options) Peer-based incentive “Cheer” awards “All in to Win” bonus Program Tuition Reimbursement Program To know more about the benefits and culture at Veradigm, please visit the links mentioned below: - https://veradigm.com/about-veradigm/careers/benefits/ https://veradigm.com/about-veradigm/careers/culture/ We are an Equal Opportunity Employer. No job applicant or employee shall receive less favorable treatment or be disadvantaged because of their gender, marital or family status, color, race, ethnic origin, religion, disability or age; nor be subject to less favorable treatment or be disadvantaged on any other basis prohibited by applicable law. Veradigm is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse and inclusive workforce. Thank you for reviewing this opportunity! Does this look like a great match for your skill set? If so, please scroll down and tell us more about yourself!
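To make the ETL-testing responsibilities above concrete, here is a minimal pytest sketch that reconciles row counts between a staging table and its transformed target in Snowflake. The account settings and table names are placeholders, and a real suite would also cover schema, null, and business-rule checks as the posting describes.

```python
# Row-count reconciliation test sketch for Snowflake (pytest).
# Connection parameters and table names are hypothetical.
import os
import pytest
import snowflake.connector  # pip install snowflake-connector-python

@pytest.fixture(scope="module")
def conn():
    con = snowflake.connector.connect(
        account=os.environ["SF_ACCOUNT"],
        user=os.environ["SF_USER"],
        password=os.environ["SF_PASSWORD"],
        warehouse="TEST_WH",      # assumed warehouse
        database="ANALYTICS",     # assumed database
    )
    yield con
    con.close()

def scalar(conn, sql: str):
    """Run a query and return the single scalar it produces."""
    cur = conn.cursor()
    try:
        return cur.execute(sql).fetchone()[0]
    finally:
        cur.close()

def test_target_row_count_matches_staging(conn):
    staged = scalar(conn, "SELECT COUNT(*) FROM STAGING.PATIENT_CLAIMS")
    loaded = scalar(conn, "SELECT COUNT(*) FROM CURATED.PATIENT_CLAIMS")
    assert staged == loaded, f"staging={staged} target={loaded}"
```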
Posted 2 days ago
0 years
0 Lacs
India
Remote
Job Title: Azure Consultant (Data Warehouse & ETL)
Location: Remote
Duration: 2 Months (Contract)

Job Summary: We are seeking an experienced Azure Consultant with strong expertise in data warehousing, ETL processes, and MS SQL development. The ideal candidate will be responsible for designing and implementing data integration solutions, optimizing data pipelines, and working extensively with SQL Server to develop stored procedures, fact tables, and dimension tables for business intelligence and analytics purposes.

Key Responsibilities: Design and implement data warehouse solutions on Azure. Develop and optimize ETL workflows using Azure Data Factory or similar tools. Write complex T-SQL queries and stored procedures for data processing and transformation. Create and manage fact and dimension tables for reporting and analytics. Collaborate with business stakeholders to understand data requirements and deliver solutions. Ensure best practices in data governance, security, and performance tuning.

Required Skills & Experience: Strong experience with Azure Data Services (Azure SQL Database, Azure Synapse, Azure Data Factory). Hands-on expertise in ETL development and data warehouse concepts. Proficiency in MS SQL Server, including writing stored procedures, functions, triggers, and performance optimization. Experience creating and managing fact and dimension tables for analytics solutions. Excellent problem-solving and analytical skills.

Preferred Skills (Nice to Have): Knowledge of Power BI or other reporting tools. Experience with CI/CD and DevOps practices for data solutions.
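The fact/dimension work described above usually starts with a small star schema. Below is a hedged sketch that creates one dimension and one fact table over pyodbc; the connection string and the table design are illustrative assumptions, not the client's actual model.

```python
# Star-schema DDL sketch executed over pyodbc (pip install pyodbc).
# The connection string and table definitions are hypothetical.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=yourserver.database.windows.net;DATABASE=dw;"
    "UID=etl_user;PWD=***"
)

DIM_DDL = """
CREATE TABLE dbo.DimCustomer (
    CustomerKey  INT IDENTITY(1,1) PRIMARY KEY,
    CustomerId   NVARCHAR(50) NOT NULL,
    CustomerName NVARCHAR(200),
    Region       NVARCHAR(100)
)
"""

FACT_DDL = """
CREATE TABLE dbo.FactSales (
    SalesKey     BIGINT IDENTITY(1,1) PRIMARY KEY,
    CustomerKey  INT NOT NULL REFERENCES dbo.DimCustomer(CustomerKey),
    OrderDate    DATE NOT NULL,
    Quantity     INT NOT NULL,
    NetAmount    DECIMAL(18, 2) NOT NULL
)
"""

with pyodbc.connect(CONN_STR, autocommit=True) as conn:
    cur = conn.cursor()
    cur.execute(DIM_DDL)   # dimension first, so the FK target exists
    cur.execute(FACT_DDL)
```

The surrogate keys (CustomerKey, SalesKey) keep the fact table narrow and let dimension attributes change without rewriting facts, which is the usual reason for this layout.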
Posted 2 days ago
5.0 years
0 Lacs
Andhra Pradesh, India
On-site
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. In Oracle supply chain and operations at PwC, you will specialise in providing consulting services for Oracle supply chain and operations applications. You will analyse client needs, implement software solutions, and offer training and support for seamless integration and utilisation of Oracle supply chain and operations applications. Working in this area, you will enable clients to optimise their supply chain processes, improve operational efficiency, and achieve their strategic objectives.

Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn't clear, you ask questions, and you use these moments as opportunities to grow.

Skills: Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Respond effectively to the diverse perspectives, needs, and feelings of others. Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems. Use critical thinking to break down complex concepts. Understand the broader objectives of your project or role and how your work fits into the overall strategy. Develop a deeper understanding of the business context and how it is changing. Use reflection to develop self-awareness, enhance strengths and address development areas. Interpret data to inform insights and recommendations. Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.

Role / Job Title: Exp. Sr. Associate
Tower: Oracle
Experience: 5 years
Key Skills: FAW/OAC/ADW/IOT
Educational Qualification: BE / B Tech / ME / M Tech / B.SC / B.Com / BBA
Work Location: India

Job Description: As an Experienced Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to: Minimum 2 years of experience on Oracle's cloud-based analytics platforms, including OAC/ADW/ODI and/or FAW. Strong hands-on expertise in OAC, including Analytics, Data Visualization, and Semantic Model Development. Very good development experience with OAC reports and dashboards using measures, filters, calculated measures, calculated items, etc. Must be able to execute the report testing process. Experience migrating from OBIEE to OAC. Experience migrating between OAC instances. Very good understanding of data warehousing concepts and data warehouse modeling. Thorough hands-on experience with SQL (on any RDBMS source). Able to troubleshoot report errors and issues on OAC. Hands-on knowledge of building analyses and visualizations based on datasets created using SQL or Excel data sources.
Good knowledge of RPD modeling and the usage of data modelers on OAC. Able to troubleshoot report errors and issues on OBIEE/OAC and understand the tool limitations of OAC. Should have experience in performance-tuning OAC analyses; this includes analyzing the explain plan of the query, tuning the data model, and making modifications to the tables, such as indexing. Should have good knowledge of coding, debugging, design, and documentation. Understanding of the flow of data between ERP and the data warehouse. Experience modeling and building BI Publisher reports is preferable. Any knowledge of PL/SQL, ODI, or any ETL tool is preferable. Working with multidimensional sources (like Essbase) is a plus. Any work on OTBI is a plus. Expertise in the Oracle Analytics Cloud tool. Knowledge of BIApps concepts is preferable. Familiarity with upgrade activities and issues encountered during an upgrade from OBIEE to OAC. Expertise in SQL; knowledge of any ETL tool is preferable. Knowledge of FAW (ERP and SCM)/ADW/OAC (Classic, Data Visualization, Semantic Model Development)/ODI is a plus.

Use feedback and reflection to develop self-awareness and personal strengths, and to address development areas. Proven track record as an SME in the chosen domain. Ability to come up with client POCs/POVs for integrating and increasing adoption of emerging tech, such as blockchain and AI, with the product platform they are associated with. Mentor junior resources within the team, conduct KSS and lessons-learnt sessions. Flexible to work in stretch opportunities/assignments. Demonstrate critical thinking and the ability to bring order to unstructured problems. Ticket quality and deliverables review. Adherence to SLAs; experience in incident management, change management, and problem management. Know how and when to use the tools available for a given situation, and be able to explain the reasons for this choice. Use straightforward communication, in a structured way, when influencing and connecting with others. Able to read situations and modify behavior to build quality relationships. Uphold the firm's code of ethics and business conduct.

Managed Services - Application Evolution Services: At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services, our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment.
Within our global Managed Services platform, we provide Application Evolution Services (formerly Application Managed Services), where we focus on the evolution of our clients' applications and cloud portfolio. Our focus is to empower our clients to navigate and capture the value of their application portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient, and cost-effective. As a member of our Application Evolution Services (AES) team, we are looking for candidates who thrive working in a high-paced work environment, capable of working on a mix of critical Application Evolution Service offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort in helping win and support customer engagements, not only from a technical perspective but also from a relationship perspective.
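The performance-tuning expectation earlier in this posting (reading a query's explain plan before touching the model or indexes) can be scripted against an Oracle source. A small sketch using the python-oracledb driver is below; the credentials, DSN, and sample query are placeholders, not a client system.

```python
# Explain-plan inspection sketch for Oracle (pip install oracledb).
# Connection details and the sample query are hypothetical.
import oracledb

conn = oracledb.connect(user="rpd_dev", password="***",
                        dsn="adwhost.example.com/analytics")
cur = conn.cursor()

# Stage the plan for a slow report query, then read it back.
cur.execute("""
    EXPLAIN PLAN FOR
    SELECT region, SUM(revenue)
    FROM   sales_fact
    GROUP  BY region
""")
cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
for (line,) in cur.fetchall():
    print(line)
```

Full table scans or missing index usage in this output are what typically motivate the indexing and data-model changes the posting mentions.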
Posted 2 days ago
9.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description

Job Summary: We are looking for an experienced and technically strong Cloud Infrastructure Automation Engineer to join our team. The ideal candidate will have 9+ years of overall cloud experience, including 5+ years of automation experience, and will be responsible for building, automating, and maintaining robust infrastructure on Oracle Cloud Infrastructure (OCI). The role includes end-to-end automation using Terraform, scripting, CI/CD integration, and operational excellence using modern DevOps practices. Exposure to other cloud platforms (AWS, Azure), container orchestration (Kubernetes/OKE), open-source monitoring, and security frameworks is highly desirable.

Key Responsibilities: Design, automate, and manage OCI infrastructure using Terraform and Infrastructure as Code principles. Develop and integrate CI/CD pipelines using tools like Jenkins, Git, GitHub Actions, or GitLab CI/CD. Deploy and manage containerized applications using Kubernetes, preferably Oracle Kubernetes Engine (OKE). Implement monitoring solutions using Prometheus, Grafana, and other open-source observability tools. Automate infrastructure provisioning and system configuration using Bash, Python, or shell scripting. Architect and implement secure cloud environments, ensuring best practices in networking, identity and access management, and data protection. Design and support cloud security frameworks, applying zero-trust principles and governance models. Collaborate in cross-functional teams to provide guidance on cloud architecture, automation patterns, and security controls. Troubleshoot and resolve infrastructure and deployment issues efficiently in production and non-production environments. Participate in planning and architecture discussions to deliver robust and scalable infrastructure solutions.

Required Qualifications: 9+ years of overall cloud experience, with 5+ years in cloud automation. Proven hands-on experience with Oracle Cloud Infrastructure (OCI). Strong expertise in Terraform for provisioning OCI resources. High proficiency in scripting and programming languages (e.g., Bash, Python, shell). Solid experience deploying and managing workloads on Kubernetes, ideally on OKE. Experience building monitoring dashboards and alerts using Prometheus and Grafana. Strong understanding of cloud networking, security, and IAM models. Hands-on experience in designing cloud architecture and developing secure infrastructure frameworks. Familiarity with modern CI/CD and DevOps tools and methodologies. Strong analytical, troubleshooting, and communication skills.

Preferred Skills (Good to Have): Experience with AWS or Azure cloud platforms. Familiarity with ETL workflows and container lifecycle management (e.g., Docker). Exposure to secrets management, policy enforcement, and compliance automation. Knowledge of service mesh, ingress controllers, and advanced Kubernetes patterns.

Certifications (Preferred): OCI Architect/Infrastructure Certification. HashiCorp Terraform Associate. DevOps/CI-CD certifications (e.g., CKA/CKAD). Security-related certifications (e.g., CCSP, OCI Security, CISSP).

Career Level - IC3

Responsibilities: Operates independently to provide quality work products to an engagement. Performs varied and complex duties and tasks that need independent judgment in order to implement Oracle products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices. Demonstrates the expertise to deliver functional and technical solutions on moderately complex customer engagements. May act as the team lead on projects. Effectively consults with management of customer organizations. Participates in business development activities. Develops and configures detailed solutions for moderately complex projects.

Qualifications: Career Level - IC3

About Us: As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
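Terraform is the posting's primary provisioning tool, but the Bash/Python scripting requirement often shows up as small utilities around the pipelines. Purely as a hedged illustration of that kind of scripting (not the role's Terraform workflow), here is a minimal inventory script using the oci Python SDK; the compartment OCID is a placeholder.

```python
# OCI compute inventory sketch using the oci Python SDK (pip install oci).
# Reads the default profile from ~/.oci/config; the OCID is a placeholder.
import oci

config = oci.config.from_file()
compute = oci.core.ComputeClient(config)

COMPARTMENT_ID = "ocid1.compartment.oc1..exampleuniqueid"  # assumed

instances = compute.list_instances(compartment_id=COMPARTMENT_ID).data
for inst in instances:
    print(f"{inst.display_name:40} {inst.lifecycle_state}")
```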
Posted 2 days ago
7.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Company Description: WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine our deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services, and human resources, leveraging collaborative models that are tailored to address the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees.

Job Description: The role aims to leverage data analysis, engineering, and AI/ML techniques to drive strategic business decisions and innovations. This position is responsible for designing and implementing scalable data pipelines, developing innovative models, and managing cloud infrastructure to ensure efficient data processing and storage. The role also involves collaborating with cross-functional teams to translate business needs into technical solutions, mentoring junior team members, and staying abreast of the latest technological advancements. Effective communication, particularly in English, is essential to articulate complex insights and foster a collaborative environment. The ultimate goal is to enhance data-driven decision-making and maintain a competitive edge through continuous improvement and innovation.

Data and AI Specialist, Consulting role. Key Responsibilities: Python developer experienced with Azure Cloud, using Azure Databricks for data science: create models and algorithms to analyze data and solve business problems. Application Architecture: knowledge of enterprise application integration and application design. Cloud Management: knowledge of hosting and supporting applications on Azure Cloud. Data Engineering: build and maintain systems to process and store data efficiently. Collaboration: work with different teams to understand their needs and provide data solutions; share insights through reports and presentations. Research: keep up with the latest tech trends and improve existing models and systems. Mentorship: guide and support junior team members.

Must have: AI/ML and data analysis development in Python: strong programming skills in Python or R, and SQL. Proficiency in statistical analysis and machine learning techniques. Hands-on experience in NLP and NLU. Experience with data visualization and reporting tools (e.g., Power BI). Experience with Microsoft Power Platforms and SharePoint (e.g., Power Automate). Hands-on experience using SharePoint for content management.

Data Engineering: Expertise in designing and maintaining data pipelines and ETL processes. Experience with data storage solutions (e.g., Azure SQL). Understanding of data quality and governance principles. Experience with Databricks for big data processing and analytics.

Cloud Management: Proficiency in cloud platforms (e.g., Azure). Knowledge of hosting and supporting applications on Azure Cloud. Knowledge of cloud security and compliance best practices.

Collaboration and Communication: Experience in agile methodologies and project management tools (e.g., Jira). Strong interpersonal and communication skills. Ability to translate complex technical concepts into business terms. Experience working in cross-functional teams. Excellent English communication skills, both written and verbal.

Research and Development: Ability to stay updated with the latest advancements in data science, AI/ML, and cloud technologies. Experience in conducting research and improving model performance.

Mentorship: Experience in guiding and mentoring junior team members. Ability to foster a collaborative and innovative team environment.

Must exhibit the following core behaviors: taking ownership of, and accountability for, the projects assigned.

Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, AI/ML, IT, or related fields, or MCA. 7-9 years of relevant experience. Proficiency in Python, R, cloud platforms (Azure), and data visualization tools like Power BI. Advanced certifications and experience with big data technologies and real-time data processing. Excellent English communication skills.
Posted 2 days ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Data Engineer
📍 Location: Gurugram, India
🕒 Experience: 6–8 years
🧑‍💻 Employment Type: Full-time

Key Responsibilities: Design, build, and optimize scalable data pipelines to support advanced Media Mix Modeling (MMM) and Multi-Touch Attribution (MTA) models. Collaborate with Data Scientists to prepare data for training, validation, and deployment of machine learning models and statistical algorithms. Ingest and transform large volumes of structured and unstructured data from multiple sources, ensuring data quality and integrity. Partner with cross-functional teams (AdSales, Analytics, and Product) to deliver reliable data solutions that drive marketing effectiveness and campaign performance. Automate data workflows and build reusable components for model deployment, data validation, and reporting. Support data scientists with efficient access to cleaned and transformed data, optimizing for both performance and usability. Contribute to the design of a unified data architecture supporting AdTech, OTT, and digital media ecosystems. Stay updated with the latest trends in data engineering, AI-driven analytics, and cloud-native tools to improve data delivery and model deployment processes.

Required Skills & Experience: 6+ years of hands-on experience in data engineering, data analytics, or related roles. At least 3 years working in AdTech, AdSales, or digital media analytics environments. Experience supporting MMM and MTA modeling efforts with high-quality, production-ready data pipelines. Proficiency in Python, SQL, and data transformation tools; experience with R is a plus. Strong knowledge of data modeling, ETL pipelines, and handling large-scale datasets using distributed systems (e.g., Spark, AWS, or GCP). Familiarity with cloud platforms (AWS, Azure, or GCP) and data services (S3, Redshift, BigQuery, Snowflake, etc.). Experience with BI tools such as Tableau, Power BI, or Looker for report automation and insight generation. Solid understanding of statistical techniques, A/B testing, and model evaluation metrics. Excellent communication and collaboration skills to work with both technical and non-technical stakeholders.

Preferred Qualifications: Experience in media or OTT data environments. Exposure to machine learning model deployment, model monitoring, and MLOps practices. Knowledge of Kafka, Airflow, or dbt for orchestration and transformation.
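MMM data preparation of the kind described above usually begins by rolling raw channel-level spend up to a weekly modeling grain, one column per channel. A tiny pandas sketch of that transformation follows; the input file and column names (date, channel, spend) are invented for the example.

```python
# Weekly MMM feature-prep sketch in pandas. File and column names
# are hypothetical placeholders.
import pandas as pd

spend = pd.read_csv("media_spend.csv", parse_dates=["date"])  # assumed file
# Expected columns: date, channel, spend

weekly_spend = (
    spend
    # Snap each row to the Monday that starts its week.
    .assign(week=spend["date"].dt.to_period("W").dt.start_time)
    .groupby(["week", "channel"], as_index=False)["spend"].sum()
    # One column per channel so each row is one modeling observation.
    .pivot(index="week", columns="channel", values="spend")
    .fillna(0.0)
)

print(weekly_spend.head())
```

Downstream, the data science team would typically join this onto outcome data (sales, sign-ups) and apply adstock/saturation transforms before fitting the MMM.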
Posted 2 days ago
3.0 - 5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
The Position: We are seeking a skilled Data Engineer to join our dynamic team. In this role, you will play a pivotal part in designing and implementing custom solutions that support complex financial and IP calculations, reporting, and data transformations. Your work will directly contribute to improving our clients' operational efficiency and decision-making capabilities.

What You Will Do:
Problem-Solving: Develop innovative solutions to complex challenges in financial calculations, rights management, and process optimization.
Data Engineering Solutions: Design, build, and maintain scalable data pipelines for migration, cleansing, transformation, and integration tasks, ensuring high-quality data outcomes.
Database Development & Maintenance: Configure, implement, and refine stored procedures and queries to ensure optimal performance, scalability, and maintainability of database systems.
ETL & Data Migration: Develop robust ETL (Extract, Transform, Load) processes that integrate data from diverse sources, ensuring seamless migration and transformation for downstream analytics and reporting.
Automation & Scripting: Create and implement automated scripts and tools to streamline routine database tasks, reduce manual intervention, and improve overall operational efficiency.
Collaboration: Partner with cross-functional teams to align data engineering efforts with broader business objectives and deliver seamless solutions that drive value across the organization.
IP Commerce Data Expertise: Leverage deep knowledge of financial and rights data to develop creative solutions that address client needs and advance business goals.
Process Improvement: Continuously identify opportunities to optimize workflows, automate repetitive tasks, and enhance efficiency in data processing and delivery.

Must-Have (what you will bring to the role): Minimum 3-5 years of experience in a database developer or analyst position. Bachelor's in Computer Science, Engineering, or equivalent work experience. Exceptional analytical thinking and problem-solving capabilities. Strong verbal and written communication skills with the ability to articulate technical concepts clearly. Proficiency in analyzing complex financial or IP data sets. Hands-on experience with engineering principles, including designing and implementing scalable solutions. Strong attention to detail and commitment to ensuring data accuracy and integrity.

Preferred: Experience working with SQL and/or Python for data manipulation and analysis. Experience working in finance or IP-related industries, with an understanding of their unique challenges and requirements. Familiarity with handling large-scale datasets and cloud-based platforms (e.g., AWS, Azure, Google Cloud). Knowledge of DevOps practices and CI/CD pipelines to streamline database management and deployment. Understanding of data warehousing architectures and business intelligence tools for advanced analytics. Certifications in relevant database technologies (e.g., Microsoft Certified: Azure Database Administrator Associate or Oracle Certified Professional) are a bonus.

Shift: Flexible (US & UK shifts).

Equal Employment Opportunity: Rightsline is an equal opportunity workplace. All candidates will be afforded equal opportunity through the recruiting process. We do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, disability, gender identity and/or expression.
We are dedicated to growing a diverse team of highly talented individuals and creating an inclusive environment where everyone feels empowered to bring their authentic selves to work.

Apply Today: If you want to join a company that strives for mission, purpose, and impact, we encourage you to apply today.
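The financial and rights calculations this role describes often reduce to tiered computations over contract data. Purely as an invented illustration (the tiers, rates, and field names are not from the posting), here is a minimal sketch of a marginal tiered royalty calculation:

```python
# Tiered royalty calculation sketch. Tiers, rates, and field names are
# hypothetical examples of the kind of IP-commerce logic described.
from dataclasses import dataclass

@dataclass
class Tier:
    threshold: float  # revenue above this amount...
    rate: float       # ...earns this royalty rate

TIERS = [Tier(0.0, 0.10), Tier(100_000.0, 0.12), Tier(500_000.0, 0.15)]

def royalty(revenue: float) -> float:
    """Marginal tiered royalty: each band of revenue earns its own rate."""
    owed = 0.0
    for i, tier in enumerate(TIERS):
        upper = TIERS[i + 1].threshold if i + 1 < len(TIERS) else revenue
        band = max(0.0, min(revenue, upper) - tier.threshold)
        owed += band * tier.rate
    return owed

assert abs(royalty(50_000) - 5_000.0) < 1e-6           # all in tier 1
print(f"Royalty on 600,000: {royalty(600_000):,.2f}")  # spans all tiers
```

In practice this logic would live in a stored procedure or pipeline step and be driven by per-contract tier tables rather than constants.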
Posted 2 days ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Area(s) of responsibility: JD - Qlik Sense

Designing, developing, and maintaining interactive dashboards and reports using Qlik Sense, extracting data, and managing Qlik Sense servers, while also ensuring data integrity and performance optimization. Develop innovative and visually appealing Qlik Sense dashboards and reports that provide actionable insights to stakeholders. Good experience as an offshore team lead. Should have good experience in the onsite/offshore model as a lead/SPOC. Should be able to understand requirements by interacting directly with users, create BRDs and TSDs, handle an offshore team, and provide technical support. Able to handle end-to-end activities. Must be good at data transformation, the creation of QVD files, and set analysis. Experienced in application design, architecture, development, and deployment using Qlik Sense. Must be efficient in front-end development and know visualization best practices. Strong database design and SQL skills. Experienced in data integration through extracting, transforming, and loading (ETL) data from various sources. Able to comprehend and translate complex and advanced functional, technical, and business requirements into executable architectural designs. Hands-on experience designing, implementing, testing, and supporting reports and dashboards within the agreed SLA. Working experience with charts in Qlik Sense such as KPI, Line, Straight table, Pivot table, Pie, Bar, Combo, Radar, Map, etc. Strong working experience with set analysis (set expressions) and selection states. Working knowledge of YTD, LYTD, QTD, LQTD, MTD, LMTD, WTD, and LWTD creation using set analysis. Experience with Qlik native functions such as String, Date, Aggregate, Row, and Conditional functions.
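For readers unfamiliar with set analysis, a typical YTD measure of the kind listed above looks like the expression below. It is shown inside a small Python snippet only so it can be templated per period; the field names (Year, OrderDate, Sales) vary by data model and are assumptions here, not part of the posting.

```python
# A representative Qlik set-analysis YTD expression, kept as a Python
# string for templating. Field names (Year, OrderDate, Sales) are
# hypothetical and depend on the data model.
YTD_SALES = """
Sum({<
    Year = {$(=Max(Year))},
    OrderDate = {"<=$(=Date(Max(OrderDate)))"}
>} Sales)
""".strip()

print(YTD_SALES)  # paste into a Qlik Sense measure
```

Variants such as LYTD or QTD follow the same pattern, shifting the Year/date bounds inside the set modifier.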
Posted 2 days ago