
13,457 ETL Jobs - Page 45

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

6.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


About Us
Optiply is at the forefront of three rapidly expanding sectors: Software as a Service, Artificial Intelligence, and E-commerce. With our intelligent purchasing software, we empower over 300 web shops and wholesalers to make smarter buying decisions, using predictive analytics to optimize inventory management.

Job Description
As an R Engineer at Optiply, you'll be part of a dynamic and collaborative team focused on developing, maintaining, and improving our statistical and machine learning models. You'll work closely with data scientists, backend developers, and product teams to ensure our algorithms are robust, scalable, and seamlessly integrated into our systems.

This Is What You'll Be Doing
- Design, develop, and maintain statistical models and forecasting tools, primarily in R.
- Collaborate with the development team to integrate R-based solutions into broader systems and workflows.
- Build APIs or microservices to expose R models to production systems when needed (see the sketch after this listing).
- Optimize and refactor existing R code for performance and scalability.
- Support data processing and ETL pipelines in collaboration with software engineers.
- Work with our Customer Success Team to understand product requirements and translate them into technical solutions.
- Ensure high standards of code quality, testing, and documentation.

This Is Who We're Looking For
- 3-6 years of professional experience in a data- or engineering-focused role.
- Experience with forecasting models, time series analysis, or inventory optimization.
- Strong proficiency in R for statistical analysis, forecasting, or data modeling.
- Solid hands-on experience with Python, especially for scripting, data handling, or API development.
- Comfortable working with data from various sources (SQL, APIs, flat files).
- Familiarity with DevOps tools and best practices (Docker, Git, CI/CD pipelines) is a plus.
- Experience working in a production environment and collaborating across teams.
- Self-driven, proactive, and comfortable working in a fast-paced, international environment.

Nice to Have
- Exposure to cloud platforms (AWS, GCP, or Azure).
- Prior experience in a SaaS, e-commerce, or supply chain tech company.

This Is What We Offer
- Competitive compensation package that reflects skills and contributions.
- Holistic work-life harmony: we value personal time and promote a healthy work-life balance.
- Comprehensive health coverage through robust insurance plans.
- Investment in professional growth: paid training programs.
- Adaptable work hours, plus a hybrid model combining remote and in-office work.
- Strategic career development: personalized growth plans and advancement opportunities.
- Tailored workspace setup: high-quality PC, monitor, keyboard, etc.
- Social Fridays: casual drinks fostering team camaraderie.

Did this job description make your day? Then send us your CV in English and get ready to meet our team!
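The posting above is R-first; purely as a language-neutral illustration of the "expose a forecasting model behind an API" pattern it describes, here is a minimal Python sketch. Flask, the endpoint shape, and the naive mean forecast are illustrative assumptions, not details from the posting.

```python
# Minimal sketch: exposing a forecasting model behind an HTTP API.
# Hypothetical example only - the posting centres on R; Flask and the
# naive forecast below are illustrative assumptions.
from statistics import mean

from flask import Flask, jsonify, request

app = Flask(__name__)

def naive_forecast(history: list[float], horizon: int) -> list[float]:
    """Forecast the next `horizon` periods as the mean of the history.
    A placeholder for the real statistical model the team maintains."""
    level = mean(history) if history else 0.0
    return [level] * horizon

@app.post("/forecast")
def forecast():
    payload = request.get_json(force=True)
    history = payload.get("demand", [])        # past demand per period
    horizon = int(payload.get("horizon", 4))   # periods to forecast
    return jsonify({"forecast": naive_forecast(history, horizon)})

if __name__ == "__main__":
    app.run(port=8080)
```

In practice the placeholder `naive_forecast` would be replaced by the team's actual model (in R, for example behind a plumber endpoint) while keeping the same request/response contract.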

Posted 4 days ago

Apply

8.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Summary
- Explore, develop, implement, and evaluate digital innovation solutions that address customer needs; co-create with key stakeholders to build partnerships and collaborations.
- Lead the delivery of multiple projects across a variety of creative and marketing channels, including print and digital media; develop and coordinate project plans across the design, development, and production stages of a project to support successful delivery within set KPIs.
- Work in collaboration with brand teams, technical teams, and all functions to maximize value.
- Provide consultancy, advice, and assistance on strategy for commercialization of products, and influence decision making by the Marketing/Marketing Sales Operations team on sales-force resource allocation in the most optimal way, through delivery of proven analytics-based projects.
- Provide analytics support to Novartis internal customers.

About The Role
Position Title: Team Leader
Location: Hyderabad

30,000! That is the number of Novartis field personnel that we impact through data and analytics. Work closely with the BIP Function Head (and in conjunction with the matrix Regional Account Director) to shape and develop the Novartis BIP function and address evolving business and customer needs.

Your Key Responsibilities Include, But Are Not Limited To
- Manage an efficient and high-quality BIP team that promotes synergy and best-practice sharing among resources, and drives collaboration with country organizations in managing high standards of communication and delivering best-in-class services.
- Ensure exemplary communication with all stakeholders, including internal ICS associates and global customers, through regular local and global updates focused on accomplishments, KPIs, best practices, staffing changes, key events, etc.
- Identify and resolve operational issues, clearly articulate potential recommendations/solutions to local or global managers/partners, and manage the number of escalations to the global office.
- Collaborate with media agencies and with the brand, media optimization and strategy, resource optimization, and advanced analytics teams to enable faster-to-insight data at the required grain.
- Be proactive in planning, anticipating change and acting accordingly; drive meticulous implementation of team goals and metrics.
- Groom and develop talent, implement succession planning, and mentor associates for higher responsibilities.
- Conduct performance appraisals of team members and manage the training needs of the group.

Essential Requirements For This Role Include
- 8-10 years of experience with advanced visualization skills combined with programming languages (R, Python); pharma analytics experience with a focus on the digital domain is good to have.
- Understanding of the US IM digital data ecosystem for integrating various data sources and building value through processed data.
- Knowledge of advanced media concepts, as well as data integrity and testing procedures for developing digital media outcomes.
- Hands-on knowledge of large-scale ETL (Extract, Transform, Load) and warehouse operations, and experience with big data lakes for managing large amounts of data in native, raw formats beneficial for analytics, business intelligence, and machine learning applications.
- Profound knowledge of Marketing Cloud Intelligence metrics to integrate data from marketing and advertising platforms, web analytics, CRM, e-commerce, etc., to optimize spend and customer engagement.

Why Novartis?
Our purpose is to reimagine medicine to improve and extend people's lives, and our vision is to become the most valued and trusted medicines company in the world. How can we achieve this? With our people. It is our associates who drive us each day to reach our ambitions. Be a part of this mission and join us! Learn more here: https://www.novartis.com/about/strategy/people-and-culture

Join our Novartis Network: If this role is not suitable to your experience or career goals but you wish to stay connected to hear more about Novartis and our career opportunities, join the Novartis Network here: https://talentnetwork.novartis.com/network

Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards

Posted 4 days ago

Apply

5.0 years

0 Lacs

India

On-site


Minimum of 5 years of experience in quality assurance, database testing, and data warehouse/ETL testing.
- Experience with data warehouses/data marts is required (a reconciliation-test sketch follows this listing).
- Experience with at least one industry ETL tool, such as Informatica, Ab Initio, or SSIS.
- Financial software testing experience with solid SQL/database skills is required; SQL Server and/or Oracle preferred.
- Knowledge of industry-standard best practices related to software testing.
- Experience creating test cases and scripts, using automated test tools, and testing client/server applications.
- Experience working on Agile projects.
- Experience handling large programs and teams.
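Since this listing centres on warehouse/ETL testing with solid SQL skills, a reconciliation test is a representative artifact. Below is a minimal pytest + SQLAlchemy sketch; the connection strings, table names, and business date are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch of an ETL reconciliation test, assuming two reachable
# databases and a pytest + SQLAlchemy setup; connection strings, table
# names, and columns are hypothetical.
import sqlalchemy as sa

SOURCE_URL = "oracle+oracledb://user:pwd@src-host/ORCL"  # assumption
TARGET_URL = (
    "mssql+pyodbc://user:pwd@dwh-host/DWH"
    "?driver=ODBC+Driver+17+for+SQL+Server"              # assumption
)

def scalar(url: str, query: str):
    """Run a single-value query and return the result."""
    engine = sa.create_engine(url)
    with engine.connect() as conn:
        return conn.execute(sa.text(query)).scalar()

def test_row_counts_match():
    # The warehouse load should be row-complete for the business date.
    src = scalar(SOURCE_URL, "SELECT COUNT(*) FROM trades WHERE trade_date = '2024-01-31'")
    tgt = scalar(TARGET_URL, "SELECT COUNT(*) FROM fact_trades WHERE trade_date = '2024-01-31'")
    assert src == tgt

def test_amount_totals_match():
    # Aggregate checksums catch truncated or mis-mapped numeric columns.
    src = scalar(SOURCE_URL, "SELECT SUM(notional) FROM trades WHERE trade_date = '2024-01-31'")
    tgt = scalar(TARGET_URL, "SELECT SUM(notional) FROM fact_trades WHERE trade_date = '2024-01-31'")
    assert abs(float(src) - float(tgt)) < 0.01
```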

Posted 4 days ago

Apply

16.0 years

0 Lacs

Kanayannur, Kerala, India

On-site


At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS - Data and Analytics (D&A) - Principal Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions such as Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity
We're looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in both delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm and of a growing Data and Analytics team.

Your Key Responsibilities
- Proven experience driving analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations across BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data. [16-20 years]
- Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc.
- Understand current- and future-state enterprise architecture.
- Contribute to various technical streams during project implementation.
- Provide product- and design-level technical best practices.
- Interact with senior client technology leaders to understand their business goals and to create, architect, propose, develop, and deliver technology solutions.
- Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
- Recommend design alternatives for data ingestion, processing, and provisioning layers.
- Design and develop batch data ingestion programs to process large data sets using Hive, Pig, Sqoop, and Spark.
- Develop programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies (see the streaming sketch after this listing).

Skills And Attributes For Success
- Experience architecting highly scalable solutions on Azure, AWS, and GCP.
- Strong understanding of and familiarity with Azure/AWS/GCP and big data ecosystem components.
- Strong understanding of underlying Azure/AWS/GCP and Hadoop architectural concepts and distributed computing paradigms.
- Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
- Hands-on experience with major components like cloud ETL tools, Spark, and Databricks.
- Experience working with at least one NoSQL data store: HBase, Cassandra, or MongoDB.
- Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
- Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks.
- Good knowledge of Apache Kafka and Apache Flume.
- Experience with enterprise-grade solution implementations.
- Experience in performance benchmarking of enterprise applications.
- Experience in data security (in transit and at rest).
- Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
- A flexible, proactive, self-motivated working style with strong personal ownership of problem resolution.
- Excellent communication skills (written and verbal, formal and informal).
- The ability to multi-task under pressure and work independently with minimal supervision.
- A team-player attitude and enjoyment of a cooperative, collaborative team environment.
- Adaptability to new technologies and standards.
- Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
- Responsibility for evaluating technical risks and mapping out mitigation strategies.
- Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
- Excellent business communication, consulting, and quality-process skills.
- Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains.
- A minimum of 7 years of hands-on experience in one or more of the above areas, and a minimum of 10 years of industry experience.

Ideally, you'll also have
- Strong project management skills.
- Client management skills.
- Solutioning skills.
- The innate quality to become the go-to person for marketing, pre-sales, and solution accelerators within the practice.

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
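For the real-time ingestion responsibility above (Kafka plus Spark Streaming), here is a minimal PySpark Structured Streaming sketch. The broker, topic, schema, and lake paths are hypothetical placeholders, not details from the posting.

```python
# Minimal PySpark Structured Streaming sketch: consume JSON events from
# Kafka and land them in a data lake path. Broker address, topic name,
# schema, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker1:9092")  # assumption
       .option("subscribe", "live-events")                 # assumption
       .option("startingOffsets", "latest")
       .load())

# Kafka delivers bytes; decode the value and parse the JSON payload.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
             .select(F.from_json("json", event_schema).alias("e"))
             .select("e.*"))

query = (events.writeStream
         .format("parquet")
         .option("path", "/lake/raw/live_events")          # assumption
         .option("checkpointLocation", "/lake/_chk/live_events")
         .outputMode("append")
         .start())

query.awaitTermination()
```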

Posted 4 days ago

Apply

16.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS - Data and Analytics (D&A) - Principal Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions such as Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity
We're looking for Senior Managers (GTM + Cloud/Big Data Architects) with strong technology and data understanding and proven capability in both delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm and of a growing Data and Analytics team.

Your Key Responsibilities
- Proven experience driving analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations across BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data. [16-20 years]
- Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc.
- Understand current- and future-state enterprise architecture.
- Contribute to various technical streams during project implementation.
- Provide product- and design-level technical best practices.
- Interact with senior client technology leaders to understand their business goals and to create, architect, propose, develop, and deliver technology solutions.
- Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
- Recommend design alternatives for data ingestion, processing, and provisioning layers.
- Design and develop batch data ingestion programs to process large data sets using Hive, Pig, Sqoop, and Spark (see the batch-ingestion sketch after this listing).
- Develop programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.

Skills And Attributes For Success
- Experience architecting highly scalable solutions on Azure, AWS, and GCP.
- Strong understanding of and familiarity with Azure/AWS/GCP and big data ecosystem components.
- Strong understanding of underlying Azure/AWS/GCP and Hadoop architectural concepts and distributed computing paradigms.
- Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
- Hands-on experience with major components like cloud ETL tools, Spark, and Databricks.
- Experience working with at least one NoSQL data store: HBase, Cassandra, or MongoDB.
- Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
- Solid understanding of ETL methodologies in a multi-tiered stack, integrating with big data systems like Cloudera and Databricks.
- Good knowledge of Apache Kafka and Apache Flume.
- Experience with enterprise-grade solution implementations.
- Experience in performance benchmarking of enterprise applications.
- Experience in data security (in transit and at rest).
- Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
- A flexible, proactive, self-motivated working style with strong personal ownership of problem resolution.
- Excellent communication skills (written and verbal, formal and informal).
- The ability to multi-task under pressure and work independently with minimal supervision.
- A team-player attitude and enjoyment of a cooperative, collaborative team environment.
- Adaptability to new technologies and standards.
- Participation in all aspects of the big data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
- Responsibility for evaluating technical risks and mapping out mitigation strategies.
- Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
- Excellent business communication, consulting, and quality-process skills.
- Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains.
- A minimum of 7 years of hands-on experience in one or more of the above areas, and a minimum of 10 years of industry experience.

Ideally, you'll also have
- Strong project management skills.
- Client management skills.
- Solutioning skills.
- The innate quality to become the go-to person for marketing, pre-sales, and solution accelerators within the practice.

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
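To complement the streaming pattern shown under the previous EY listing, the batch-ingestion responsibility above can be sketched as a simple Spark job that reads a landing-zone extract and writes partitioned Parquet. The file layout, column names, and paths are hypothetical placeholders.

```python
# Minimal PySpark batch-ingestion sketch: read a day's CSV extract,
# apply light typing/cleansing, and write partitioned Parquet. File
# layout, column names, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-batch-ingest").getOrCreate()

orders = (spark.read
          .option("header", True)
          .option("inferSchema", True)
          .csv("/landing/orders/2024-01-31/*.csv"))        # assumption

cleaned = (orders
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("load_date", F.to_date(F.lit("2024-01-31")))
           .dropDuplicates(["order_id"]))                  # idempotent reload

(cleaned.write
        .mode("overwrite")
        .partitionBy("load_date")
        .parquet("/lake/curated/orders"))                  # assumption
```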

Posted 4 days ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Role Description
Additional Comments / Job Description:
- Test that the number and types of joins/cardinalities created in the semantic model match the joins/cardinalities in the BO universe.
- Test the derived-table logic of the semantic model against the derived-table logic of the BO universe.
- Test the alias tables of the semantic model against the alias tables of the BO universe.
- Test that the numbers of measures, dimensions, and filters in the semantic model match those in the BO universe.
- Test that the definitions of measures, dimensions, and filters in the semantic model match those in the BO universe (a comparison-script sketch follows this listing).
- Test Power BI paginated reports, visual reports, and all other report types end to end, including formatting and design.
- Test the functionality of the parameters and filters created in each report.
- Test the measure, dimension, parameter, and filter logic/definitions used in each report.
- Test whether an ad hoc report built from the BO universe and the same ad hoc report built from the semantic model give similar output.

Skills: Quality Assurance, Business Objects, Semantic Analysis
Must have: Manual Testing (ETL, Power BI, SQL)
Good to have: SAP BO
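Much of the testing above reduces to comparing object inventories and definitions between the Power BI semantic model and the BO universe. Assuming each side's metadata has been exported to CSV (the file and column names below are hypothetical), a minimal comparison script might look like this:

```python
# Minimal sketch comparing metadata exported from a Power BI semantic
# model and a BO universe. Assumes each export is a CSV with columns:
# object_name, object_type (measure/dimension/filter), definition.
import csv
from collections import Counter

def load(path: str) -> dict[str, tuple[str, str]]:
    with open(path, newline="", encoding="utf-8") as f:
        return {r["object_name"]: (r["object_type"], r["definition"].strip())
                for r in csv.DictReader(f)}

semantic = load("semantic_model_objects.csv")   # assumption
universe = load("bo_universe_objects.csv")      # assumption

# 1. Counts per object type should match.
print("semantic:", Counter(t for t, _ in semantic.values()))
print("universe:", Counter(t for t, _ in universe.values()))

# 2. Objects present on only one side.
print("only in semantic model:", sorted(semantic.keys() - universe.keys()))
print("only in BO universe:  ", sorted(universe.keys() - semantic.keys()))

# 3. Shared objects whose definitions diverge.
for name in sorted(semantic.keys() & universe.keys()):
    if semantic[name][1] != universe[name][1]:
        print(f"definition mismatch: {name}")
```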

Posted 4 days ago

Apply

8.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Job Title: Tech Lead - Pre-Sales
Location: Delhi NCR, India (Hybrid)
Experience: 8+ years
Employment Type: Full-time
Industry: Technology Consulting & Data Analytics

About EXL
EXL is a global analytics and digital solutions company that partners with clients to improve business outcomes and unlock growth. We combine deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions.

Role Overview
We are seeking a dynamic Data Engineer - Tech Lead - Pre-Sales to spearhead our data engineering and GenAI initiatives and play a pivotal role in pre-sales activities. This hybrid role demands a blend of deep technical expertise in data engineering and the acumen to engage with clients, understand their needs, and craft compelling solutions that drive business growth. The candidate will also be instrumental in establishing thought leadership through newsletters, technical campaigns, and other marketing initiatives.

Key Responsibilities

Technical Expertise
- Maintain a strong understanding of data engineering concepts, tools, and best practices.
- Stay updated on emerging technologies in cloud platforms (AWS, Azure, GCP) and big data ecosystems.
- Provide technical insights to inform marketing content and client discussions.
- Leverage AI/ML and GenAI technologies to enhance data solutions and drive innovation.

Pre-Sales & Client Engagement
- Partner with sales teams to identify client needs and propose tailored data and AI solutions.
- Develop and deliver technical presentations, demonstrations, and proofs of concept to prospective clients.
- Assist in creating proposals, RFP responses, and other sales materials that align with client requirements.
- Engage in client meetings to understand requirements, address concerns, and build strong relationships.

Marketing & Thought Leadership
- Collaborate with the marketing team to develop go-to-market strategies for data and AI solutions.
- Author technical blogs, whitepapers, and case studies to establish thought leadership in the data engineering and AI domains.
- Design and lead technical campaigns, webinars, and workshops targeting potential clients and industry stakeholders.
- Develop newsletters and other marketing collateral that translate complex technical concepts into accessible content for diverse audiences.
- Engage in market research to identify trends, client needs, and opportunities for EXL's data and AI solutions.

Qualifications
Education: Bachelor's or Master's degree in Engineering or a related field.
Experience: Minimum of 8 years in data engineering, with at least 2 years in a role involving marketing or pre-sales activities.

Technical Skills
- Proficiency in at least one cloud platform: AWS, Azure, or GCP (Azure preferred).
- Strong experience with big data technologies such as Hadoop, Spark, and Kafka.
- Expertise in SQL and NoSQL databases.
- Familiarity with ETL tools and data integration techniques.
- Programming skills in languages like Python.
- Experience with LLMs and GenAI.

Marketing & Communication Skills
- Proven experience creating technical marketing content (blogs, whitepapers, case studies).
- Ability to develop and implement comprehensive marketing strategies that increase brand awareness as well as targeted user acquisition.
- Ability to translate complex technical concepts into accessible content for diverse audiences.
- Experience organizing and leading webinars, workshops, or technical campaigns.
- Ability to communicate complex ideas, anticipate potential objections, and persuade others, often at senior levels, to adopt a different point of view.
- Experience leading projects with notable risk and complexity; people-management experience encouraged.
- Demonstrated excellent oral, presentation, written, and interpersonal communication skills.
- Proven teamwork, thought leadership, and stakeholder collaboration with limited guidance.
- Demonstrated excellent time management and organizational skills; ability to work to deadlines under pressure while maintaining meticulous attention to detail.
- In-depth conceptual and practical knowledge of product management and go-to-market strategies.
- Proficiency in the Microsoft Office Suite (Excel, PowerPoint, Word, etc.).

Preferred Qualifications
- Experience in the consulting industry or working with cross-functional teams.
- Familiarity with data governance, security, and compliance standards.
- Certifications in cloud platforms or data engineering tools.
- Experience collaborating with marketing teams on technical content creation and campaigns.
- Experience with AI/ML frameworks; knowledge of AI/ML and GenAI applications in data engineering.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Overview
170+ Years Strong. Industry Leader. Global Impact.
At Pinkerton, the mission is to protect our clients. To do this, we provide enterprise risk management services and programs specifically designed for each client. Pinkerton employees are one of our most important assets and critical to the delivery of world-class solutions. Bonded together, we share a commitment to integrity, vigilance, and excellence. Pinkerton is an inclusive employer who seeks candidates with diverse backgrounds, experiences, and perspectives to join our family of industry subject matter experts.

The Frontend & BI Developer will be part of a high-performing and international team whose goal is to expand the Data & Analytics solutions for our CRM application, which is live in all Securitas countries. You will be responsible for the output to our end users, using Power BI as the tool. Designs need to be in line with the existing UX guidelines. You will communicate and advise on possible improvements in design, row-level security, calculated KPIs, or new insights using the Fabric functionality in Power BI. Continuous improvement requires the ability to think bigger and work closely with the whole team. The Frontend & BI Developer collaborates with the Data Engineer (ETL specialist) to align on the BI CEP deliverables. Together with input from the business, the Frontend & BI Developer advises the Global BI Manager and the CEP business owner on what is needed to deliver to the end user in a way that improves engagement and usability. The expectation is to always take data privacy into consideration when moving or sharing data; for that purpose, the security layer must be enhanced as agreed with the legal department.

Responsibilities
- Represent Pinkerton's core values of integrity, vigilance, and excellence.
- Build business logic using Microsoft Fabric.
- Enhance existing dashboards to make them visually better using Power BI.
- Manage access to the BI deliveries for users across different countries.
- Take ownership of resolving incoming tickets for both incidents and requests.
- Understand and participate in discussions related to the data model and dashboards, and propose solutions.
- Validate data on dashboards using SQL queries (see the validation sketch after this listing).
- Maintain code versions and handle deployments to DEV, TEST, and PROD.
- Maintain and develop relationships with stakeholders on both the IT and business sides.
- Determine the changes needed in the BI dashboards to accommodate new BI requirements.
- Manage the UX guidelines and adjust them when approved by the Global BI Manager.
- Advise the BI team, from existing expertise, on the best enhancements to improve our existing solution.
- Plan activities to stay close to the business and foresee coming changes to the BI deliveries.
- In cooperation with the Global BI Manager and the CEP business owner, manage end-user expectations and improve engagement in our data-driven journey.
- Improve teamwork across team members, using the DevOps tool to track the status of each deliverable from start to end.
- Ensure understanding and visible implementation of the company's core values of integrity, vigilance, and helpfulness.
- Know the skills and experience available and required in your area today and tomorrow, and liaise with other departments as needed.
- All other duties, as assigned.

Qualifications
- Must have 5 years of experience in Microsoft technologies (Power BI, Fabric, SQL Server) and front-end design.
- Mandatory: able to write SQL queries.
- Mandatory: experience with BI tools and systems such as Power BI.
- Expertise in data analysis/modelling; business analysis skills and a database background.
- Able to build complex measures using DAX in a tabular cube and to create security roles using DAX.
- Capable of implementing row-level security in a tabular model and Power BI.
- Able to develop reports, KPI scorecards, and dashboards using Power BI Desktop/Power BI Service and the available Fabric features.
- Analytical thinking for translating data into informative reports and visuals.
- Very good communication skills for working clearly with clients and business analysts.
- Knowledge of how sprints work in Azure DevOps.
- Experience working at the intersection between IT and business; good at assessing the difference between needs and wants.
- Skilled at time management in a standard working day.
- An analytical mindset with superb communication and problem-solving skills.
- Fluent in English, both spoken and written; knowledge of additional languages is a bonus.
- Ability to communicate, present, and influence credibly at all levels, both internally and externally.

Working Conditions
With or without reasonable accommodation, requires the physical and mental capacity to effectively perform all essential functions; regular computer usage; occasional reaching and lifting of small objects and operating office equipment; frequent sitting, standing, and/or walking; travel, as required.

Pinkerton is an equal opportunity employer to all applicants and positions without regard to race/ethnicity, color, national origin, ancestry, sex/gender, gender identity/expression, sexual orientation, marital/prenatal status, pregnancy/childbirth or related conditions, religion, creed, age, disability, genetic information, veteran status, or any protected status by local, state, federal or country-specific law.
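For the "validate data on dashboards using SQL queries" responsibility, a minimal sketch follows, assuming SQL Server behind the Power BI model; the connection string, table, and the KPI figure read off the dashboard are hypothetical placeholders.

```python
# Minimal sketch: validate a dashboard KPI against the warehouse with a
# direct SQL query. Connection string, table, and the expected figure
# pulled from the dashboard are hypothetical placeholders.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=bi-sql01;DATABASE=CRM_DW;Trusted_Connection=yes;"  # assumption
)

dashboard_value = 12_345_678.90  # figure shown on the Power BI card (example)

with pyodbc.connect(CONN_STR) as conn:
    row = conn.cursor().execute(
        """
        SELECT SUM(contract_value)
        FROM dbo.FactContracts          -- assumption
        WHERE report_month = '2024-01'
        """
    ).fetchone()

warehouse_value = float(row[0])
diff = abs(warehouse_value - dashboard_value)
print(f"warehouse={warehouse_value:,.2f} dashboard={dashboard_value:,.2f} diff={diff:,.2f}")
assert diff < 0.01, "Dashboard KPI does not reconcile with the warehouse"
```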

Posted 4 days ago

Apply

3.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Overview
170+ Years Strong. Industry Leader. Global Impact.
At Pinkerton, the mission is to protect our clients. To do this, we provide enterprise risk management services and programs specifically designed for each client. Pinkerton employees are one of our most important assets and critical to the delivery of world-class solutions. Bonded together, we share a commitment to integrity, vigilance, and excellence. Pinkerton is an inclusive employer who seeks candidates with diverse backgrounds, experiences, and perspectives to join our family of industry subject matter experts.

The Data Engineer will be part of a high-performing and international team whose goal is to expand the Data & Analytics solutions for our CRM application, which is live in all Securitas countries. Together with the dedicated Frontend & BI Developer, you will be responsible for managing and maintaining the Databricks-based BI platform; the processes from data model changes through the implementation and development of pipelines are part of the daily focus, but ETL will get most of your attention. Continuous improvement requires the ability to think bigger and work closely with the whole team. The Data Engineer (ETL specialist) will collaborate with the Frontend & BI Developer to align on possibilities to improve the BI platform deliverables, specifically for the CEP organization. Cooperation with other departments, such as integrations, specific IT/IS projects, and business specialists, is part of the job. The expectation is to always take data privacy into consideration when moving or sharing data; for that purpose, the security layer must be developed as agreed with the legal department.

Responsibilities
- Represent Pinkerton's core values of integrity, vigilance, and excellence.
- Maintain and develop the Databricks workspace used to host the BI CEP solution.
- Actively advise on the data model changes needed to accommodate new BI requirements.
- Develop and implement new ETL scripts and improve the current ones (see the sketch after this listing).
- Take ownership of resolving incoming tickets for both incidents and requests.
- Plan activities to stay close to the Frontend & BI Developer and foresee coming changes to the backend.
- Improve teamwork across team members, using the DevOps tool to track the status of each deliverable from start to end.
- Ensure understanding and visible implementation of the company's core values of integrity, vigilance, and helpfulness.
- Know the skills and experience available and required in your area today and tomorrow, and liaise with other departments as needed.
- All other duties, as assigned.

Qualifications
- At least 3+ years of experience in data engineering.
- Understanding of designing and implementing data processing architectures in Azure environments.
- Experience with different SSAS modelling techniques (preferably Azure/Databricks/Microsoft related).
- Understanding of data management and treatment to secure data governance and security (platform management and administration).
- An analytical mindset with clear communication and problem-solving skills.
- Experience working in a Scrum setup.
- Fluent in English, both spoken and written; knowledge of additional languages is a bonus.
- Ability to communicate, present, and influence credibly at all levels, both internally and externally.
- Business acumen and commercial awareness.

Working Conditions
With or without reasonable accommodation, requires the physical and mental capacity to effectively perform all essential functions; regular computer usage; occasional reaching and lifting of small objects and operating office equipment; frequent sitting, standing, and/or walking; travel, as required.

Pinkerton is an equal opportunity employer to all applicants and positions without regard to race/ethnicity, color, national origin, ancestry, sex/gender, gender identity/expression, sexual orientation, marital/prenatal status, pregnancy/childbirth or related conditions, religion, creed, age, disability, genetic information, veteran status, or any protected status by local, state, federal or country-specific law.
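Since ETL on a Databricks-based platform gets most of this role's attention, here is a minimal Delta Lake upsert sketch; the landing path, table name, and key column are hypothetical, and it assumes a runtime where the delta package is available.

```python
# Minimal Databricks/Delta sketch: incremental ETL that upserts the
# day's CRM extract into a Delta table. Paths, table name, and columns
# are hypothetical placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

updates = (spark.read.format("json")
           .load("/mnt/landing/crm/accounts/2024-01-31/")  # assumption
           .withColumn("ingested_at", F.current_timestamp()))

target = DeltaTable.forName(spark, "bi_cep.accounts")      # assumption

# MERGE keeps reloads idempotent: update changed rows, insert new ones.
(target.alias("t")
       .merge(updates.alias("s"), "t.account_id = s.account_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```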

Posted 4 days ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


LTIMindtree's QE team is seeking highly experienced and dynamic Associate Automation Architects.

Primary Skills
- Ability to work independently with client stakeholders.
- Ability to understand problem statements and provide solutions to clients with a consultative approach.
- Expert in Java/Selenium, with experience leading multiple projects and responsibility for creating, updating, and maintaining the automation framework.
- Experience in API/web services testing and automating API tests (see the API-test sketch after this listing).
- Exposure to other automation tools (Appium, Cypress, JMeter, etc.) will be helpful.
- Effective communication skills are mandatory.
- Experience with Agile methodology is mandatory.
- Experience in client handling and the offshore/onsite working model.
- Experience creating PoCs and providing demos to clients.
- Experience working to strict timelines.
- Experience working in a DevOps model with CI/CD pipelines.
- Strong experience developing and implementing test automation strategies and test automation plans.
- Good experience leading/managing CRM functional testers and automation testers.
- Good experience reviewing test automation scripts on Salesforce applications.
- Good experience in API testing, mobile testing, and interface testing.
- Good experience with Azure and JIRA/ALM test management tools.
- Experience working on Agile/Scrum projects and in-sprint automation.
- Experience successfully implementing quality guidelines, coding standards, and procedures.
- Strong experience in Selenium BDD automation with CI/CD testing.
- Good experience with CRM integration with e-commerce and ERP; ETL skills would be a plus.
- Strong experience in the Selenium BDD framework with Azure DevOps integration and implementation.

Secondary Skills
- Strong written and verbal communication and presentation skills.
- Strong ability to work with client stakeholders.
- Requirement review and work-effort estimation.
- Good interpersonal skills.

LTIMindtree is proud to be an equal opportunity employer. We are committed to equal employment opportunity regardless of race, ethnicity, nationality, gender, gender identity, gender expression, language, age, sexual orientation, religion, marital status, veteran status, socio-economic status, disability or any other characteristic protected by applicable law.
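As an illustration of the API-test automation the listing calls for, here is a minimal pytest + requests sketch (in Python rather than the Java/Selenium stack named above, purely for brevity); the endpoint and payload are hypothetical.

```python
# Minimal API-test sketch using pytest + requests, as one building block
# of an automation framework. The base URL and payload are hypothetical.
import requests

BASE_URL = "https://api.example.com"  # assumption

def test_create_contact_returns_201():
    resp = requests.post(
        f"{BASE_URL}/contacts",
        json={"name": "Asha Rao", "email": "asha@example.com"},
        timeout=10,
    )
    assert resp.status_code == 201
    body = resp.json()
    assert body["email"] == "asha@example.com"
    assert "id" in body  # the service is assumed to assign an id

def test_get_missing_contact_returns_404():
    resp = requests.get(f"{BASE_URL}/contacts/does-not-exist", timeout=10)
    assert resp.status_code == 404
```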

Posted 4 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Company Overview
Viraaj HR Solutions is dedicated to providing exceptional staffing solutions with a focus on transforming workforce capabilities. Our mission is to connect talented professionals with innovative organizations, ensuring mutual success through strategic partnerships. We value integrity, dedication, and excellence in service, fostering a culture of collaboration and growth. As a leading solutions provider in the HR realm, we aim to empower our clients by optimizing their human resource functionality with skilled personnel.

Role Responsibilities
- Develop and maintain dashboards and reports in Tableau.
- Collaborate with stakeholders to understand data requirements and business objectives.
- Extract, transform, and load data from various sources into Tableau (see the data-prep sketch after this listing).
- Design and implement data visualizations that convey actionable insights.
- Ensure data accuracy and consistency in visualizations.
- Optimize Tableau performance to enhance user experience.
- Participate in data gathering and requirement analysis.
- Provide training and support for end users on Tableau functionality.
- Document processes and procedures related to data management.
- Analyze and interpret complex datasets to identify trends and opportunities.
- Work closely with cross-functional teams to drive business intelligence initiatives.
- Stay updated on the latest trends in data visualization and Tableau capabilities.
- Troubleshoot and resolve issues related to Tableau workbooks and data sources.
- Assist in data preparation and validation tasks.
- Participate in project meetings and provide status updates on work progress.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Tableau Developer or in a similar role.
- Strong understanding of data visualization best practices.
- Experience with SQL for data extraction and reporting.
- Proficiency in data modeling and ETL processes.
- Ability to analyze data and provide actionable insights.
- Strong problem-solving and analytical skills.
- Excellent communication skills for stakeholder interaction.
- Ability to work effectively in a team-oriented environment.
- Familiarity with additional BI tools is a plus.
- Detail-oriented, with a focus on data accuracy.
- Knowledge of dashboard design principles and techniques.
- Willingness to stay updated on industry trends and technologies.
- Ability to manage multiple projects simultaneously.
- Prior experience in a fast-paced work environment.
- Strong organizational and time management skills.

Skills: Tableau, SQL, ETL, data visualization, data modeling, dashboard design, data analysis, business intelligence, BI tools, data accuracy, communication, teamwork, problem-solving, organizational skills, time management
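For the ETL responsibility above, here is a minimal pandas sketch of preparing and validating an extract before Tableau consumes it; the file names, columns, and CSV hand-off are hypothetical assumptions (in practice this might be a database connection or a Hyper extract instead).

```python
# Minimal sketch of preparing a clean extract for a Tableau data source,
# assuming pandas and a CSV hand-off; file names and columns are
# hypothetical placeholders.
import pandas as pd

raw = pd.read_csv("sales_raw.csv", parse_dates=["order_date"])  # assumption

clean = (raw.dropna(subset=["order_id", "order_date"])
            .drop_duplicates(subset="order_id")
            .assign(revenue=lambda d: d["units"] * d["unit_price"]))

# Guardrails before the hand-off: Tableau should never see bad rows.
assert clean["order_id"].is_unique
assert (clean["revenue"] >= 0).all()

# Tableau connects to this file (or a database/Hyper extract in practice).
clean.to_csv("sales_for_tableau.csv", index=False)
```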

Posted 4 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

Remote


Your Future Evolves Here
Evolent Health has a bold mission to change the health of the nation by changing the way health care is delivered. Our pursuit of this mission is the driving force that brings us to work each day. We believe in embracing new ideas, challenging ourselves and failing forward. We respect and celebrate individual talents and team wins. We have fun while working hard, and Evolenteers often make a difference working in everything from scrubs to jeans.

Are we growing? Absolutely, and globally. In 2021 we grew our teams by almost 50% and continue to grow even more in 2022. Are we recognized as a company you are supported by for your career and growth, and a great place to work? Definitely. Evolent Health International (Pune, India) was certified as a "Great Place to Work" in 2021. In 2020 and 2021, Evolent in the U.S. was named to the Best Company for Women to Advance list by Parity.org and earned a perfect score on the Human Rights Campaign (HRC) Foundation's Corporate Equality Index (CEI). This index is the nation's foremost benchmarking survey and report measuring corporate policies and practices related to LGBTQ+ workplace equality.

We recognize employees that live our values, give back to our communities each year, and are champions for bringing our whole selves to work each day. If you're looking for a place where your work can be personally and professionally rewarding, don't just join a company with a mission. Join a mission with a company behind it.

What You'll Be Doing
Who You'll Be Working With: Evolent is seeking a skilled and experienced SQL Server DBA to design, develop, and maintain large-scale data processing systems and databases. We're searching for a skilled individual with experience in both SQL Server database administration and cloud platform management. You'll possess a robust understanding of database administration, data modelling, data pipeline development, and optimisation techniques. The candidate will work closely with our enterprise data teams and other IT professionals to implement and optimise database and data processing solutions that meet business needs.

Responsibilities:
- Collaborate with the enterprise data teams and other IT professionals to develop, administer, and support various data platforms, including cloud-based and on-premises systems.
- Design, develop, and maintain SQL Server databases, including installing, configuring, upgrading, and maintaining production and lower database environments.
- Create and manage database objects such as tables, stored procedures, functions, and views.
- Apply hands-on PowerShell scripting.
- Develop and implement data pipelines, data models, and data warehouses using SQL Server Integration Services (SSIS) and cloud tools like Azure Data Factory.
- Monitor and troubleshoot database and data processing issues, and implement high availability and disaster recovery procedures (see the monitoring sketch after this listing).
- Optimise database and data processing performance using techniques such as indexing, partitioning, and query optimisation.
- Ensure data security and compliance with database and data policies and procedures.
- Provide 24x7x365 support for any incidents impacting application availability within the production environments.
- Implement best practices for database and data processing design, development, and management.
- Provide technical guidance and support to other IT professionals and stakeholders.
- Stay up to date with the latest technologies and trends in databases and data processing.
- Other duties as assigned.

The Experience We Prefer:
- A bachelor's degree in computer science, information technology, or a related field.
- At least 10-15 years of experience as a SQL Server DBA and with cutting-edge database technologies.
- Strong knowledge of database administration, data modelling, and data pipeline development.
- At least 10-15 years of experience with SQL Server technologies and tools, including T-SQL, SSIS, and SSAS.
- Strong knowledge of data warehousing, ETL processes, and big data technologies.
- Knowledge of cloud platforms such as Azure or AWS.
- Ability to write efficient SQL queries and optimise data processing performance.
- Basic knowledge of data visualisation and reporting tools such as Power BI, Tableau, or QlikView.
- Hands-on experience with Windows and Linux platforms.
- A production mindset, with concern for performance, availability, and managed change.
- Basic knowledge of data security and compliance regulations such as GDPR, HIPAA, or PCI DSS.
- Strong problem-solving skills, with the ability to analyse complex data problems and develop effective solutions.
- Good communication and collaboration skills, with the ability to work effectively with technical and non-technical stakeholders.
- Ability to work with a US-based team to complete essential functions and projects.
- Willingness to develop additional roles and functions over time as the team engages in new endeavours, to best align with the company's overall needs and goals.
- Ability to adjust the focus of work and pivot as needed to best meet goals for the continued success and growth of the organisation, including engaging with new business units and/or other opportunities within the organisation to assist in operationalizing and/or optimizing tasks based upon organisational needs.

Academic Qualification
A bachelor's degree in computer science, information technology, or a related field.

Mandatory Requirements:
Employees must have a high-speed broadband internet connection with a minimum speed of 50 Mbps and the ability to set up a wired connection to their home network to ensure effective remote work. These requirements may be updated as needed by the business.

Evolent Health is an equal opportunity employer and considers all qualified applicants equally without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, or disability status.
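For the monitoring and index-optimisation duties above, here is a minimal Python sketch that flags fragmented indexes via the sys.dm_db_index_physical_stats DMV; the connection string and the 30% threshold are hypothetical choices, not Evolent's standards.

```python
# Minimal monitoring sketch: flag fragmented indexes on a SQL Server
# instance via the sys.dm_db_index_physical_stats DMV. Connection
# string and threshold are hypothetical; requires an ODBC driver.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=prod-sql01;DATABASE=AppDb;Trusted_Connection=yes;"  # assumption
)

QUERY = """
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name                     AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') ips
JOIN sys.indexes i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30   -- rebuild candidates
  AND i.name IS NOT NULL
ORDER BY ips.avg_fragmentation_in_percent DESC;
"""

with pyodbc.connect(CONN_STR) as conn:
    for table, index, frag in conn.cursor().execute(QUERY):
        print(f"{table}.{index}: {frag:.1f}% fragmented -> consider REBUILD")
```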

Posted 4 days ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Company Overview
Viraaj HR Solutions is a dynamic HR consultancy dedicated to connecting talent with opportunity. Our mission is to enhance workforce efficiency and support organizations in achieving their goals through strategic recruitment and talent management. Our team values integrity, collaboration, and innovation, and we work diligently to align the right talent with the right role, ensuring a great fit for both clients and candidates.

Role Responsibilities
- Develop and manage data pipelines on the Databricks platform.
- Optimize and maintain data processing workflows using Apache Spark.
- Implement ETL processes to integrate data from various sources.
- Collaborate with data engineers and analysts to design data models.
- Write optimized SQL queries for data retrieval and analysis.
- Utilize Python for scripting and automation tasks.
- Monitor and troubleshoot data processing jobs for performance issues (see the quality-check sketch after this listing).
- Work with cloud technologies (Azure/AWS) to enhance data solutions.
- Conduct data analytics to derive actionable insights.
- Implement version control mechanisms for code management.
- Participate in code reviews and contribute to documentation.
- Stay updated on the latest features and best practices of Databricks.
- Provide technical support and guidance to team members.
- Engage in collaborative projects to enhance data quality.
- Participate in strategy meetings to align data projects with business goals.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 2+ years of experience in data engineering or development roles.
- Strong proficiency with the Databricks platform.
- Experience with Apache Spark and its components.
- Solid understanding of database management systems and SQL.
- Knowledge of Python for data manipulation and analytics.
- Familiarity with ETL tools and data integration techniques.
- Experience with cloud platforms such as AWS or Azure.
- Ability to work collaboratively in cross-functional teams.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills, both verbal and written.
- Prior experience in data analysis and visualization is a plus.
- Understanding of data governance and security best practices.
- A proactive approach to learning new technologies.
- Experience using version control software like Git.

Skills: Databricks, Apache Spark, Python, SQL, ETL, data integration, data analytics, cloud technologies (Azure/AWS), version control (Git), data governance and security
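For the "monitor and troubleshoot data processing jobs" responsibility, here is a minimal PySpark data-quality sketch that profiles null rates before a load is promoted; the table, columns, and threshold are hypothetical placeholders.

```python
# Minimal PySpark data-quality sketch: profile null rates before a load
# is promoted, failing fast when a pipeline degrades. Table and column
# names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

df = spark.table("staging.customer_orders")          # assumption
total = df.count()
assert total > 0, "staging table is empty"

null_rates = df.select([
    (F.sum(F.col(c).isNull().cast("int")) / total).alias(c)
    for c in ("customer_id", "order_id", "amount")   # key columns (assumed)
]).first().asDict()

for column, rate in null_rates.items():
    print(f"{column}: {rate:.2%} null")
    assert rate < 0.01, f"null rate too high in {column}"  # threshold assumption
```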

Posted 4 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Company Description
Decision Point Analytics develops analytics and big data solutions for CPG, retail, and consumer-focused industries, working with global Fortune 500 clients. The company provides analytical insights and solutions for developing sales and marketing strategies in the retail and CPG industry by leveraging diverse sources of data. Decision Point was founded by Ravi Shankar and his classmates from IIT Madras, with diverse experience across the CPG and marketing analytics domains.

Role Description
This is a full-time Senior Data Engineer role at Decision Point Analytics, located in Chennai. The Senior Data Engineer will be responsible for data engineering, data modeling, ETL processes, data warehousing, and data analytics on-site.

Qualifications
- Data engineering and data modeling skills
- Experience with ETL (Extract, Transform, Load) processes
- Data warehousing expertise
- Data analytics proficiency
- Strong problem-solving and analytical skills
- Proficiency in database technologies like SQL and NoSQL
- Experience with big data technologies like Hadoop and Spark
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field

Posted 4 days ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


This is a Data Engineer/Data Stewardship position on the C360 Ingestion and Stewardship team in GDIA, managing customer data ingestion for the C360 Core Platform to build customer knowledge and purpose-built data products, and to make data available for enterprise users to generate insights, product development, and AI capabilities, leveraging Enterprise Data Platform (Google Cloud Platform) tools and services. This position will play a key role in landing data from diverse internal and external source-system applications into the enterprise data lake, ensuring data quality, privacy, security, and compliance, and collaborating with business, data governance, and EDP teams (a load-job sketch follows this listing).

Skills Required: Python, SQL, Google Cloud Platform, ETL, Tekton
Skills Preferred: LLM, AI
Experience Required: 7+ years of relevant experience
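As a sketch of the landing step described above, here is a minimal load from Cloud Storage into BigQuery using the google-cloud-bigquery client; the project, bucket, dataset, and schema are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch of landing a source extract into BigQuery with the
# google-cloud-bigquery client. Bucket, dataset, and schema are
# hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="c360-example")        # assumption

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    schema=[
        bigquery.SchemaField("customer_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("email", "STRING"),
        bigquery.SchemaField("updated_at", "TIMESTAMP"),
    ],
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://c360-landing/customers/2024-01-31/*.csv",     # assumption
    "c360-example.raw.customers",
    job_config=job_config,
)
load_job.result()  # block until the load finishes; raises on failure

table = client.get_table("c360-example.raw.customers")
print(f"Loaded; table now has {table.num_rows} rows")
```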

Posted 4 days ago

Apply

10.0 - 15.0 years

22 - 37 Lacs

Bengaluru

Work from Office


Who We Are At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. The Role At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. As GCP Data Engineer at Kyndryl, you will be responsible for designing and developing data pipelines, participating in architectural discussions, and implementing data solutions in a cloud environment using GCP data services. You will collaborate with global architects and business teams to design and deploy innovative solutions, supporting data analytics, automation, and transformation needs. Responsibilities: Design, develop, and maintain scalable data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage. Participate in architectural discussions, conduct system analysis, and suggest optimal solutions that are scalable, future-proof, and aligned with business requirements. Collaborate with stakeholders to gather requirements and create high-level and detailed technical designs. Design data models suitable for both transactional and big data environments, supporting Machine Learning workflows. Build and optimize ETL/ELT infrastructure using a variety of data sources and GCP services. Develop and maintain Python / PySpark for data processing and integrate with GCP services for seamless data operations. Develop and optimize SQL queries for data analysis and reporting. Monitor and troubleshoot data pipeline issues to ensure timely resolution. Implement data governance and security best practices within GCP. Perform data quality checks and validation to ensure accuracy and consistency. Support DevOps automation efforts to ensure smooth integration and deployment of data pipelines. Provide design expertise in Master Data Management (MDM), Data Quality, and Metadata Management. Provide technical support and guidance to junior data engineers and other team members. Participate in code reviews and contribute to continuous improvement of data engineering practices. Implement best practices for cost management and resource utilization within GCP. If you're ready to embrace the power of data to transform our business and embark on an epic data adventure, then join us at Kyndryl. Together, let's redefine what's possible and unleash your potential. Your Future at Kyndryl Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here. Who You Are You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. 
And finally, you’re open and borderless – naturally inclusive in how you work with others.

Required Technical and Professional Experience:
- Bachelor’s or master’s degree in computer science, engineering, or a related field, with over 8 years of experience in data engineering.
- More than 3 years of experience with the GCP data ecosystem.
- Strong hands-on proficiency in GCP components such as Dataflow, Dataproc, BigQuery, Cloud Functions, Composer, and Data Fusion.
- Excellent command of SQL, with the ability to write complex queries and perform advanced data transformation.
- Strong programming skills in PySpark and/or Python, specifically for building cloud-native data pipelines.
- Familiarity with GCP tools like Looker, Airflow DAGs, Data Studio, App Maker, etc.
- Hands-on experience implementing enterprise-wide cloud data lake and data warehouse solutions on GCP.
- Knowledge of data governance, security, and compliance best practices.
- Experience with private and public cloud architectures, their pros/cons, and migration considerations.
- Excellent problem-solving, analytical, and critical thinking skills.
- Ability to manage multiple projects simultaneously while maintaining a high level of attention to detail.
- Communication skills: must be able to communicate with both technical and nontechnical audiences and derive technical requirements from stakeholders.
- Ability to work independently and in agile teams.

Preferred Technical and Professional Experience:
- GCP Data Engineer Certification is highly preferred.
- Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization.
- Experience working as a Data Engineer and/or in cloud modernization.
- Knowledge of Databricks or Snowflake for data analytics.
- Experience with NoSQL databases.
- Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Familiarity with BI dashboards and Google Data Studio is a plus.

Being You
Diversity is a whole lot more than what we look like or where we come from; it’s how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we’re not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That’s the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone who works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.

Posted 4 days ago

Apply

3.0 - 5.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

Remote

Linkedin logo

The Role: In this role, you will collaborate with the Team Leader, Scrum Master, data analysts, and developers to build technology solutions for Morningstar platforms. You should have hands-on experience with Core Java and Python, as well as experience with component-based architectures and the ability to create scalable, flexible technical solutions. You will create new components, support existing systems, study their enterprise complexities, and develop and implement better systems using modern software development practices. Developing a good understanding of existing systems on other platforms and their databases is a first step.

Responsibilities:
- Design, develop, and maintain scalable ETL pipelines for data extraction, transformation, and loading.
- Write efficient SQL queries and stored procedures to manage and manipulate large datasets in SQL Server.
- Implement data validation and integrity checks to ensure accuracy across all pipelines.
- Collaborate with business stakeholders and business analysts to deliver data solutions that support business goals.
- Work with Python to automate data workflows and integrate with third-party systems.
- Optimize data pipelines for performance, scalability, and cost-efficiency.
- Troubleshoot, debug, and resolve issues related to data processing.
- Leverage AWS services for cloud-based ETL processes (e.g., S3, Lambda, Glue) and storage solutions.
- Collaborate in an agile environment with product managers, data analysts, and engineering teams.

Must Have Skills: These are the most important skills, qualities, etc. that we’d like for this role.
- 3-5 years of experience in Data Engineering or a related field.
- Proficiency in Python for building data pipelines and automation scripts.
- Strong experience with SQL Server, including complex queries, stored procedures, and optimization techniques.
- Expertise in ETL processes and data modeling.
- Knowledge of version control and CI/CD pipelines for data projects.
- Good to have: Java, Spring, and MongoDB.

Qualifications:
- Bachelor’s degree in computer science, Information Technology, or a related field.
- Strong analytical and problem-solving skills.
- Good communication and teamwork abilities.

Morningstar’s hybrid work environment gives you the opportunity to work remotely and collaborate in person each week. We’ve found that we’re at our best when we’re purposely together on a regular basis, at least three days each week. A range of other benefits are also available to enhance flexibility as needs change. No matter where you are, you’ll have tools and resources to engage meaningfully with your global colleagues.

Legal Entity: Morningstar India Private Ltd. (Delhi)

Posted 4 days ago

Apply

3.0 - 5.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Naukri logo

Overview
We are an integral part of Annalect Global and Omnicom Group, one of the largest media and advertising agency holding companies in the world. Omnicom’s branded networks and numerous specialty firms provide advertising, strategic media planning and buying, digital and interactive marketing, direct and promotional marketing, public relations, and other specialty communications services. Our agency brands are consistently recognized as being among the world’s creative best. Annalect India plays a key role for our group companies and global agencies by providing stellar products and services in areas of Creative Services, Technology, Marketing Science (data & analytics), Market Research, Business Support Services, Media Services, and Consulting & Advisory Services. We are growing rapidly and looking for talented professionals like you to be part of this journey. Let us build this, together.

Responsibilities
- Gather requirements and evaluate clients’ business situations in order to implement appropriate analytic solutions.
- Design, generate, and manage reporting frameworks that provide insight into the performance of clients’ marketing activities across multiple channels.
- Be the single point of contact on anything data and analytics related to the project.
- QA process: create, maintain, and review QA plans for deliverables to align with the requirements, identify discrepancies if any, and troubleshoot issues.
- Prioritize tasks and proactively manage workload to ensure timely delivery with high accuracy.
- Contribute actively to project planning and scheduling.
- Create and maintain project-specific documents such as process, quality, and learning documents.
- Drive conversations with the team, clients, and business stakeholders.

Qualifications
- 6-9 years' experience in data management and analysis in media or relevant domains, with strong problem-solving ability.
- Strong SQL skills are mandatory, along with good exposure to ETL tools like Alteryx, visualization skills in Tableau, and storytelling capabilities.
- Prior knowledge of the media/advertising domain would be beneficial.
- Datorama API capability and media domain knowledge are good to have (not mandatory).
- Prior experience with AWS (S3 and Redshift) and GBQ is good to have, as is exposure to other ETL tools.
- Strong knowledge of media metrics, custom calculations, and metric correlation.
- Ability to identify and help determine key performance indicators for clients.
- Strong written and verbal communication skills.
- Experience leading delivery teams and projects to successful implementations.
- Familiarity working with large data sets and creating cohesive stories.
- Able to work and lead successfully with teams, handling multiple projects and meeting timelines.
- Maintaining positive client and vendor relationships.
- Presentation skills using MS PowerPoint or other presentation platforms.

Posted 4 days ago

Apply

2.0 years

0 Lacs

Greater Kolkata Area

On-site

Linkedin logo

Company Overview
Viraaj HR Solutions is a dynamic HR consultancy dedicated to connecting talent with opportunity. Our mission is to enhance workforce efficiency and support organizations in achieving their goals through strategic recruitment and talent management. Our team values integrity, collaboration, and innovation, and we work diligently to align the right talent with the right role, ensuring a great fit both for clients and candidates.

Role Responsibilities
- Develop and manage data pipelines on the Databricks platform.
- Optimize and maintain data processing workflows using Apache Spark.
- Implement ETL processes to integrate data from various sources.
- Collaborate with data engineers and analysts to design data models.
- Write optimized SQL queries for data retrieval and analysis.
- Utilize Python for scripting and automation tasks.
- Monitor and troubleshoot data processing jobs for performance issues.
- Work with cloud technologies (Azure/AWS) to enhance data solutions.
- Conduct data analytics to derive actionable insights.
- Implement version control mechanisms for code management.
- Participate in code reviews and contribute to documentation.
- Stay updated with the latest features and best practices of Databricks.
- Provide technical support and guidance to team members.
- Engage in collaborative projects to enhance data quality.
- Participate in strategy meetings to align data projects with business goals.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 2+ years of experience in data engineering or development roles.
- Strong proficiency in the Databricks platform.
- Experience with Apache Spark and its components.
- Solid understanding of database management systems and SQL.
- Knowledge of Python for data manipulation and analytics.
- Familiarity with ETL tools and data integration techniques.
- Experience with cloud platforms such as AWS or Azure.
- Ability to work collaboratively in cross-functional teams.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills, both verbal and written.
- Prior experience in data analysis and visualization is a plus.
- Understanding of data governance and security best practices.
- A proactive approach to learning new technologies.
- Experience in using version control software like Git.

Skills: Python scripting, Databricks, version control (Git), SQL, cloud technologies (Azure/AWS), data governance and security, ETL, Apache Spark, data analytics, data integration

Posted 4 days ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site

Linkedin logo

Company Overview
Viraaj HR Solutions is a dynamic HR consulting firm dedicated to optimizing human resource management and development. Our mission is to bridge the gap between talent and opportunity, driving growth and success for both our clients and candidates. We foster a culture of collaboration, innovation, and integrity, consistently striving to deliver exceptional service in the evolving landscape of human resources.

Role Responsibilities
- Design, develop, and implement ETL processes using Talend.
- Collaborate with data analysts and stakeholders to understand data requirements.
- Perform data cleansing and transformation tasks.
- Optimize and automate existing data integration workflows.
- Monitor ETL jobs and troubleshoot issues as they arise.
- Conduct performance tuning of Talend jobs for efficiency.
- Document ETL processes and maintain technical documentation.
- Work closely with cross-functional teams to support data needs.
- Ensure data integrity and accuracy throughout the ETL process.
- Stay updated with Talend best practices and upcoming features.
- Assist in the migration of data from legacy systems to new platforms.
- Participate in code reviews to ensure code quality and adherence to standards.
- Engage in user training and support as necessary.
- Provide post-implementation support for deployed solutions.
- Evaluate and implement new data tools and technologies.

Qualifications
- 3+ years of experience as a Talend Developer.
- Strong understanding of ETL principles and practices.
- Proficiency in Talend Open Studio.
- Hands-on experience with SQL and database management.
- Familiarity with data warehousing concepts.
- Experience using Java for Talend scripting.
- Knowledge of APIs and web services.
- Effective problem-solving skills.
- Strong communication and collaboration abilities.
- Ability to work independently and as part of a team.
- Attention to detail and accuracy in data handling.
- Experience with job scheduling tools.
- Ability to manage multiple priorities and deadlines.
- Knowledge of data modeling concepts.
- Experience in documentation and process mapping.

Skills: data cleansing, data warehousing, job scheduling tools, problem solving, team collaboration, SQL, documentation, Talend Open Studio, data transformation, data modeling, performance tuning, web services, API development, Java, data integration, ETL processes

Posted 4 days ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site

Linkedin logo

Company Overview
Viraaj HR Solutions is a leading recruitment agency specializing in connecting top talent with premier companies across India. Our mission is to facilitate meaningful career opportunities while maintaining a commitment to integrity, excellence, and client satisfaction. Our culture promotes collaboration and innovation, ensuring both clients and candidates receive the highest level of service.

Job Title: Ab Initio Developer
Location: India (On-Site)

Role Responsibilities
- Design and develop ETL processes using Ab Initio.
- Create and manage data workflows to ensure data accuracy and quality.
- Collaborate with data architects and analysts to gather requirements.
- Optimize existing ETL processes for performance improvements.
- Perform debugging and troubleshooting of ETL jobs and workflows.
- Implement data cleansing and transformation processes.
- Monitor and maintain ETL systems and troubleshoot issues as they arise.
- Generate documentation for all ETL processes and workflows.
- Assist in data migration projects and data integration efforts.
- Work closely with business stakeholders to identify data needs.
- Participate in code reviews and ensure best practices are followed.
- Update and maintain metadata repositories as required.
- Provide training and support to junior developers and team members.
- Develop and execute unit test cases to validate ETL processes.
- Stay updated with the latest Ab Initio features and enhancements.

Qualifications
- Bachelor’s degree in Computer Science, IT, or a related field.
- 3+ years of experience in Ab Initio development.
- Strong knowledge of ETL processes and data warehousing concepts.
- Proficient in SQL for data manipulation and querying.
- Experience with Unix/Linux operating systems.
- Familiarity with data modeling concepts and practices.
- Ability to work in a fast-paced, collaborative environment.
- Strong analytical and problem-solving skills.
- Excellent verbal and written communication skills.
- Experience with performance tuning of ETL jobs.
- Ability to handle multiple projects simultaneously.
- Keen attention to detail.
- Experience with version control tools such as Git.
- Knowledge of data governance and security practices.
- Ability to work independently with minimal supervision.
- Willingness to learn new technologies and frameworks.

We invite passionate and skilled Ab Initio Developers to join us at Viraaj HR Solutions. This on-site role in India offers an exciting opportunity to work on innovative projects and contribute to impactful data solutions.

Skills: data modeling, version control (Git), data governance, performance tuning, SQL, troubleshooting, data warehousing, ETL, data processing, analytical thinking, team collaboration, data analysis, data workflows, Ab Initio, Unix/Linux

Posted 4 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Company Overview
Viraaj HR Solutions is a leading recruitment consultancy focused on connecting businesses with top talent across various industries. Our mission is to deliver exceptional HR solutions tailored to the unique needs of our clients, contributing to their success through strategic hiring practices. We value integrity, commitment, and excellence in our work culture, ensuring a supportive environment for both our clients and candidates.

Role Responsibilities
- Design and implement robust data pipelines using Python and PySpark.
- Develop and maintain data models that support organizational analytics and reporting.
- Work closely with data scientists and analysts to understand data requirements and translate them into technical specifications.
- Integrate and maintain Snowflake for data warehousing solutions.
- Ensure data quality and integrity through effective ETL processes.
- Conduct data profiling and performance tuning to optimize system performance.
- Collaborate with cross-functional teams to define data architecture standards and best practices.
- Participate in the creation of documentation for data flows and data management best practices.
- Monitor data pipelines and troubleshoot issues as they arise.
- Implement security measures to protect sensitive data.
- Stay updated with the latest trends and technologies in data engineering.
- Assist in migrating existing data solutions to cloud-based infrastructures.
- Support continuous improvement initiatives around data management.
- Provide technical guidance and mentorship to junior data engineers.
- Participate in code reviews and adhere to best practices in software development.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 3 years of experience in data engineering or a related role.
- Proficient in Python programming and the PySpark framework.
- Experience with Snowflake or similar cloud data warehousing platforms.
- Strong understanding of ETL principles and data integration techniques.
- Solid understanding of database design and data modeling concepts.
- Excellent SQL skills for querying databases and data analysis.
- Familiarity with cloud platforms like AWS, Azure, or Google Cloud.
- Ability to work collaboratively in cross-functional teams.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Experience with version control systems (e.g., Git).
- Knowledge of Agile methodologies and project management.
- A commitment to continuous learning and professional development.
- Ability to work on multiple projects simultaneously and meet deadlines.

Skills: data architecture, ETL, Git, problem-solving, Snowflake, Python, data engineering, data warehousing, cloud computing, data integration, SQL, data modeling, PySpark, Agile methodologies, cloud platforms (AWS, Azure, Google Cloud)

Posted 4 days ago

Apply

6.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Linkedin logo

Job Summary:
We are looking for a skilled and motivated Software Engineer with strong experience in data engineering and ETL processes. The ideal candidate should be comfortable working with any object-oriented programming language, possess strong SQL skills, and have hands-on experience with AWS services like S3 and Redshift. Experience in Ruby and working knowledge of Linux are a plus.

Key Responsibilities:
- Design, build, and maintain robust ETL pipelines to handle large volumes of data.
- Work closely with cross-functional teams to gather data requirements and deliver scalable solutions.
- Write clean, maintainable, and efficient code using object-oriented programming and SOLID principles.
- Optimize SQL queries and data models for performance and reliability.
- Use AWS services (S3, Redshift, etc.) to develop and deploy data solutions.
- Troubleshoot issues in data pipelines and perform root cause analysis.
- Collaborate with DevOps/infra teams for deployment, monitoring, and scaling data jobs.

Required Skills:
- 6+ years of experience in data engineering.
- Programming: proficiency in any object-oriented language (e.g., Java, Python, etc.). Bonus: experience in Ruby is a big plus.
- SQL: moderate to advanced skills in writing complex queries and handling data transformations.
- AWS: must have hands-on experience with services like S3 and Redshift.
- Linux: familiarity with Linux-based systems is good to have.

Preferred Qualifications:
- Experience working in a data/ETL-focused role.
- Familiarity with version control systems like Git.
- Understanding of data warehouse concepts and performance tuning.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Linkedin logo

Job Title: Fullstack Developer (Java + Angular)
Experience Required: 5+ Years
Location: Gandhinagar / Ahmedabad / Pune
Joining: Immediate Joiners Preferred

Job Description:
We are seeking experienced and passionate Fullstack Developers with a strong foundation in Java and Angular to join our growing team. The ideal candidate will be proficient in backend and frontend technologies and capable of delivering high-quality software solutions in a collaborative environment.

Key Responsibilities:
- Develop, test, and maintain web-based applications using Java (Spring Boot) and Angular.
- Implement RESTful APIs and microservices using Spring MVC/web services.
- Build responsive UI components using Angular 8+ / React 16+, Angular Material, and Bootstrap.
- Work on backend logic, ORM using Hibernate, and reporting tools like Jasper Reports.
- Write efficient SQL and PL/SQL scripts for Oracle databases.
- Utilize Pentaho Kettle for ETL jobs.
- Use GIT for version control and collaboration.
- Apply design patterns and development best practices.
- Perform basic Linux scripting and server-side troubleshooting.
- Collaborate with cross-functional teams to define, design, and ship new features.

Required Skills & Experience:
- Bachelor's degree in Computer Science or a related field.
- 5+ years of hands-on experience in: Java 8+, Spring Boot, Spring MVC, Hibernate, Web Services; Angular 8+ or React 16+, HTML5, CSS3, SCSS, Bootstrap; Oracle SQL, PL/SQL; Pentaho Kettle (ETL); Jasper Reports; GIT, design patterns; basic Linux scripting and troubleshooting.

If interested, please submit your CV to Khushboo@Sourcebae.com or share it via WhatsApp at 8827565832.

Stay updated with our latest job opportunities and company news by following us on LinkedIn: https://www.linkedin.com/company/sourcebae

Posted 4 days ago

Apply

3.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Summary
We are looking for a talented and motivated Data Engineer with 3 to 6 years of experience to join our data engineering team. The ideal candidate will have strong SQL skills, hands-on experience with Snowflake, ETL tools like Talend, DBT for transformation workflows, and a solid foundation in AWS cloud services.

Key Responsibilities
- Design, build, and maintain efficient and reliable data pipelines using SQL, Talend, and DBT.
- Develop and optimize complex SQL queries for data extraction and transformation.
- Manage and administer Snowflake data warehouse environments.
- Collaborate with analytics, product, and engineering teams to understand data requirements.
- Implement scalable data solutions on AWS (e.g., S3, Lambda, Glue, Redshift, EC2).
- Monitor and troubleshoot data workflows and ensure data quality and accuracy.
- Support deployment and version control of data models and transformations.
- Write clear documentation and contribute to best practices.

Required Skills and Qualifications
- 3 to 6 years of experience in data engineering or related fields.
- Strong expertise in SQL and performance tuning of queries.
- Hands-on experience with Snowflake (data modeling, security, performance tuning).
- Proficiency with Talend for ETL development.
- Experience with DBT (Data Build Tool) for transformation workflows.
- Good knowledge of AWS services, especially data-centric services (S3, Lambda, Glue, etc.).
- Familiarity with Git-based version control and CI/CD practices.
- Strong analytical and problem-solving skills.

Posted 4 days ago

Apply

Exploring ETL Jobs in India

The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for their thriving tech industries and often have a high demand for ETL professionals.

Average Salary Range

The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.

Career Path

In the ETL field, a typical career path may include roles such as:
- Junior ETL Developer
- ETL Developer
- Senior ETL Developer
- ETL Tech Lead
- ETL Architect

As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.

Related Skills

Alongside ETL, professionals in this field are often expected to have skills in:
- SQL
- Data Warehousing
- Data Modeling
- ETL Tools (e.g., Informatica, Talend)
- Database Management Systems (e.g., Oracle, SQL Server)

Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.
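
To make the ETL concept itself concrete, below is a minimal sketch of the extract-transform-load pattern in Python. It assumes a local CSV file as the source and SQLite as the target; the file, table, and column names (customers.csv, customers, customer_id, email) are illustrative placeholders, not a reference to any specific tool or job posting above.

```python
# Minimal ETL sketch: extract rows from a CSV file, apply a simple
# cleansing transform, and load them into a SQLite table.
# All file, table, and column names are hypothetical examples.
import csv
import sqlite3

def extract(path):
    # Extract: stream records out of the source file one at a time.
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    # Transform: drop records missing the business key, normalize email casing.
    for row in rows:
        if not row.get("customer_id"):
            continue
        row["email"] = (row.get("email") or "").strip().lower()
        yield row

def load(rows, db_path="warehouse.db"):
    # Load: idempotent upsert keyed on customer_id, so reruns are safe.
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS customers (customer_id TEXT PRIMARY KEY, email TEXT)"
    )
    con.executemany(
        "INSERT OR REPLACE INTO customers (customer_id, email) VALUES (?, ?)",
        ((r["customer_id"], r["email"]) for r in rows),
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("customers.csv")))
```

Production tools like Informatica or Talend add scheduling, monitoring, and connectors on top of this same basic shape, but the extract/transform/load separation shown here is the core idea interviewers probe for.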

Interview Questions

Here are 25 interview questions that you may encounter in ETL job interviews:

  • What is ETL and why is it important? (basic)
  • Explain the difference between ETL and ELT processes. (medium)
  • How do you handle incremental loads in ETL processes? (medium; a sketch of one common approach follows this list)
  • What is a surrogate key in the context of ETL? (basic)
  • Can you explain the concept of data profiling in ETL? (medium)
  • How do you handle data quality issues in ETL processes? (medium)
  • What are some common ETL tools you have worked with? (basic)
  • Explain the difference between a full load and an incremental load. (basic)
  • How do you optimize ETL processes for performance? (medium)
  • Can you describe a challenging ETL project you worked on and how you overcame obstacles? (advanced)
  • What is the significance of data cleansing in ETL? (basic)
  • How do you ensure data security and compliance in ETL processes? (medium)
  • Have you worked with real-time data integration in ETL? If so, how did you approach it? (advanced)
  • What are the key components of an ETL architecture? (basic)
  • How do you handle data transformation requirements in ETL processes? (medium)
  • What are some best practices for ETL development? (medium)
  • Can you explain the concept of change data capture in ETL? (medium)
  • How do you troubleshoot ETL job failures? (medium)
  • What role does metadata play in ETL processes? (basic)
  • How do you handle complex transformations in ETL processes? (medium)
  • What is the importance of data lineage in ETL? (basic)
  • Have you worked with parallel processing in ETL? If so, explain your experience. (advanced)
  • How do you ensure data consistency across different ETL jobs? (medium)
  • Can you explain the concept of slowly changing dimensions in ETL? (medium)
  • How do you document ETL processes for knowledge sharing and future reference? (basic)
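
As flagged above, the incremental-load question deserves a worked sketch, since the same high-watermark idea also underlies change data capture and data consistency questions. The Python example below is a hedged illustration, not a canonical implementation: it uses SQLite for both source and target, and the table and column names (orders, updated_at, etl_watermark) are assumptions made purely for the example.

```python
# High-watermark incremental load: copy only rows whose updated_at is
# newer than the timestamp recorded after the previous successful run.
# All table and column names are assumptions for illustration.
import sqlite3

def incremental_load(src_path: str, tgt_path: str) -> int:
    src = sqlite3.connect(src_path)
    tgt = sqlite3.connect(tgt_path)
    tgt.execute(
        "CREATE TABLE IF NOT EXISTS etl_watermark (tbl TEXT PRIMARY KEY, last_ts TEXT)"
    )
    tgt.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT)"
    )

    row = tgt.execute(
        "SELECT last_ts FROM etl_watermark WHERE tbl = 'orders'"
    ).fetchone()
    watermark = row[0] if row else "1970-01-01T00:00:00"

    # Extract only the delta since the last run.
    changed = src.execute(
        "SELECT order_id, amount, updated_at FROM orders WHERE updated_at > ?",
        (watermark,),
    ).fetchall()

    # Upsert the delta and advance the watermark in a single transaction,
    # so a failed run never records progress it did not make.
    with tgt:
        tgt.executemany(
            "INSERT OR REPLACE INTO orders (order_id, amount, updated_at) VALUES (?, ?, ?)",
            changed,
        )
        if changed:
            new_mark = max(r[2] for r in changed)
            tgt.execute(
                "INSERT OR REPLACE INTO etl_watermark (tbl, last_ts) VALUES ('orders', ?)",
                (new_mark,),
            )
    src.close()
    tgt.close()
    return len(changed)
```

Keeping the watermark in the target and advancing it in the same transaction as the upsert is what makes reruns idempotent, which is the property interviewers usually want you to call out.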

Closing Remarks

As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
