5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About The Role
We're seeking an experienced Infrastructure Engineer to join our platform team, handling massive-scale data processing and analytics infrastructure that supports over 5B events and more than 5M DAU. We're looking for someone who can help us scale gracefully while optimizing for performance, cost, and resiliency.

Key Responsibilities
- Design, implement, and manage our AWS infrastructure, with a strong emphasis on automation, resiliency, and cost-efficiency.
- Develop and oversee scalable data pipelines (for event processing, transformation, and delivery).
- Implement and manage stream processing frameworks (such as Kinesis, Kafka, or MSK); a sketch of this kind of consumer code follows this description.
- Handle orchestration and ETL workloads, employing services like AWS Glue, Athena, Databricks, Redshift, or Apache Airflow.
- Implement robust network, storage, and backup strategies for growing workloads.
- Monitor, debug, and resolve production issues related to data and infrastructure in real time.
- Implement IAM controls, logging, alerts, and security best practices across all components.
- Provide deployment automation (Docker, Terraform, CloudFormation) and collaborate with application engineers to enable smooth delivery.
- Build SOPs for support and set up a functioning 24x7 support system (including hiring the right engineers) to ensure system uptime and availability.

Required Technical Skills
- 5+ years of experience with AWS services (VPC, EC2, S3, Security Groups, RDS, Kinesis, MSK, Redshift, Glue).
- Experience designing and managing large-scale data pipelines with high-throughput workloads.
- Ability to handle workloads of 5 billion events/day and 1M+ concurrent users gracefully.
- Familiarity with scripting (Python, Terraform) and automation practices (Infrastructure as Code).
- Familiarity with network fundamentals, Linux, scaling strategies, and backup routines.
- Collaborative team player, able to work with engineers, data analysts, and stakeholders.

Preferred Tools & Technologies
- AWS: EC2, S3, VPC, Security Groups, RDS, Redshift, DocumentDB, MSK, Glue, Athena, CloudWatch
- Infrastructure as Code: Terraform, CloudFormation
- Scripted automation: Python, Bash
- Container orchestration: Docker, ECS or EKS
- Workflow orchestration: Apache Airflow, Dagster
- Streaming frameworks: Apache Kafka, Kinesis, Flink
- Other: Linux, Git, security best practices (IAM, Security Groups, ACM)

Education
- Bachelor's/Master's degree in Computer Science, Data Science, or a related field
- Relevant professional certifications in cloud platforms or data technologies

Why Join Us?
- Opportunity to work in a fast-growing audio and content platform.
- Exposure to multi-language marketing and global user base strategies.
- A collaborative work environment with a data-driven and innovative approach.
- Competitive salary and growth opportunities in marketing and growth strategy.

Success Metrics
✅ Scalability: handle 1+ billion events/day with low latency and high resiliency.
✅ Cost-efficiency: reduce AWS operational costs by optimizing services, storage, and data transfer.
✅ Uptime/SLI: achieve 99.9999% platform and pipeline uptime with automated fallback mechanisms.
✅ Data delivery latency: reduce event delivery latency to under 5 minutes for real-time processing.
✅ Security and compliance: implement controls to pass PCI-DSS or SOC 2 audits with zero major findings.
✅ Developer productivity: improve team delivery speed through self-service IaC modules and automated routines.
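To make the stream-processing responsibility concrete, here is a minimal, illustrative sketch of a Kinesis consumer in Python. The region, stream name, and single-shard assumption are hypothetical; a production consumer for billions of events/day would use the KCL or enhanced fan-out rather than polling one shard.

```python
import boto3

kinesis = boto3.client("kinesis", region_name="ap-south-1")  # hypothetical region

def read_latest_records(stream_name: str, limit: int = 100):
    """Poll the newest records from the first shard of a Kinesis stream."""
    shard_id = kinesis.describe_stream(StreamName=stream_name)[
        "StreamDescription"]["Shards"][0]["ShardId"]
    iterator = kinesis.get_shard_iterator(
        StreamName=stream_name,
        ShardId=shard_id,
        ShardIteratorType="LATEST",  # read only records arriving from now on
    )["ShardIterator"]
    return kinesis.get_records(ShardIterator=iterator, Limit=limit)["Records"]

for record in read_latest_records("events-stream"):  # hypothetical stream name
    print(record["PartitionKey"], record["Data"])
```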
About KUKU
Founded in 2018, KUKU is India's leading storytelling platform, offering a vast digital library of audio stories, short courses, and microdramas. KUKU aims to be India's largest cultural exporter, bringing stories, culture, and history to the world, with a firm belief in "Create In India, Create For The World". We deliver immersive entertainment and education through our OTT platforms: Kuku FM, Guru, Kuku TV, and more. With a mission to provide high-quality, personalized stories across genres, formats, and languages, KUKU continues to push boundaries and redefine India's entertainment industry.
🌐 Website: www.kukufm.com
📱 Android App: Google Play
📱 iOS App: App Store
🔗 LinkedIn: KUKU
📢 Ready to make an impact? Apply now
Skills: aws services, bash, networking, kafka, data pipelines, docker, kinesis, etl, terraform, automation, aws, security, ec2, cloudformation, cloud, scripting, linux, infrastructure, amazon redshift, python, vpc, network fundamentals, workflow orchestration, stream processing frameworks, container orchestration, dagster, airflow, s3
Posted 1 day ago
3.0 - 6.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
About The Role
Grade Level (for internal use): 09

The Role: As a Software Developer with the Data & Research Development team, you will be responsible for developing and providing backend support across a variety of products within the Market Intelligence platform. Together, you will build scalable and robust solutions using Agile development methodologies with a focus on high availability to end users.

The Team: Do you love to collaborate and provide solutions? This team comes together across eight different locations every single day to craft enterprise-grade applications that serve a large customer base with growing demand and usage. You will use a wide range of technologies and cultivate a collaborative environment with other internal teams.

The Impact: We focus primarily on developing, enhancing, and delivering required pieces of information and functionality to internal and external clients in all client-facing applications. You will have a highly visible role where even small changes have very wide impact.

What's in it for you?
- Opportunities for innovation and learning new state-of-the-art technologies
- To work in pure agile and scrum methodology

Responsibilities
- Deliver solutions within a multi-functional Agile team
- Develop expertise in our proprietary enterprise software products
- Set and maintain a level of excitement in using various technologies to develop, support, and iteratively deploy real enterprise-level software
- Achieve an understanding of customer environments and their use of the products
- Build solutions architecture, algorithms, and designs for solutions that scale to the customer's enterprise/global requirements
- Apply software engineering practices and implement automation across all elements of solution delivery

Basic Qualifications
What we're looking for:
- 3-6 years of desktop application development experience with a deep understanding of design patterns and object-oriented programming.
- Hands-on development experience using C#, .NET 4.0/4.5, WPF, ASP.NET, and SQL Server.
- Strong OOP and Service-Oriented Architecture (SOA) knowledge.
- Strong understanding of cloud applications (containers, Docker, etc.); exposure to data ETL will be a plus.
- Ability to resolve serious performance-related issues through various techniques, including testing, debugging, and profiling.
- Strong problem-solving, analytical, and communication skills.
- Possess a true "roll up the sleeves and get it done" working approach; demonstrated success as a problem solver, operating as a client-focused self-starter.

Preferred Qualifications
- Bachelor's degree in computer science or computer engineering

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?
Our Purpose
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow.
At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People
We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values
Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits
We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.

Our Benefits Include
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring And Opportunity At S&P Global
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)
Job ID: 313152
Posted On: 2025-05-05
Location: Hyderabad, Telangana, India
Posted 1 day ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
The ideal candidate will be responsible for designing, developing, and deploying scalable ETL processes using Informatica PowerCenter to support our data warehousing and analytics initiatives. You will collaborate with business and technical stakeholders to ensure high data quality, availability, and performance.

Key Responsibilities:
- Design, develop, and maintain ETL workflows and mappings using Informatica PowerCenter or Informatica Intelligent Cloud Services (IICS).
- Extract, transform, and load data from various source systems (e.g., SQL Server, Oracle, flat files, cloud APIs) into data warehouses or operational data stores.
- Optimize ETL performance, conduct tuning, and ensure error handling and logging.
- Collaborate with data architects and analysts to understand data requirements and deliver high-quality data solutions.
- Work with QA teams to support data validation and testing efforts.
- Support data integration, migration, and transformation initiatives.
- Document ETL processes, data flows, and job schedules.
- Monitor daily ETL jobs and resolve production issues in a timely manner.

Requirements
- Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent work experience).
- 3+ years of experience with Informatica PowerCenter or Informatica IICS.
- Strong SQL skills and experience with relational databases (e.g., Oracle, SQL Server, PostgreSQL).
- Solid understanding of data warehousing concepts and dimensional modeling.
- Experience in performance tuning and troubleshooting ETL processes.
- Hands-on experience with job scheduling tools (e.g., Autosys, Control-M, Tidal).
- Familiarity with version control systems and DevOps practices.

Preferred Qualifications:
- Experience with cloud data platforms (e.g., Snowflake, AWS Redshift, Azure Synapse).
- Exposure to data governance and data quality tools.
- Knowledge of scripting languages (e.g., Shell, Python).
- Experience working in Agile/Scrum environments.
- Familiarity with BI tools (e.g., Tableau, Power BI) is a plus.

Benefits
This position comes with a competitive compensation and benefits package:
- Competitive salary and performance-based bonuses
- Comprehensive benefits package
- Home Office model
- Career development and training opportunities
- Flexible work arrangements (remote and/or office-based)
- Dynamic and inclusive work culture within a globally known group
- Private Health Insurance
- Pension Plan
- Paid Time Off
- Training & Development
*Note: Benefits differ based on employee level
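PowerCenter mappings are built in the Designer GUI rather than written as code, but the extract-transform-load cycle they implement can be sketched in the Python the posting lists under scripting skills. Everything below (connection strings, table names, columns) is a hypothetical illustration, not a description of any specific pipeline:

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical source and target connections.
source = create_engine("oracle+oracledb://user:pw@src-host:1521/?service_name=SRC")
target = create_engine("postgresql+psycopg2://user:pw@dwh-host:5432/dwh")

# Extract: pull orders from the source system.
orders = pd.read_sql(
    "SELECT order_id, customer_id, amount, order_date FROM orders", source
)

# Transform: standardize types and stamp the load time.
orders["order_date"] = pd.to_datetime(orders["order_date"])
orders["load_ts"] = pd.Timestamp.now(tz="UTC")

# Load: append into a warehouse staging table.
orders.to_sql("stg_orders", target, if_exists="append", index=False)
```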
Posted 1 day ago
1.0 - 2.6 years
0 Lacs
Hyderabad, Telangana, India
Remote
Summary
Position Summary
ServiceNow Configurator/Developer (Analyst) – Deloitte Support Services India Private Limited

Solutions Delivery-Canada is an integral part of the Information Technology Services group. The principal focus of this organization is the development and maintenance of technology solutions that e-enable the delivery of Function and Marketplace Services and Management Information Systems. Solutions Delivery Canada develops and maintains solutions built on varied technologies like Salesforce, Microsoft technologies, SAP, Hadoop, ETL, BI, ServiceNow, Power Automate, and OpenText. Solutions Delivery Canada has various groups which provide best-of-breed solutions to clients by following a streamlined system development methodology. Solutions Delivery Canada comprises groups like Usability, Application Architecture, Development, and Quality Assurance and Performance.

Work you'll do
- Create, configure, and customize ServiceNow applications for new and existing implementations.
- Create and configure functional data such as Notifications and Service Level Agreements.
- Create and configure script objects such as Business Rules, Script Includes, UI Policies and Actions, Client Scripts, and ACLs.
- Set up interfaces between ServiceNow and other platforms in line with integration opportunities identified by Solution Architects.
- Perform system and integration testing.
- Recommend administration settings and best practices.
- Create documentation of developments, unit test cases, and implementation plans.
- Work effectively in diverse teams within an inclusive team culture where people are recognized for their contribution.

Responsibilities
Strategic
- Strong technical and remote collaboration skills are critical to this role.
- Demonstrates an ability to deliver on project commitments.
- Produces work that consistently meets quality standards.
- Must have hands-on experience in the ITSM & ITBM modules of ServiceNow.
- Must have knowledge of UI Builder and Workspace configuration in ServiceNow.
- Should have hands-on experience in Business Rules, Script Includes, ACLs, and all server-side scripting, following best practice.
- Knowledge of domain separation in ServiceNow is a plus.

Operational
- Design, development, and implementation of ServiceNow customizations including, but not limited to, core setup, workflow administration, reporting, data imports, custom scripting, and third-party software integrations.
- Should have a good understanding of Agile/SAFe methodologies.
- Perform advanced customizations including Business Rules, UI Pages, UI Macros, UI Scripts, Script Includes, Client Scripts, workflows, custom tables, reports, etc.
- Perform workflow design, configuration, development, and data loads for ServiceNow platform applications.
- Responsible for programming workflows, enhancements, and integrations with ServiceNow platform applications.
- Should have REST/SOAP Web Services integration experience.
- Good to have knowledge of the following ServiceNow applications:
  - Discovery (on-premises & off-premises)
  - ServiceNow Orchestration
  - ITOM (IT Operations Management)
  - SPM/ITBM (IT Business Management)
  - HRSD (HR Service Delivery Fundamentals)
  - ServiceNow Event Management
  - Integration and ServiceNow scripting (Glide, JavaScript, Ajax, XML, JSON, HTML, and CSS)
- Maintain pace with ServiceNow versioning.
- Perform upgrades and customizations of ServiceNow platform applications based on guidance from the project manager, architects, ITIL practice leads, and customers.
- Maintain and adhere to source code, configuration management, release management, and software development best practices.
- Develop training materials and provide end-user or IT technician training on using ServiceNow functionality.
- Provide in-person support daily to customers and the team; this will include direct interaction with executive staff and other key management.
- Maintain ServiceNow training and knowledge through self-learning, conferences, and training.
- Responsible for proactive problem and risk management.
- Triage and fix defects found in the ServiceNow platform, applications, and workflows.
- Define and validate non-functional (technical) requirements and establish traceability between requirements and application architecture/design.
- End-to-end ownership of solutioning for current and new opportunities (from requirement analysis to proposal delivery).
- Work with SMEs, leads, managers, resources, and project/delivery managers (in case of specific inputs for a solution) on finalizing the solution and estimates.
- Work with project/delivery managers to build proofs of concept (POCs), prototypes, and sample developments.
- Work with project/delivery managers to devise the timeline/schedule for executing the project.
- Act as a bridge between the client and the delivery team during the transition of won opportunities, and support the delivery team in the initial stages of the Discovery Phase, including discovery agenda finalization, facilitation material preparation, dry runs, and actual engagement.
- Ensure timely and quality delivery of opportunities.
- Should have a good understanding of, and stay up to date on, the latest ServiceNow releases, features, and issues.
- Should always align with best practices and strive toward innovative solutions.
- Should have a deep understanding of ITIL processes and be able to relate them to stakeholder requirements.

Experience: 1-2.6 years
Work location: Hyderabad
Shift timings: 11 am - 8 pm

Key Technical Skills, Experience and Knowledge
- At least 2-4 years of ServiceNow experience, including custom development and configuration.
- ServiceNow scripting experience using JavaScript, HTML, CSS, XML, and REST/SOAP Web Services.
- Understanding and experience of Business Rules, Script Includes, UI Actions, and all scripted aspects of ServiceNow.
- Customize the ServiceNow UI and Service Portal through use of UI Pages, CMS, CSS, and Service Portal widgets.
- Strong knowledge of integrations and migrations.
- Deep understanding of ITIL.
- Strong understanding of ServiceNow administration settings.
- Deep functional and technical knowledge of the ServiceNow platform, as well as experience delivering medium to large-scale ServiceNow implementations.
- Performs well in an agile environment with constant feedback and interaction with the team.
- Ability to accurately estimate level of effort/duration on projects and tasks.
- A positive attitude and the perseverance required to troubleshoot/resolve complex technical issues while balancing multiple priorities.
- Demonstrated ability to troubleshoot technical issues.
- Strong knowledge of the application development life cycle.
- Executes design activities leveraging knowledge of all application design techniques; ensures design is consistent with solution architecture; ensures adherence to design standards; performs technology proofs of concept to support design approaches.
- Executes construction of solutions leveraging knowledge of designated programming language(s), ensuring consistency with the proposed design approach; initiates peer reviews of system code; establishes standards and leading practices.
- Experience working with geographically distributed and culturally diverse work groups.
- Strong written and verbal communication skills with the ability to present to IT and business leaders.
- Demonstrated ability to stay current with development best practices, existing and emerging technology platforms, and industry trends.
- Experience with formal software development methodologies, with a focus on Agile.

Certifications
ServiceNow Certified System Administrator is a must. ServiceNow Certified Implementation Specialist & CAD are a great bonus.

Essential Competencies
- High degree of technical expertise in relevant areas
- Team orientation and team leadership
- Motivated team player willing to learn from others
- Analytical, logical, thorough, and methodical
- Problem management skills
- Able to work without supervision, using initiative to be creative in solution design
- Excellent interpersonal manner and communication skills; customer-focused

Education/Other: Bachelor's Degree

Recruiting tips
From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development
From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 302821
Posted 1 day ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company
Qualcomm India Private Limited

Job Area
Information Technology Group, Information Technology Group > Systems Analysis

General Summary
We are seeking a Systems Analyst, Senior to join our growing organization, with specialized skills in IBM Planning Analytics/TM1 and a functional understanding of Finance budgeting and forecasting. This role involves advanced development, troubleshooting, and implementation of TM1 solutions to meet complex business requirements. The person will be part of the Finance Planning and Reporting team, will work closely with his/her manager, and will help deliver the TM1 planning and budgeting roadmap for global stakeholders.

Key Responsibilities
- Design and develop IBM Planning Analytics (TM1) solutions per standards.
- Write logical, complex, concise, efficient, and well-documented code for both TM1 rules and Turbo Integrator processes.
- Good to have knowledge of Python and the TM1py libraries (a brief sketch follows this posting).
- Write business requirement specifications, define levels of effort for projects/enhancements, and design and coordinate system tests to ensure solutions meet business requirements.
- SQL skills to work with source data and understand source data structures; a good understanding of SQL and the ability to write complex queries.
- Understanding of cloud technologies, especially AWS and Databricks, is an added advantage.
- Experience in client reporting and dashboard tools like Tableau, PA Web, and PAfE.
- Understanding of ETL processes and data manipulation.
- Working independently with little supervision.
- Taking responsibility for own work and making decisions that are moderate in impact; errors may have financial impact or effect on projects, operations, or customer relationships; errors may require involvement beyond the immediate work group to correct.
- Provide ongoing system support, including troubleshooting and resolving issues to ensure optimal system performance and reliability.
- Using verbal and written communication skills to convey information that may be complex to others who may have limited knowledge of the subject in question.
- Using deductive and inductive problem solving; multiple approaches may be taken/necessary to solve the problem; often information is missing or incomplete; intermediate data analysis/interpretation skills may be required.
- Exercising substantial creativity to innovate new processes, procedures, or work products within guidelines or to achieve established objectives.

Minimum Qualifications
- 3+ years of IT-relevant work experience with a Bachelor's degree, OR
- 5+ years of IT-relevant work experience without a Bachelor's degree.

Qualifications
The ideal candidate will have 8-10 years of experience in designing, modeling, and developing enterprise performance management (EPM) applications using IBM Planning Analytics (TM1).
- Able to design and develop IBM Planning Analytics (TM1) solutions per standards.
- Able to write logical, complex, concise, efficient, and well-documented code for both TM1 rules and Turbo Integrator processes.
- Lead the design, modeling, and development of TM1 applications, including TI scripting, MDX, rules, feeders, and performance tuning.
- Should be able to provide technical expertise in identifying, evaluating, and developing systems and procedures that are efficient, cost-effective, and meet user requirements.
- Plans and executes unit, integration, and acceptance testing.
- Must be a good team player who can work seamlessly with global teams and data teams.
- Excellent communication and collaboration skills to work with business stakeholders.
- Functional understanding of Finance budgeting and forecasting.
- Understanding of cloud technologies, especially AWS and Databricks, is an added advantage.
- Experience in Agile methodologies and JIRA user stories.
- Able to design and develop solutions using Python per standards.

A bachelor's or master's degree in information science, computer science, or business, or equivalent work experience, is required.

Applicants: Qualcomm is an equal opportunity employer. If you are an individual with a disability and need an accommodation during the application/hiring process, rest assured that Qualcomm is committed to providing an accessible process. You may e-mail disability-accomodations@qualcomm.com or call Qualcomm's toll-free number found here. Upon request, Qualcomm will provide reasonable accommodations to support individuals with disabilities to be able to participate in the hiring process. Qualcomm is also committed to making our workplace accessible for individuals with disabilities. (Keep in mind that this email address is used to provide reasonable accommodations for individuals with disabilities. We will not respond here to requests for updates on applications or resume inquiries.)

Qualcomm expects its employees to abide by all applicable policies and procedures, including but not limited to security and other requirements regarding protection of Company confidential information and other confidential and/or proprietary information, to the extent those requirements are permissible under applicable law.

To all Staffing and Recruiting Agencies: Our Careers Site is only for individuals seeking a job at Qualcomm. Staffing and recruiting agencies and individuals being represented by an agency are not authorized to use this site or to submit profiles, applications or resumes, and any such submissions will be considered unsolicited. Qualcomm does not accept unsolicited resumes or applications from agencies. Please do not forward resumes to our jobs alias, Qualcomm employees or any other company location. Qualcomm is not responsible for any fees related to unsolicited resumes/applications. If you would like more information about this role, please contact Qualcomm Careers.

3076094
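As a small illustration of the Python/TM1py skill the posting mentions, here is a minimal sketch that connects to a TM1 server and runs an MDX query. The server address, credentials, cube, and MDX are hypothetical, and the exact method path can differ slightly between TM1py versions:

```python
from TM1py import TM1Service

# Hypothetical connection details for a TM1/Planning Analytics server.
with TM1Service(address="tm1-host", port=12354,
                user="admin", password="secret", ssl=True) as tm1:
    mdx = """
    SELECT {[Version].[Budget]} ON COLUMNS,
           {[Account].Members} ON ROWS
    FROM [Finance]
    """
    # Returns a mapping of coordinate tuples to cell dictionaries.
    cells = tm1.cubes.cells.execute_mdx(mdx)
    for coordinates, cell in cells.items():
        print(coordinates, cell["Value"])
```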
Posted 1 day ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Our Client is one of the United States' largest insurers, providing a wide range of insurance and financial services products, with gross written premiums well over US$25 billion (P&C). They proudly serve more than 10 million U.S. households holding more than 19 million individual policies across all 50 states, through the efforts of over 48,000 exclusive and independent agents and nearly 18,500 employees. Finally, our client is part of one of the largest insurance groups in the world.

Role Overview
The purpose of this role is to ensure smooth operation of our production data assets. Activities include monitoring production systems for incidents, alerting the applicable parties when incidents arise, and incident triaging and management, as well as work to prevent production incidents. The Data Production Support Analyst plays a crucial role in ensuring the smooth operation of our production data assets and overall operational efficiency, and ensures the reliability and accuracy of our data production processes. This role requires a blend of technical expertise, data acumen, problem-solving skills, the ability to work under pressure, and the ability to work collaboratively with various teams.

Responsibilities
- Works with the offshore Application Operations team.
- Administers, analyzes, and prioritizes systems issues and negotiates a course of action for resolution.
- Supports workflow and solutions; troubleshoots user errors and supports reporting capabilities.
- Utilizes system monitoring utilities to monitor system availability.
- Extracts and compiles system monitoring data to create availability scorecards and reports.
- System Monitoring: Continuously monitor IT systems to ensure optimal performance and availability, identifying and addressing potential issues before they escalate.
- Monitoring and Maintenance: Regularly monitor production data assets to ensure they are functioning correctly and efficiently, alerting the applicable parties if an issue arises in production. A sketch of a simple automated check of this kind follows this posting.
- Issue Resolution: Work with the data team to identify, diagnose, and resolve technical issues related to production data assets, and work with relevant teams to implement effective solutions.
- Incident Management: Manage and prioritize incidents, ensuring they are resolved promptly and efficiently and that the incident management process is followed; respond to technical issues reported by users or automated monitoring alerts, diagnosing problems, identifying solutions, and implementing fixes; document incidents and resolutions for future reference.
- Problem and Root Cause Analysis: Analyze recurring issues to identify root causes, conduct thorough investigations, and implement long-term solutions and preventive measures to prevent future occurrences.
- Preventative Measures: Identify incidents that recur and put solutions in place to prevent recurrence.
- Data Integrity: Work with the data team to ensure the accuracy and integrity of data produced and provided to the business, and implement and maintain quality control measures to prevent errors.
- Documentation: Maintain comprehensive documentation of processes, system configurations, and troubleshooting procedures; ensure documentation is created and owned, whether by the data team or the production support team.
- Support: Provide support to data teams, data users, and stakeholders.
- Respond to inquiries and assist with requests as applicable.
- Optimization: Identify opportunities to optimize data production processes and implement improvements to enhance efficiency.
- Performance Optimization: Analyze system performance and identify areas for improvement; suggest and implement changes to enhance system efficiency and reliability.

Requirements
Qualifications/Skills
- Education: A bachelor's degree in computer science, information technology, or a related field is preferred.
- Experience: Proven experience in data production support or a similar role; familiarity with data production tools and technologies.
- Technical Expertise: Strong knowledge of IT systems, applications, and troubleshooting techniques; proficiency in relevant software and tools.
- Technical Skills: Strong knowledge of database management, data warehousing, and ETL processes; proficiency in programming languages such as SQL, Python, or Java.
- Problem-Solving: Excellent analytical and problem-solving skills; ability to diagnose and resolve technical issues efficiently.
- Communication: Strong written and verbal communication skills; ability to explain technical concepts to non-technical stakeholders.
- Attention to Detail: High level of attention to detail and commitment to data accuracy; precision in monitoring systems and documenting incidents and solutions.
- Team Player: Ability to work collaboratively in a team environment and build positive relationships with colleagues and stakeholders; willingness to share knowledge and assist others.
- Time Management: Strong organizational skills and the ability to manage multiple tasks and priorities effectively.
- Adaptability: Flexibility to manage changing priorities and handle multiple tasks simultaneously.

Benefits
This position comes with a competitive compensation and benefits package:
- Competitive salary and performance-based bonuses
- Comprehensive benefits package
- Home Office model
- Career development and training opportunities
- Flexible work arrangements (remote and/or office-based)
- Dynamic and inclusive work culture within a globally known group
- Private Health Insurance
- Pension Plan
- Paid Time Off
- Training & Development
*Note: Benefits differ based on employee level
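The kind of preventative monitoring described above often starts as a simple data-freshness check: verify that a production table received data today and alert if it did not. A minimal sketch, with a hypothetical connection string, table, and alert hook:

```python
import datetime
import sqlalchemy

# Hypothetical warehouse connection.
engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://monitor:secret@prod-dwh/warehouse")

def is_fresh(table: str, ts_column: str) -> bool:
    """Return True if the table has rows loaded today."""
    query = sqlalchemy.text(f"SELECT MAX({ts_column}) FROM {table}")
    with engine.connect() as conn:
        latest = conn.execute(query).scalar()
    if latest is None:
        return False
    if isinstance(latest, datetime.datetime):
        latest = latest.date()
    return latest >= datetime.date.today()

if not is_fresh("policy_transactions", "loaded_at"):  # hypothetical table/column
    # In production this would page the on-call analyst (PagerDuty, email, etc.).
    print("ALERT: policy_transactions has not loaded today")
```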
Posted 1 day ago
5.0 - 7.0 years
18 - 20 Lacs
Mumbai
Work from Office
Work Timing: Standard IST
Contract: 6 months
- Experience in batch/real-time integrations with ODI 11g, customizing knowledge modules, and design/development expertise
- Skilled in ETL processes, PL/SQL, and building Interfaces, Packages, Load Plans, and Sequences in ODI

Required Candidate Profile
- Experience in ODI Master & Work Repository, data modeling & ETL design, multi-system integration, error handling, automation, and object migration in ODI
- Performance tuning, unit testing, and debugging mappings

Perks and Benefits: MNC
Posted 1 day ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At NXP, innovation is in our DNA: every year we spend ~$2B on R&D (~13,000 engineers). The NPI tracker is a tool used to monitor and manage New Product Introduction (NPI) projects, ensuring that all tasks and milestones are on track. It provides visibility into project progress, budget, customer traction, resource allocation, requirements, test results, and potential risks, helping teams make informed decisions. The Business Analyst gathers and analyzes data, identifies trends, and provides insights to support decision-making processes. The Business Analyst will work closely with various stakeholders to ensure the successful implementation and continuous improvement of the NPI tracker. The main responsibility is the development and maintenance of Power BI reports that delight users: easy to understand, showing relevant business insights that lead to action. The Business Analyst will continuously improve the quality of the dashboards and their adoption. The ideal candidate has a passion for data and the high-tech (semiconductor) industry, is technically savvy, likes to improve continuously, and has a strong drive to deliver results.

Key Responsibilities
- Data Analysis: Collect, analyze, and interpret data related to R&D projects to identify trends, issues, and opportunities for improvement.
- Reporting: Develop and maintain standard reports and dashboards to provide visibility into project progress, risks, and performance metrics.
- Stakeholder Collaboration: Work with project managers, resource managers, finance & strategy managers, IT teams, and other stakeholders to gather requirements, define project scope, and ensure alignment with business objectives.
- Process Improvement: Identify and recommend process improvements to enhance the efficiency and effectiveness of the NPI tracker.
- Documentation: Create and maintain comprehensive documentation, including business requirements, process flows, and user guides.
- Support: Provide ongoing support and training to users of the NPI tracker, addressing any issues or questions that arise.

Qualifications
- Master's degree in Computer Science, Information Systems, Business Administration, or a related field
- Proven technical savviness and data literacy
- Excellent data transformation and visualization skills in Power BI Desktop, Power BI Service, Power Query, and DAX
- Proficient in databases, ETL, SQL, and data modeling
- Knowledge of AWS and programming languages like Python is a plus
- Strong analytical and problem-solving skills
- Excellent communication and collaboration skills
- Affinity with the high-technology, fast-paced, dynamic semiconductor industry

Preferred Skills
- Understanding of program management data, e.g. project schedule, resource allocation, time writing, requirements & test data, business case data
- Familiarity with data governance frameworks and methodologies
- Experience with an agile way of working and cross-functional team environments
- Hands-on experience in software development best practices (CI/CD), version control, including release management, testing, and documentation

About The CTO Office
The CTO Office is a small team (~30 people) specialized in "R&D Craftsmanship". The CTO Office drives R&D efficiency and collaboration at NXP, with its ~13,000 engineers, through the following focus areas:
- Transparent programming, planning & cost allocation
- Harmonized processes, methods, and tools
- NXP R&D improvement & strategic programs
- 'State of the art' analytics and reporting, technical leadership, and program management culture

More information about NXP in India...
Posted 1 day ago
5.0 years
0 Lacs
Dehradun, Uttarakhand, India
On-site
Cynoteck is currently hiring a Salesforce Technical Lead with excellent interpersonal communication skills and the relevant experience, knowledge, and skillset.

Key Responsibilities:
- Lead the end-to-end technical design, architecture, and implementation of Salesforce solutions.
- Collaborate with functional teams to understand business requirements and translate them into scalable and maintainable Salesforce solutions.
- Provide technical leadership and mentorship to Salesforce developers, guiding them through best practices and development challenges.
- Design and implement custom Salesforce applications, including complex workflows, process builders, flows, triggers, and integrations with third-party systems.
- Ensure adherence to Salesforce development standards and best practices.
- Lead Salesforce system upgrades, patches, and new feature releases, ensuring minimal disruption to operations.
- Manage data migration and integration strategies, including integration with other internal and external systems.
- Oversee testing strategies and ensure that all deliverables meet the required quality standards.
- Stay current with Salesforce updates, new features, and industry best practices, and evaluate their relevance to the business.

Required Skills & Qualifications:
- Ability to work in a fast-paced, agile environment.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience working with Salesforce, including hands-on development experience on the Salesforce platform.
- Strong understanding of Salesforce architecture, data model, and capabilities.
- Excellent problem-solving, analytical, and troubleshooting skills.
- Experience leading and mentoring development teams.
- Expertise in Salesforce development tools: Apex, Visualforce, Lightning Web Components (LWC), SOQL, and Salesforce APIs.
- Experience with Salesforce integrations using REST/SOAP APIs, middleware, or ETL tools (a sketch of such a REST call follows this posting).
- Proficient in Salesforce declarative configuration (Flows, Process Builder, Workflow Rules, etc.).
- Experience in deploying Salesforce changes using Salesforce DX, CI/CD processes, and change management tools.
- Strong understanding of security concepts in Salesforce (profiles, permission sets, roles, sharing rules).
- Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
- Hands-on experience with Salesforce Lightning, including Lightning Components and Lightning Experience, is preferred.
- Salesforce certifications (e.g., Salesforce Platform Developer) are highly preferred.
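For the REST-integration skill mentioned above, here is a minimal sketch that calls Salesforce's standard SOQL query endpoint with the Python requests library. The instance URL, API version, and access token are hypothetical placeholders; a real integration would obtain the token via an OAuth flow first:

```python
import requests

INSTANCE = "https://example.my.salesforce.com"  # hypothetical org
TOKEN = "00D...hypothetical-access-token"       # normally from an OAuth flow

resp = requests.get(
    f"{INSTANCE}/services/data/v58.0/query/",   # version is an assumption
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"q": "SELECT Id, Name FROM Account LIMIT 5"},
)
resp.raise_for_status()
for record in resp.json()["records"]:
    print(record["Id"], record["Name"])
```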
Posted 1 day ago
2.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job: Project Scientist III (Data Analyst)

Project title: NDMC Phase II: Developing Models to Estimate and Project Disease Burden to Inform Control and/or Elimination Strategies for Priority Diseases in India

About the project: IIT Bombay is the anchor organization of the National Disease Modelling Consortium (NDMC). The consortium partners with various institutions in the country for disease modelling work. The objective of the project is to address policy and programmatic questions through India-specific disease models, to improve disease control and intervention strategies in the country. More information about the consortium can be found at www.ndmconsortium.com

Essential Qualifications & Experience:
- PhD in data sciences with a minimum of 2 years of relevant experience, or MSc in data sciences with 5 years of experience; OR
- PhD in Computer Engineering or any other engineering discipline, or MTech in Data Science, Computer Science, Statistics, or a related field with a minimum of 3 years of relevant experience, or BTech/BE/BDes or equivalent engineering degree with a minimum of 5 years of experience.

Desirable experience
- At least 8-10 years of experience in data science, with a proven track record in managing and supervising data science teams.
- Strong proficiency in R and Python for data analysis and modeling, along with extensive SQL skills for database management.
- Comprehensive understanding of Azure/cloud-based server and database development, with demonstrated experience in implementing cloud solutions.
- Exceptional analytical abilities and familiarity with statistical techniques relevant to data interpretation.
- Experience with data visualization tools, such as Tableau and Power BI.
- Strong leadership, critical thinking, and problem-solving skills, with the ability to work collaboratively across teams.

Job Profile:
- Lead and mentor a team of data scientists, facilitating collaboration, knowledge sharing, and professional growth to achieve project goals.
- Oversee the design and implementation of efficient database schemas and structures, ensuring optimal performance and scalability while maintaining data integrity and security.
- Guide the development, maintenance, and optimization of databases that support data storage and retrieval processes, including executing complex SQL queries for data manipulation.
- Drive the development of advanced data analysis and modeling, utilizing programming languages such as R and Python.
- Manage ETL (Extract, Transform, Load) processes to ensure data is prepared for analysis and reporting efficiently.
- Hands-on experience working with data systems and platforms such as Azure Data Factory, Azure Storage, Data Lake, Azure Synapse, Databricks, or similar.
- Solid understanding of cloud-based data management and data analytics principles and tools.
- Promote the use of data visualization tools (such as Tableau and Power BI) to communicate effectively and enhance understanding across the organization.
- Clearly communicate complex methodologies, findings, and analytical results to multidisciplinary teams and external partners.
- Contribute to the preparation of comprehensive research reports and presentations.

Pay Details: Consolidated salary 78,000 + HRA (if applicable) p.m.
Posted 1 day ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Position Title: Data Scientist
Location: Gurugram
Experience: 3-4 Years
Job Type: Full-Time
Company: Sequifi | www.sequifi.com

About the Role
We are hiring a Data Scientist with 3-4 years of experience who has a strong foundation in data analysis, machine learning, and business problem-solving. This role is ideal for someone who is hands-on with modern tools and techniques, eager to explore new technologies, and enjoys working in a collaborative, fast-paced environment.

Key Responsibilities
- Analyze complex datasets to uncover patterns, trends, and actionable insights.
- Build, validate, and deploy machine learning models for predictive analytics, classification, and clustering (a brief sketch follows this posting).
- Design and maintain efficient data pipelines and ETL processes.
- Create clear, interactive dashboards and reports using tools such as Power BI, Tableau, or Python visualization libraries.
- Collaborate with product managers, developers, and business analysts to understand requirements and deliver data-driven solutions.
- Conduct A/B testing and statistical analysis to support product decisions and optimizations.
- Continuously improve model performance based on feedback and business objectives.

Required Qualifications
- Bachelor's or Master's degree in Data Science, Computer Science, Statistics, or a related field.
- 3-4 years of experience in data science or a similar role.
- Strong programming skills in Python (Pandas, NumPy, Scikit-learn), SQL, and familiarity with PHP or Node.js.
- Hands-on experience with data visualization tools such as Power BI or Tableau.
- Good understanding of machine learning algorithms, data preprocessing, and feature engineering.
- Experience working with structured and unstructured data, including NoSQL databases like MongoDB.
- Familiarity with cloud platforms such as AWS, GCP, or Azure is a plus.

Soft Skills
- Strong analytical and problem-solving abilities.
- Effective communication and data storytelling skills.
- Ability to work independently as well as collaboratively in cross-functional teams.
- A mindset geared toward innovation, learning, and adaptability.

Why Join Us
- Work on meaningful and challenging problems in a tech-focused environment.
- Join a young, supportive, and fast-moving team.
- Gain exposure to a combination of data science, product, and engineering.
- Opportunity to learn and grow continuously in a culture of innovation.

If you're passionate about using data to drive business impact, we'd love to hear from you. Apply now and grow with us at Sequifi.
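A minimal sketch of the model-building workflow described above, using the Pandas/Scikit-learn stack the posting names. The churn dataset and its columns are hypothetical:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("churn.csv")            # hypothetical dataset
X = df.drop(columns=["churned"])          # features
y = df["churned"]                         # binary label

# Hold out 20% of the data to validate the model.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))
```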
Posted 1 day ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Description
NXP Semiconductors enables secure connections and infrastructure for a smarter world, advancing solutions that make lives easier, better, and safer. As the world leader in secure connectivity solutions for embedded applications, we are driving innovation in the secure connected vehicle, end-to-end security & privacy, and smart connected solutions markets.

Organization Description
Do you feel challenged by being part of the IT department of NXP, the company with a mission of "Secure Connections for a Smarter World"? Do you perform best in a role representing IT in projects in a fast-moving, international environment? Within R&D IT Solutions, the Product Creation Applications (PCA) department is responsible for providing and supporting the R&D design community globally with best-in-class applications and support. The applications are used by over 6,000 designers.

Job Summary
As a Graph Engineer, you will:
- Develop pipelines and code to support the ingress and egress of data to and from the knowledge graphs.
- Perform basic and advanced graph querying and data modeling on the knowledge graphs that lie at the heart of the organization's Product Creation ecosystem.
- Maintain the (ETL) pipelines, code, and knowledge graph to stay scalable, resilient, and performant in line with customers' requirements.
- Work in an international and Agile DevOps environment.
This position offers an opportunity to work in a globally distributed team where you will get a unique opportunity for personal development in a multi-cultural environment. You will also get a challenging environment in which to develop expertise in the technologies useful in the industry.

Primary Responsibilities
- Translate requirements of business functions into "graph thinking".
- Build and maintain graphs and related applications from data and information, using the latest graph technologies to leverage high-value use cases (a small sketch follows this list).
- Support and manage graph databases.
- Integrate graph data from various sources, internal and external.
- Extract data from various sources, including databases, APIs, and flat files.
- Load data into target systems, such as data warehouses and data lakes.
- Develop code to move data (ETL) from the enterprise platform applications into the enterprise knowledge graphs.
- Optimize ETL processes for performance and scalability.
- Collaborate with data engineers, data scientists, and other stakeholders to model the graph environment to best represent the data coming from the multiple enterprise systems.

Skills / Experience
- Semantic Web technologies: RDF, RDFS, OWL, SHACL, SPARQL, JSON-LD, N-Triples/N-Quads, Turtle, RDF/XML, TriX
- API-led architectures: REST, SOAP, microservices, API management
- Graph databases, such as Dydra, Amazon Neptune, Neo4j, Oracle Spatial & Graph, is a plus
- Experience with other NoSQL databases, such as key-value databases and document-based databases (e.g. XML databases), is a plus
- Experience with relational databases
- Programming experience, preferably Java, JavaScript, Python, PL/SQL
- Experience with web technologies: HTML, CSS, XML, XSLT, XPath
- Experience with modelling languages such as UML
- Understanding of CI/CD automation, version control, build automation, testing frameworks, static code analysis, IT service management, artifact management, container management, and experience with related tools and platforms
- Familiarity with cloud computing concepts (e.g. in AWS and Azure)
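A small, illustrative sketch of the RDF/SPARQL stack listed above, using the rdflib Python library to load Turtle data and run a query. The triples file and the example vocabulary are hypothetical:

```python
from rdflib import Graph

g = Graph()
g.parse("products.ttl", format="turtle")  # hypothetical triples file

# Query all products and their names from a hypothetical schema.
results = g.query("""
    PREFIX ex: <http://example.org/schema#>
    SELECT ?product ?name WHERE {
        ?product a ex:Product ;
                 ex:name ?name .
    }
""")
for product, name in results:
    print(product, name)
```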
Education & Personal Skillsets
- A master's or bachelor's degree in computer science, mathematics, electronics engineering, or a related discipline, with at least 10 years of experience in a similar role
- Excellent problem-solving and analytical skills
- A growth mindset with a curiosity to learn and improve
- Team player with strong interpersonal, written, and verbal communication skills
- Business consulting and technical consulting skills
- An entrepreneurial spirit and the ability to foster a positive and energized culture
- Fluent communication skills in English (spoken and written)
- Experience working in Agile (Scrum knowledge appreciated) with a DevOps mindset

More information about NXP in India...
Posted 1 day ago
3.0 - 8.0 years
4 - 9 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Job Title: ETL/DWT Test Lead

Responsibilities
A day in the life of an Infoscion:
As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
Primary skills: Technology -> ETL/Data Warehouse Testing

Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain to understand the business requirements
- Analytical abilities, strong technical skills, good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
- Awareness of latest technologies and trends
- Excellent problem solving, analytical, and debugging skills

Educational Requirements
BTech/BE/MTech/ME/BCA/MCA/BSc/MSc
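A common building block of the data-warehouse testing this role covers is source-to-target reconciliation: after a load, confirm the target holds the same number of rows as the source. A minimal sketch in Python, with hypothetical connection strings and table names:

```python
import sqlalchemy

# Hypothetical source system and target warehouse.
source = sqlalchemy.create_engine(
    "oracle+oracledb://qa:secret@src-host:1521/?service_name=SRC")
target = sqlalchemy.create_engine("postgresql+psycopg2://qa:secret@dwh-host/dwh")

def row_count(engine, table: str) -> int:
    with engine.connect() as conn:
        return conn.execute(
            sqlalchemy.text(f"SELECT COUNT(*) FROM {table}")).scalar()

src_rows = row_count(source, "orders")       # hypothetical source table
tgt_rows = row_count(target, "fact_orders")  # hypothetical target table
assert src_rows == tgt_rows, f"Row count mismatch: source={src_rows}, target={tgt_rows}"
print("reconciliation passed:", src_rows, "rows")
```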
Posted 1 day ago
2.0 - 4.0 years
4 - 6 Lacs
Hyderabad
Work from Office
This role involves the development and application of engineering practice and knowledge in defining, configuring, and deploying industrial digital technologies (including but not limited to PLM and MES) for managing continuity of information across the engineering enterprise, including design, industrialization, manufacturing, and supply chain, and for managing manufacturing data.

Job Description - Grade Specific
- Focus on Digital Continuity and Manufacturing.
- Develops competency in own area of expertise.
- Shares expertise and provides guidance and support to others.
- Interprets clients' needs.
- Completes own role independently or with minimum supervision.
- Identifies problems and relevant issues in straightforward situations and generates solutions.
- Contributes to teamwork and interacts with customers.
Posted 1 day ago
4.0 - 8.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Role: Data Engineer
Employment Type: Full time
Timing: General
Work Mode: Work from Office
Experience: 4 - 8 Years
Location: Ahmedabad
Notice period: Immediate joiners or candidates currently serving notice only (must join before 30 June 2025)

Role and Responsibilities:
• Provide business analytics support to the management team
• Analyse business results and manage studies to collect relevant data
• Design, build, and maintain data pipelines and ETL processes using Python as part of larger data platform projects
• Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and ensure data quality
• Optimize database performance, including indexing, partitioning, and query optimization
• Implement data governance and security measures to protect sensitive data
• Monitor data pipelines, troubleshoot issues, and perform data validation
• Develop and maintain documentation for data processes and workflows

Skills Required:
• Proficiency in Python for data processing and scripting
• Strong SQL knowledge and experience with relational databases (e.g., MySQL, PostgreSQL, SQL Server)
• Understanding of data modelling, data warehousing, and data architecture
• Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services
• Proficiency in working with GCP (especially BigQuery and GCS); a brief BigQuery sketch follows this posting
• Version control skills using Git

MUST HAVE:
• Data Engineering
• GCP (BigQuery, GCS, Dataflow, Airflow)
• Python
• SQL (MySQL / PostgreSQL / SQL Server)
• Data modelling
• Data warehousing
• Data architecture
• Git
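For the BigQuery proficiency listed above, here is a minimal sketch using the google-cloud-bigquery client. The project, dataset, and table names are hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client(project="analytics-prod")  # hypothetical project

# Aggregate the last week of revenue from a hypothetical orders table.
query = """
    SELECT order_date, SUM(amount) AS revenue
    FROM `analytics-prod.sales.orders`
    GROUP BY order_date
    ORDER BY order_date DESC
    LIMIT 7
"""
for row in client.query(query).result():
    print(row.order_date, row.revenue)
```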
Posted 1 day ago
18.0 years
0 Lacs
Greater Kolkata Area
On-site
About The Company
e.l.f. Beauty, Inc. stands with every eye, lip, face and paw. Our deep commitment to clean, cruelty-free beauty at an incredible value has fueled the success of our flagship brand e.l.f. Cosmetics since 2004 and driven our portfolio expansion. Today, our multi-brand portfolio includes e.l.f. Cosmetics, e.l.f. SKIN, pioneering clean beauty brand Well People, Keys Soulcare, a groundbreaking lifestyle beauty brand created with Alicia Keys, and Naturium, a high-performance, biocompatible, clinically-effective and accessible skincare brand. In our fiscal year 2024, we had net sales of $1 billion, and our business performance has been nothing short of extraordinary, with 24 consecutive quarters of net sales growth. We are the #2 mass cosmetics brand in the US and the fastest-growing mass cosmetics brand among the top 5. Our total compensation philosophy offers every full-time new hire competitive pay and benefits, bonus eligibility (200% of target over the last four fiscal years), equity, flexible time off, year-round half-day Fridays, and a hybrid work environment (3 days in office, 2 days at home). We believe the combination of our unique culture, total compensation, workplace flexibility and care for the team is unmatched, not just in beauty but in any industry. Visit our Career Page to learn more about our team: https://www.elfbeauty.com/work-with-us

Job Summary:
We're looking for a strategic and technically strong Senior Data Architect to join our high-growth digital team. This person will play a critical role in shaping the company's global data architecture and vision. The ideal candidate will lead enterprise-level architecture initiatives, collaborate with engineering and business teams, and guide a growing team of engineers and QA professionals. This role involves deep engagement across domains including Marketing, Product, Finance, and Supply Chain, with a special focus on marketing technology and commercial analytics relevant to the CPG/FMCG industry. The candidate should bring a hands-on mindset, a proven track record in designing scalable data platforms, and the ability to lead through influence. An understanding of industry-standard frameworks (e.g., TOGAF) and of tools like CDPs, MMM platforms, and AI-based insights generation will be a strong plus. Curiosity, communication, and architectural leadership are essential to succeed in this role.

Key Responsibilities
Enterprise Data Strategy: Design, define, and maintain a holistic data strategy and roadmap that aligns with corporate objectives and fuels digital transformation. Ensure data architecture and products align with enterprise standards and best practices.
Data Governance & Quality: Establish scalable governance frameworks to ensure data accuracy, privacy, security, and compliance (e.g., GDPR, CCPA). Oversee quality, security, and compliance initiatives.
Data Architecture & Platforms: Oversee modern data infrastructure (e.g., data lakes, warehouses, streaming) built on technologies like Snowflake, Databricks, AWS, and Kafka.
Marketing Technology Integration: Ensure the data architecture supports marketing technologies and commercial analytics platforms (e.g., CDP, MMM, ProfitSphere) tailored to the CPG/FMCG industry.
Architectural Leadership: Act as a hands-on architect with the ability to lead through influence. Guide design decisions aligned with industry best practices and e.l.f.'s evolving architecture roadmap.
Cross-Functional Collaboration: Partner with Marketing, Supply Chain, Finance, R&D, and IT to embed data-driven practices and deliver business impact. Lead the integration of data from multiple sources into a unified data warehouse.
Cloud Optimization: Optimize data flows and storage for performance and scalability. Lead data migration priorities, manage metadata repositories and data dictionaries, and optimize databases and pipelines for efficiency. Manage and track quality, cataloging, and observability.
AI/ML Enablement: Drive initiatives to operationalize predictive analytics, personalization, demand forecasting, and more using AI/ML models. Evaluate emerging data technologies and tools to improve the data architecture.
Team Leadership: Lead, mentor, and enable a high-performing team of data engineers, analysts, and partners through influence and thought leadership.
Vendor & Tooling Strategy: Manage relationships with external partners and drive evaluations of data and analytics tools.
Executive Reporting: Provide regular updates and strategic recommendations to executive leadership and key stakeholders.
Data Enablement: Design data models, database structures, and data integration solutions to support large volumes of data.

Qualifications And Requirements
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
18+ years of experience in Information Technology
8+ years of experience in data architecture, data engineering, or a related field, with a focus on large-scale, distributed systems
Strong understanding of data use cases in the CPG/FMCG sector; experience with tools such as MMM (Marketing Mix Modeling), CDPs, ProfitSphere, or inventory analytics preferred
Awareness of architecture frameworks like TOGAF; certifications are not mandatory, but candidates must demonstrate clear thinking and experience in applying architecture principles
Excellent communication skills and a proven ability to work cross-functionally across global teams; capable of leading with influence, not just execution
Knowledge of data warehousing, ETL/ELT processes, and data modeling
Deep understanding of data modeling principles, including schema design and dimensional data modeling
Strong SQL development experience, including SQL queries and stored procedures
Ability to architect and develop scalable data solutions, staying ahead of industry trends and integrating best practices in data engineering
Familiarity with data security and governance best practices
Experience with cloud computing platforms such as Snowflake, AWS, Azure, or GCP
Excellent problem-solving abilities with a focus on data analysis and interpretation
Strong communication and collaboration skills
Ability to translate complex technical concepts into actionable business strategies
Proficiency in one or more programming languages such as Python, Java, or Scala

This job description is intended to describe the general nature and level of work being performed in this position. It reflects the general details considered necessary to describe the principal functions of the job and shall not be considered a detailed description of all the work inherent in the job. It is not an exhaustive list of responsibilities, and it is subject to changes and exceptions at the supervisor's discretion. e.l.f. Beauty respects your privacy.
Please see our Job Applicant Privacy Notice (www.elfbeauty.com/us-job-applicant-privacy-notice) for how your personal information is used and shared.
Posted 1 day ago
5.0 - 10.0 years
0 - 2 Lacs
Pune, Chennai
Work from Office
Mandatory Skills:
1. Spark
2. SQL
3. Python

Must Have:
• Relevant experience of 5-8 years as a Data Engineer
• Preferred experience in related technologies as follows:
• SQL: 2-4 years of experience
• Spark: 1-2 years of experience
• NoSQL databases: 1-2 years of experience
• Database architecture: 2-3 years of experience
• Cloud architecture: 1-2 years of experience
• Experience in a programming language like Python
• Good understanding of ETL (Extract, Transform, Load) concepts (see the sketch after this listing)
• Good analytical and problem-solving skills
• Inclination for learning and self-motivation
• Knowledge of ticketing tools like JIRA/SNOW
• Good communication skills to interact with customers on issues and requirements

Good to Have:
• Knowledge of or experience in Scala
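For candidates wondering how the three mandatory skills combine day to day, here is a hedged sketch of a small PySpark ETL job; the S3 paths and column names are hypothetical stand-ins:

```python
# A minimal PySpark ETL sketch combining Spark, SQL, and Python.
# Paths and columns are hypothetical; a real job would add schema
# definitions and error handling.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read a raw CSV drop (hypothetical S3 path).
orders = spark.read.csv("s3://raw-bucket/orders/", header=True, inferSchema=True)

# Transform: filter bad rows, then aggregate with Spark SQL.
orders.filter(F.col("amount") > 0).createOrReplaceTempView("orders")
daily = spark.sql("""
    SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS revenue
    FROM orders
    GROUP BY order_date
""")

# Load: write partitioned Parquet for downstream consumers.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-bucket/daily_orders/"
)
```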
Posted 1 day ago
3.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
TCS Hiring!!!
Role: Google Data Engineer
Experience: 3-5 years
Location: Hyderabad & Chennai

Job Description:
3 to 5 years of relevant experience in data engineering, data warehousing, or a related field.
Experience with dashboarding tools like Plx Dashboard and Looker Studio.
Experience building data pipelines, reports, best practices, and frameworks.
Experience designing and developing scalable and actionable solutions (dashboards, automated collateral, web applications).
Experience with code refactoring for optimal performance.
Experience writing and maintaining ETLs that operate on a variety of structured and unstructured sources.
Familiarity with non-relational data storage systems (NoSQL and distributed database management systems).
Strong proficiency in SQL, NoSQL, ETL tools, BigQuery, and at least one programming language (e.g., Python, Java).
Strong understanding of data structures, algorithms, and software design principles.
Experience with data modeling techniques and methodologies.
Proficiency in troubleshooting and debugging complex data-related issues.
Ability to work independently and as part of a team.
Experience with Cloud Storage or equivalent cloud platforms.
Knowledge of BigQuery ingress and egress patterns.
Experience writing Airflow DAGs (a brief sketch follows this listing).
Knowledge of Pub/Sub, Dataflow, or other declarative data pipeline tools using batch and streaming ingestion.
Other GCP services: Vertex AI.
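As a flavour of the Airflow DAG requirement, here is a minimal, hypothetical daily DAG that stages data and loads it into BigQuery. The load operator comes from the Airflow Google provider package; the bucket, dataset, and table names are invented:

```python
# A sketch of a daily batch DAG, assuming Airflow 2.4+ with the
# Google provider installed. All resource names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)


def extract_and_stage(**context):
    """Pull from the source system and stage JSON files to GCS (details omitted)."""
    ...


with DAG(
    dag_id="daily_events_to_bq",     # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",            # run at 02:00 daily
    catchup=False,
) as dag:
    stage = PythonOperator(
        task_id="extract_and_stage",
        python_callable=extract_and_stage,
    )
    load = GCSToBigQueryOperator(
        task_id="load_to_bq",
        bucket="staging-bucket",                                # hypothetical
        source_objects=["events/{{ ds }}/*.json"],              # templated by run date
        destination_project_dataset_table="analytics.events",   # hypothetical
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_APPEND",
    )
    stage >> load
```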
Posted 1 day ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Company: Our client is a global IT, consulting, and business process services company headquartered in Bengaluru, India. It offers end-to-end IT services, including application development, infrastructure management, and digital transformation. It serves clients across industries such as banking, healthcare, retail, energy, and manufacturing, and specializes in modern technologies like cloud computing, AI, data analytics, and cybersecurity. The company has a strong global presence, operating in over 66 countries, and employs more than 250,000 people worldwide. It is known for helping enterprises modernize their IT infrastructure and adopt agile practices. Its divisions include consulting, software engineering, and managed services, and the company integrates automation and AI into its services to boost efficiency and innovation.

Job Title: DataStage Developer
· Location: Pune (Hybrid)
· Experience: 6+ yrs
· Job Type: Contract to hire
· Notice Period: Immediate joiners

Mandatory Skills: DataStage

DataStage Developer Responsibilities:
Reviewing and discussing briefs with key personnel assigned to projects.
Designing and building scalable DataStage solutions.
Configuring clustered and distributed scalable parallel environments.
Updating data within repositories, data marts, and data warehouses.
Assisting project leaders in determining project timelines and objectives.
Monitoring jobs and identifying bottlenecks in the data processing pipeline.
Testing and troubleshooting problems in ETL system designs and processes.
Improving existing ETL approaches and solutions used by the company.
Providing support to customers on issues relating to the storage, handling, and access of data.

DataStage Developer Requirements:
Bachelor's degree in computer science, information systems, or a similar field.
Demonstrable experience as a DataStage developer.
IBM DataStage certification or a similar qualification.
Proficiency in SQL or another relevant coding language.
Experience with or understanding of other ETL tools, such as Informatica, Oracle ETL, or Xplenty.
Knowledge of data modeling, database design, and the data warehousing ecosystem.
Skilled at the ideation, design, and deployment of DataStage solutions.
Excellent analytical and problem-solving skills.
The ability to work within a multidisciplinary team.
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are hiring for one of the Big 4 IT consulting firms.

Designation: Associate/Associate Consultant
Location: Chennai/Gurgaon/Pune

Skills Required:
AWS (Big Data services) - S3, Glue, Athena, EMR
Programming - Python, Spark, SQL, MuleSoft, Talend, dbt
Data warehouse - ETL, Redshift/Snowflake

Key Responsibilities:
- Work with business stakeholders to understand their business needs.
- Create data pipelines that extract, transform, and load (ETL) data from various sources into a usable format in a data warehouse (a Glue-based sketch follows this listing).
- Clean, filter, and validate data to ensure it meets quality and format standards.
- Develop data model objects (tables, views) to transform the data into a unified format for downstream consumption.
- Monitor, control, configure, and maintain processes in the cloud data platform.
- Optimize data pipelines and data storage for performance and efficiency.
- Participate in code reviews and provide meaningful feedback to other team members.
- Provide technical support and troubleshoot issues.

Qualifications:
- Bachelor's degree in computer science, Information Technology, or a related field, or equivalent work experience.
- Experience working in the AWS cloud platform.
- Data engineer with expertise in developing big data and data warehouse platforms.
- Experience working with structured and semi-structured data.
- Expertise in developing big data solutions and ETL/ELT pipelines for data ingestion, data transformation, and optimization.
- Experience working directly with technical and business teams.
- Able to create technical documentation.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.

Skillset (good to have):
- Experience in data modeling.
- AWS certification for Data Engineer skills.
- Experience with ITSM processes/tools such as ServiceNow and Jira.
- Understanding of Spark, Hive, Kafka, Kinesis, Spark Streaming, and Airflow.
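Glue-based pipeline work of the kind described above typically looks something like the following sketch. It is illustrative only: the catalog database, table, and S3 paths are hypothetical, and a real job would add schema handling and error paths:

```python
# Skeleton of an AWS Glue PySpark job. Runs inside the Glue runtime,
# which provides the awsglue package. All resource names are hypothetical.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read a table registered in the Glue Data Catalog.
src = glue_context.create_dynamic_frame.from_catalog(
    database="raw", table_name="orders"
)

# Transform: drop invalid rows with plain Spark.
df = src.toDF().filter("amount > 0")

# Load: write Parquet to S3 so Athena or Redshift Spectrum can query it.
df.write.mode("overwrite").parquet("s3://curated-bucket/orders/")

job.commit()
```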
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Proficiency in AI tools used to prepare and automate data pipelines and ingestion:
Apache Spark, especially with MLlib
PySpark and Dask for distributed data processing
Pandas and NumPy for local data wrangling (see the sketch after this list)
Apache Airflow to schedule and orchestrate ETL/ELT jobs
Google Cloud (BigQuery, Vertex AI)
Python (the most popular language for AI and data tasks)
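A small example of the pandas/NumPy "local data wrangling" this list refers to; the CSV file and its columns are hypothetical:

```python
# Minimal pandas/NumPy wrangling sketch. The input file and its
# columns (event_id, event_time, channel, revenue) are hypothetical.
import numpy as np
import pandas as pd

df = pd.read_csv("events.csv", parse_dates=["event_time"])

# Standardise text, drop duplicates, and fill gaps before ingestion.
df["channel"] = df["channel"].str.strip().str.lower()
df = df.drop_duplicates(subset=["event_id"])
df["revenue"] = df["revenue"].fillna(0.0)

# NumPy for a vectorised derived column.
df["is_weekend"] = np.where(df["event_time"].dt.dayofweek >= 5, 1, 0)

# Aggregate for a daily revenue feed.
daily = df.groupby(df["event_time"].dt.date)["revenue"].sum()
print(daily.head())
```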
Posted 1 day ago
6.0 - 11.0 years
15 - 20 Lacs
Pune
Hybrid
Role & responsibilities
B.Tech or M.Tech in Computer Science, or equivalent experience.
5+ years of experience working professionally as a Python software developer.
Organized, self-directed, and resourceful.
Excellent written and verbal communication skills.
Expert in Python and pandas.
Experience in building data pipelines and ETL and ELT processes (see the sketch after this listing).
Advanced working SQL experience with relational databases, including query authoring and working familiarity with a variety of databases.
Understanding of Docker and data orchestration tools.
Experience with Jupyter notebooks.
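As a sketch of the Python-and-pandas pipeline work this role centres on, the snippet below extracts a batch, cleans it, and loads it into a relational table. The SQLite URL is a stand-in for whatever warehouse the team actually uses, and the table and column names are invented:

```python
# Minimal pandas ELT step: extract, light transform, load via SQL.
# SQLite stands in for a real warehouse (PostgreSQL, etc.).
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("sqlite:///warehouse.db")

# Extract: a stand-in for a batch pulled from an upstream system.
raw = pd.DataFrame({"user_id": [1, 2, 2], "amount": [10.0, None, 15.0]})

# Transform: drop incomplete rows and exact duplicates.
clean = raw.dropna(subset=["amount"]).drop_duplicates()

# Load: append into a relational table for SQL-based downstream transforms.
clean.to_sql("fact_payments", engine, if_exists="append", index=False)

print(pd.read_sql(
    "SELECT user_id, SUM(amount) AS total FROM fact_payments GROUP BY user_id",
    engine,
))
```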
Posted 1 day ago
6.0 - 10.0 years
10 - 20 Lacs
Hyderabad
Work from Office
Key Skills: SQL, ETL

Roles & Responsibilities:
Develop and maintain data solutions using ETL tools, SQL, and at least one programming or reporting technology (Python, Java, React, MSTR, Tableau, or Power BI).
Work on cloud platforms like AWS or any other public cloud environment for data operations and analytics.
Debug and resolve complex incidents reported by clients, ensuring SLA compliance.
Prepare and present status updates during client meetings and address queries.
Collaborate with operations teams to optimize performance and ensure operational stability.
Provide technical mentorship and guidance to team members; foster a high-performance culture.
Perform task management, monitor team deliverables, and track SLA performance.
Utilize tools such as ServiceNow and SharePoint for documentation and workflow management.
Follow the Software Development Life Cycle (SDLC) to ensure structured delivery of solutions.
Lead and coach a team of 10-15 members effectively.
Encourage a learning mindset within the team and explore new domains and technologies.
Demonstrate strong attention to detail and commitment to quality.
Exhibit excellent communication, documentation, and stakeholder management skills.

Experience Requirements:
6-10 years of experience in IT with a focus on data analytics and development.
Experience working with AWS or other public cloud platforms.
Hands-on experience with at least one programming or visualization tool (Python/Java/React/MSTR/Tableau/Power BI).
Proficiency in SQL and database tools is mandatory.
Experience with ETL processes and tools.
Familiarity with ServiceNow and SharePoint.
Solid understanding of SDLC practices.
Experience in leading or mentoring a team of 10-15 members.
Strong planning, organizing, and task management abilities.
Excellent verbal and written communication skills.
Ability to work independently and collaboratively in a fast-paced environment.

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field.
Posted 1 day ago
5.0 years
0 Lacs
Pimpri Chinchwad, Maharashtra, India
On-site
Job Title: Support Specialist – Eagle Platform (Portfolio Management)
Location: Riyadh, Saudi Arabia
Type: Full-time / Contract
Industry: Banking / Investment Management / FinTech
Experience Required: 5+ years

We are seeking a highly skilled Support Specialist with hands-on experience working on BNY Mellon's Eagle Investment Systems, particularly the Eagle STAR, PACE, and ACCESS modules used for portfolio accounting, data management, and performance reporting. The ideal candidate will have supported the platform in banking or asset management environments, preferably with experience at Bank of America, BNY Mellon, or institutions using Eagle for middle- and back-office operations.

Key Responsibilities
Provide day-to-day technical and functional support for the Eagle Platform, including the STAR, PACE, and Performance modules
Troubleshoot and resolve user issues related to portfolio accounting, performance calculation, and reporting
Act as a liaison between business users and technical teams for change requests, data corrections, and custom reports
Monitor batch jobs, data feeds (security, pricing, transaction data), and system interfaces
Work closely with front-office, middle-office, and operations teams to ensure accurate data processing and reporting
Manage SLA-driven incident resolution and maintain support documentation
Support data migrations, upgrades, and new release rollouts of Eagle components
Engage in root cause analysis and implement preventive measures

Required Skills And Experience
5+ years of experience in financial systems support, with a strong focus on Eagle Investment Systems
Strong knowledge of portfolio management processes, NAV calculations, and financial instruments (equities, fixed income, derivatives)
Prior work experience at Bank of America, BNY Mellon, or asset managers using Eagle is highly preferred
Proficiency in SQL and ETL tools, and an understanding of data architecture in financial environments
Familiarity with upstream/downstream systems such as Bloomberg, Aladdin, or CRD is a plus
Strong analytical skills and attention to detail
Excellent communication skills in English (Arabic is a plus)

Preferred Qualifications
Bachelor's degree in Computer Science, Finance, or a related field
ITIL Foundation or a similar certification in service management
Prior experience working in a banking or asset management firm in the GCC is a bonus
Posted 1 day ago
0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Key Skills & Qualifications:
Strong experience in SAP BW (preferably BW/4HANA): data modeling, extraction, transformation, and loading (ETL).
Hands-on experience in SAP Analytics Cloud (SAC): building stories, dashboards, and predictive analytics.
Proficiency in integrating BW data into SAC and managing data connections.
Solid understanding of HANA views, CDS views, and ABAP for BW enhancements.
Good knowledge of SAP ECC/S4HANA data sources and business processes.
Experience in Agile delivery methodology is a plus.
Strong analytical, problem-solving, and communication skills.
Ability to work independently and in cross-functional teams.

Preferred Qualifications:
SAP BW and/or SAC certification.
Prior experience in a client-facing delivery or consulting role.
Experience with BOBJ, Analysis for Office, or other reporting tools is a plus.
Posted 1 day ago
The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.
Major tech hubs such as Bengaluru, Hyderabad, Pune, and Chennai are known for their thriving tech industries and often have a high demand for ETL professionals.
The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.
In the ETL field, a typical career path may include roles such as:
- Junior ETL Developer
- ETL Developer
- Senior ETL Developer
- ETL Tech Lead
- ETL Architect
As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.
Alongside ETL, professionals in this field are often expected to have skills in:
- SQL
- Data Warehousing
- Data Modeling
- ETL Tools (e.g., Informatica, Talend)
- Database Management Systems (e.g., Oracle, SQL Server)
Having a strong foundation in these related skills can enhance your capabilities as an ETL professional; the short sketch below ties several of them together.
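To make that concrete, here is a tiny, self-contained sketch that exercises SQL, dimensional modeling, and a warehouse-style query together. The star schema and data are invented purely for illustration:

```python
# A toy star schema: one fact table joined to one dimension, queried
# the way a warehouse report would be. Schema and rows are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE fact_sales (
        product_key INTEGER REFERENCES dim_product(product_key),
        sale_date   TEXT,
        quantity    INTEGER,
        amount      REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
    INSERT INTO fact_sales VALUES (1, '2025-06-01', 2, 20.0), (2, '2025-06-01', 1, 35.0);
""")

# A typical dimensional query: facts joined to a dimension, grouped by attribute.
for row in conn.execute("""
    SELECT p.category, SUM(f.amount) AS revenue
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.category
"""):
    print(row)  # ('Hardware', 55.0)
```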
ETL interviews typically revisit these same fundamentals: expect questions on extraction and load strategies, transformation logic, SQL joins and aggregations, slowly changing dimensions, performance tuning, and the ETL tools listed above.
As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!