5.0 - 10.0 years
14 - 16 Lacs
Chennai
Work from Office
1. 5+ years of experience in Oracle Database Administration (11g/12c).
2. Good practical experience in monitoring and tuning databases to provide a high-availability service.
3. Experience managing multiple RDBMS on large systems.
4. Must have knowledge of managing RMAN backups.
5. Knowledge of IBM Netezza, MicroStrategy, or GCP BigQuery administration will be an added advantage.
6. Experience managing any ETL tool will be an added advantage.
Posted 3 weeks ago
2.0 - 6.0 years
3 - 8 Lacs
Pune, Sangli
Work from Office
We are looking for a Data Science Engineer with strong experience in ETL development and Talend to join our data and analytics team. The ideal candidate will be responsible for designing robust data pipelines, enabling analytics and AI solutions, and working on scalable data science projects that drive business value.
Key Responsibilities:
- Design, build, and maintain ETL pipelines using Talend Data Integration (a rough illustrative sketch follows this posting).
- Extract data from multiple sources (databases, APIs, flat files) and load it into data warehouses or lakes.
- Ensure data integrity, quality, and performance tuning in ETL workflows.
- Implement job scheduling, logging, and exception handling using Talend and orchestration tools.
- Prepare and transform large datasets for analytics and machine learning use cases.
- Build and deploy data pipelines that feed predictive models and business intelligence platforms.
- Collaborate with data scientists to operationalize ML models and ensure they run efficiently at scale.
- Assist in feature engineering, data labeling, and model monitoring processes.
Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.
- 3+ years of experience in ETL development, with at least 2 years using Talend.
- Proficiency in SQL and Python (for data transformation or automation).
- Hands-on experience with data integration, data modeling, and data warehousing.
- Strong knowledge of cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with big data tools like Spark, Hadoop, or Kafka is a plus.
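Talend jobs themselves are built visually in Talend Studio rather than hand-coded, so the sketch below is a hedged illustration only: it mirrors the same extract-transform-load flow in plain Python with pandas and SQLAlchemy. The connection strings, table names, and columns are hypothetical placeholders, not details from the posting.

```python
# Illustrative only: Talend jobs are designed in Talend Studio, not hand-written.
# This sketch mirrors a typical extract -> transform -> load step in plain Python.
# Connection URLs, tables, and columns below are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine, text

source = create_engine("postgresql://user:pass@source-db/sales")        # hypothetical source
warehouse = create_engine("postgresql://user:pass@warehouse-db/edw")    # hypothetical target

def run_daily_load(run_date: str) -> None:
    # Extract: pull one day of orders from the operational database
    orders = pd.read_sql(
        text("SELECT order_id, customer_id, amount, order_ts FROM orders "
             "WHERE CAST(order_ts AS DATE) = :d"),
        source, params={"d": run_date},
    )
    # Transform: basic cleansing and a derived column, as an ETL job might do
    orders = orders.dropna(subset=["customer_id"])
    orders["amount_usd"] = orders["amount"].round(2)
    # Load: append into a warehouse staging table
    orders.to_sql("stg_orders", warehouse, if_exists="append", index=False)

if __name__ == "__main__":
    run_daily_load("2024-01-01")
```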
Posted 3 weeks ago
5.0 - 7.0 years
5 - 8 Lacs
Pune
Work from Office
Job Summary: Cummins is seeking a skilled Data Engineer to support the development, maintenance, and optimization of our enterprise data and analytics platform. This role involves hands-on experience in software development, ETL processes, and data warehousing, with strong exposure to tools like Snowflake, OBIEE, and Power BI. The engineer will collaborate with cross-functional teams, transforming data into actionable insights that enable business agility and scale. Please note: while the role is categorized as remote, it will follow a hybrid work model based out of our Pune office.
Key Responsibilities:
- Design, develop, and maintain ETL pipelines using Snowflake and related data transformation tools.
- Build and automate data integration workflows that extract, transform, and load data from various sources, including Oracle EBS and other enterprise systems.
- Analyze, monitor, and troubleshoot data quality and integrity issues using standardized tools and methods.
- Develop and maintain dashboards and reports using OBIEE, Power BI, and other visualization tools for business stakeholders.
- Work with IT and business teams to gather reporting requirements and translate them into scalable technical solutions.
- Participate in data modeling and storage architecture using star and snowflake schema designs.
- Contribute to the implementation of data governance, metadata management, and access control mechanisms.
- Maintain documentation for solutions and participate in testing and validation activities.
- Support migration and replication of data using tools such as Qlik Replicate and contribute to cloud-based data architecture.
- Apply agile and DevOps methodologies to continuously improve data delivery and quality assurance processes.
Why Join Cummins?
- Opportunity to work with a global leader in power solutions and digital transformation.
- Be part of a collaborative and inclusive team culture.
- Access to cutting-edge data platforms and tools.
- Exposure to enterprise-scale data challenges and finance domain expertise.
- Drive impact through data innovation and process improvement.
Competencies:
- Data Extraction & Transformation - ability to perform ETL activities from varied sources with high data accuracy.
- Programming - capable of writing and testing efficient code using industry standards and version control systems.
- Data Quality Management - detect and correct data issues for better decision-making.
- Solution Documentation - clearly document processes, models, and code for reuse and collaboration.
- Solution Validation - test and validate changes or solutions based on customer requirements.
- Problem Solving - address technical challenges systematically to ensure effective resolution and prevention.
- Customer Focus - understand business requirements and deliver user-centric data solutions.
- Communication & Collaboration - work effectively across teams to meet shared goals.
- Values Differences - promote inclusion by valuing diverse perspectives and backgrounds.
Education, Licenses, Certifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related technical discipline.
- Certifications in data engineering or relevant tools (Snowflake, Power BI, etc.) are a plus.
Experience (must-have skills):
- 5-7 years of experience in data engineering or software development, preferably within a finance or enterprise IT environment.
- Proficient in ETL tools, SQL, and data warehouse development.
- Proficient in Snowflake, Power BI, and OBIEE reporting platforms; must have worked on implementations using these tools and technologies.
- Strong understanding of data warehousing principles, including schema design (star/snowflake), ER modeling, and relational databases.
- Working knowledge of Oracle databases and Oracle EBS structures.
Preferred Skills:
- Experience with Qlik Replicate, data replication, or data migration tools.
- Familiarity with data governance, data quality frameworks, and metadata management.
- Exposure to cloud-based architectures, Big Data platforms (e.g., Spark, Hive, Kafka), and distributed storage systems (e.g., HBase, MongoDB).
- Understanding of agile methodologies (Scrum, Kanban) and DevOps practices for continuous delivery and improvement.
Posted 3 weeks ago
2.0 - 5.0 years
7 - 12 Lacs
Pune
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Marketing Title.
In this role, you will:
- Develop and support new feed ingestion: understand the existing framework and carry out development as per the business rules and requirements.
- Develop and maintain new changes and enhancements in Data Ingestion / Juniper, promoting and supporting them in the production environment within the stipulated timelines.
- Get familiar with the Data Ingestion / Data Refinery / Common Data Model / Compdata frameworks quickly and contribute to application development as soon as possible.
- Take a methodical and measured approach with a keen eye for attention to detail.
- Work under pressure and remain calm in the face of adversity.
- Collaborate, interact and engage with different business, technical and subject matter experts.
- Communicate concisely, both in writing and verbally, and maintain good documentation.
- Manage workload from multiple requests and balance priorities; be proactive, with a can-do mindset and attitude.
Requirements - to be successful in this role, you should meet the following:
- Experience (1 = essential, 2 = very useful, 3 = nice to have): Ab Initio / ETL (1), Hadoop / GCP (2), Agile / Scrum (2), LINUX (3)
- Technical skills (1 = essential, 2 = useful, 3 = nice to have): any ETL tool (1), analytical troubleshooting (1), HiveQL (2)
Posted 3 weeks ago
3.0 - 8.0 years
10 - 15 Lacs
Hyderabad
Work from Office
Immediate requirement for an IT System/Software Engineer for one of the leading pharma-sector clients.
Position - IT System/Software Engineer
Experience - 4 to 6 years of relevant experience; 3 years of experience with Power BI / Alteryx
Education - MBA with Graduate/Postgraduate in Medicine (MD/MBBS/BDS), Pharmacy, or Life Sciences, or a bachelor's degree in Marketing, Statistics, or a related field
Location - Onsite, Hyderabad
Tenure - 6-12 months; extension based on performance
Skills:
- SQL, Advanced Excel & PowerPoint
- ETL tools (Alteryx, DataIKU, VBA, etc.)
- Advanced Power BI
- Good to have: knowledge of Python and/or Snowflake
- Statistical Analysis
- AI/ML
- Process Documentation
CTC: 10 - 14 LPA
Key Responsibilities:
- Possess strong analytical skills to collect, organize, analyse, and disseminate significant amounts of information.
- Interpret complex and granular data, analyse results and derive actionable insights; clearly communicate data-driven insights to stakeholders and influence decision-making processes.
- Develop, maintain, and improve accurate, actionable, and insightful reporting solutions and dashboards.
- Manage and organize data sets from databases to find patterns and trends while ensuring data integrity and accuracy.
- Use data analytics to understand customer behaviour and improve marketing effectiveness.
- Drive standardization of reports across brands.
- Establish and maintain positive relationships with key stakeholders and understand their perspectives.
- Conduct extensive business process analysis to identify areas for process improvement and efficiencies.
- Stay informed on industry trends and developments to advise management on strategies for business growth.
- Build and maintain standard operating procedures (SOPs), quality checklists and knowledge repositories to enable excellent-quality outputs.
Essential Requirements:
- 4-6 years of proven ability in business analytics in a market research firm, pharmaceutical company, or pharma KPO/consulting.
- 4-6 years of overall experience in digital marketing and web analytics, good knowledge of data modelling and SQL, and robust technical problem-solving skills.
- Expertise in MS Excel, SQL, Power Query, and ETL tools like Alteryx, DataIKU, VBA, or KNIME; knowledge of statistical modelling or ML is preferred.
- 3+ years of extensive experience working with Power BI.
- Proficiency in statistical analysis tools (R, Python, or similar) is preferred.
- High agility to work across projects, datasets and technologies.
- Excellent presentation and stakeholder management skills.
- Exceptional written and verbal communication skills, with the ability to translate complex data into actionable insights.
- Ability to operate optimally in an international matrix environment; a strong, dynamic and result-oriented teammate.
- Understanding of healthcare terminology and real-world patient-level data is desirable.
- Ability to multi-task, work in a demanding global team environment under tight deadlines, and develop and maintain strong individual and team performance.
- Preferred: knowledge of disease areas within the pharma sector, with strong leadership and communication skills.
Interested candidates, share your CV at busiraju.sindhu@manpower.co.in or via WhatsApp: 7013970562
Posted 3 weeks ago
5.0 - 10.0 years
13 - 22 Lacs
Chennai
Work from Office
Employment Type: 6-month contract
Job Summary: Looking for an experienced MySQL DBA & Developer to design, maintain, and optimize MySQL databases for high-volume applications. Must be capable of managing large-scale data operations (in TBs), cloud/on-prem deployments, and ETL pipelines, while ensuring high availability, performance, and security.
Key Responsibilities:
Database Administration
- Install, configure, and upgrade MySQL (v5.5 to v8.0) on VMs, physical systems, and Azure Cloud.
- Manage backup, recovery, DRDP, auto-recovery, and replication setups.
- Tune performance (queries, indexes, server settings); ensure database security and access control (a minimal query-tuning sketch follows this posting).
- Monitor systems using tools like Percona Toolkit, Nagios, or Prometheus.
- Maintain architecture documentation, cron jobs, and archiving strategies.
MySQL Development
- Write complex SQL queries, views, triggers, and procedures; design schemas.
- Create and optimize ETL pipelines for data loading/extraction (including staged loads).
- Handle large-volume data migration, cleaning, normalization, and modeling.
- Support analytics and report development (Superset, Power BI, Tableau preferred).
- Implement schema changes, assist DR planning, and support app integration.
Technical Skills Required:
- MySQL (v5.5/8.0), SQL, indexing, encryption, query tuning
- ETL tools, SSIS, scripting (Python, Bash), Linux/Unix
- Cloud (Azure preferred, AWS/GCP a plus), job schedulers (Autosys, cron)
- Familiarity with high availability, clustering, sharding, and failover systems
- Experience with ServiceNow, JIRA, DevOps, and BI tools is a plus
Soft Skills:
- Strong problem-solving, communication, and stakeholder handling
- Ability to work under pressure and support distributed teams
- Detail-oriented with a focus on performance, reliability, and security
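As a hedged sketch only (not part of the posting), the snippet below shows the kind of query-tuning loop described above using mysql-connector-python: inspect a slow query's plan with EXPLAIN, add an index, and re-check. The host, credentials, table, and column names are hypothetical, and in practice this work is often done interactively or with tools like Percona Toolkit.

```python
# Minimal query-tuning sketch with mysql-connector-python.
# All connection details, tables, and columns are hypothetical placeholders.
import mysql.connector

conn = mysql.connector.connect(
    host="mysql.example.internal", user="dba", password="secret", database="orders_db"
)
cur = conn.cursor()

query = ("SELECT customer_id, SUM(amount) FROM orders "
         "WHERE order_date >= '2024-01-01' GROUP BY customer_id")

# Inspect the execution plan: look for full table scans (type=ALL) and missing keys
cur.execute("EXPLAIN " + query)
for row in cur.fetchall():
    print(row)

# If the plan shows a full scan on order_date, adding an index is a common fix
cur.execute("CREATE INDEX idx_orders_order_date ON orders (order_date)")

# Re-check the plan after the change
cur.execute("EXPLAIN " + query)
print(cur.fetchall())

cur.close()
conn.close()
```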
Posted 3 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Remote
Hi Candidates, we have job openings with one of our MNC clients (remote, C2H). Please apply here or share your updated resume with chandrakala.c@i-q.co
AWS Data Engineer - Job Description
Requirements for the candidate:
- Data Engineer with a minimum of 3-5+ years of data engineering experience. The role requires deep knowledge of data engineering techniques to create data pipelines and build data assets.
- At least 4+ years of strong hands-on programming experience with PySpark / Python / Boto3, including Python frameworks and libraries, following Python best practices.
- Strong experience in code optimisation using Spark SQL and PySpark (a minimal sketch follows this posting).
- Understanding of code versioning, Git repositories, and JFrog Artifactory.
- AWS architecture knowledge, especially S3, EC2, Lambda, Redshift, CloudFormation, etc., with the ability to explain the benefits of each.
- Code refactoring of legacy codebases: clean, modernize, and improve readability and maintainability.
- Unit tests / TDD: write tests before code, ensure functionality, and catch bugs early.
- Fixing difficult bugs: debug complex code, isolate issues, and resolve performance, concurrency, or logic flaws.
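As a hedged illustration of the PySpark / Spark SQL optimisation work mentioned above (not taken from the posting), the sketch below filters early and broadcasts a small dimension table to avoid a shuffle join. The S3 paths, table layout, and SparkSession setup are assumptions.

```python
# Minimal PySpark optimisation sketch: filter pushdown + broadcast join.
# Paths, columns, and the assumption that `customers` is small are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("order-enrichment").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/orders/")        # large fact table
customers = spark.read.parquet("s3://example-bucket/customers/")  # small dimension

# Broadcast the small dimension to avoid an expensive shuffle join,
# and apply the date filter before the join to reduce scanned data.
enriched = (
    orders
    .where(F.col("order_date") >= "2024-01-01")
    .join(F.broadcast(customers), on="customer_id", how="left")
    .select("order_id", "customer_id", "segment", "amount")
)

# Cache only if the result is reused by several downstream actions.
enriched.cache()
enriched.write.mode("overwrite").parquet("s3://example-bucket/enriched_orders/")
```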
Posted 3 weeks ago
4.0 - 9.0 years
6 - 10 Lacs
Bengaluru
Work from Office
About the Role: We are seeking a skilled and detail-oriented Data Migration Specialist with hands-on experience in Alteryx and Snowflake. The ideal candidate will be responsible for analyzing existing Alteryx workflows, documenting the logic and data transformation steps, and converting them into optimized, scalable SQL queries and processes in Snowflake. The candidate should have solid SQL expertise and a strong understanding of data warehousing concepts. This role plays a critical part in our cloud modernization and data platform transformation initiatives.
Key Responsibilities:
- Analyze and interpret complex Alteryx workflows to identify data sources, transformations, joins, filters, aggregations, and output steps.
- Document the logical flow of each Alteryx workflow, including inputs, business logic, and outputs.
- Translate Alteryx logic into equivalent SQL scripts optimized for Snowflake, ensuring accuracy and performance (a hedged sketch follows this posting).
- Write advanced SQL queries and stored procedures, and use Snowflake-specific features like Streams, Tasks, Cloning, Time Travel, and Zero-Copy Cloning.
- Implement data ingestion strategies using Snowpipe, stages, and external tables.
- Optimize Snowflake performance through query tuning, partitioning, clustering, and caching strategies.
- Collaborate with data analysts, engineers, and stakeholders to validate transformed logic against expected results.
- Handle data cleansing, enrichment, aggregation, and business logic implementation within Snowflake.
- Suggest improvements and automation opportunities during migration.
- Conduct unit testing and support UAT (User Acceptance Testing) for migrated workflows.
- Maintain version control, documentation, and an audit trail for all converted workflows.
Required Skills:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
- At least 4 years of hands-on experience in designing and developing scalable data solutions using the Snowflake Data Cloud platform.
- Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
- 1+ years of experience with Alteryx Designer, including advanced workflow development and debugging.
- Strong proficiency in SQL, with 3+ years specifically working with Snowflake or other cloud data warehouses.
- Python programming experience focused on data engineering.
- Experience with data APIs and batch/stream processing.
- Solid understanding of data transformation logic: joins, unions, filters, formulas, aggregations, pivots, and transpositions.
- Experience in performance tuning and optimization of SQL queries in Snowflake.
- Familiarity with Snowflake features like CTEs, window functions, Tasks, Streams, Stages, and external tables.
- Exposure to migration or modernization projects from ETL tools (like Alteryx or Informatica) to SQL-based cloud platforms.
- Strong documentation skills and attention to detail.
- Experience working in Agile/Scrum development environments.
- Good communication and collaboration skills.
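As a hedged sketch of the Alteryx-to-Snowflake translation described above, the snippet below shows how a typical Filter -> Join -> Summarize workflow might be expressed as a single Snowflake SQL statement and executed with the snowflake-connector-python package. All table, column, and connection names are hypothetical placeholders.

```python
# Hedged sketch: one Alteryx Filter -> Join -> Summarize workflow rewritten as
# a single Snowflake statement. Names and credentials are hypothetical.
import snowflake.connector

MIGRATED_QUERY = """
CREATE OR REPLACE TABLE analytics.daily_sales AS
SELECT
    o.order_date,
    c.region,
    SUM(o.amount)              AS total_amount,   -- Alteryx Summarize tool
    COUNT(DISTINCT o.order_id) AS order_count
FROM raw.orders o
JOIN raw.customers c                               -- Alteryx Join tool
  ON o.customer_id = c.customer_id
WHERE o.status = 'COMPLETE'                        -- Alteryx Filter tool
GROUP BY o.order_date, c.region
"""

conn = snowflake.connector.connect(
    account="xy12345", user="etl_user", password="secret",
    warehouse="TRANSFORM_WH", database="ANALYTICS_DB",
)
try:
    conn.cursor().execute(MIGRATED_QUERY)
finally:
    conn.close()
```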
Posted 3 weeks ago
2.0 - 5.0 years
2 - 6 Lacs
Bengaluru
Work from Office
Job Title: ETL Tester
Job Responsibilities:
- Design and execute test cases for ETL processes to validate data accuracy and integrity (a minimal example follows this posting).
- Collaborate with data engineers and developers to understand ETL workflows and data transformations.
- Use Tableau to create visualizations and dashboards that help in data analysis and reporting.
- Work with Snowflake to test and validate data stored in the cloud data warehouse.
- Identify, document, and track defects and issues in the ETL process.
- Perform data profiling and data quality assessments.
- Create and maintain test documentation, including test plans, test scripts, and test results.
- Exposure to Salesforce and proficiency in developing SQL queries.
The ideal candidate will have a strong background in ETL processes and data validation, and experience with Tableau and Snowflake. You will be responsible for ensuring the quality and accuracy of data as it moves through the ETL pipeline.
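As a minimal, hedged example of one such ETL test case (not taken from the posting), the snippet below reconciles row counts between a staging table and its target mart in Snowflake. The account details and table names are made up.

```python
# Hedged sketch of a row-count reconciliation test between staging and mart.
# Connection details and table names are hypothetical placeholders.
import snowflake.connector

def fetch_count(cursor, table: str) -> int:
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

conn = snowflake.connector.connect(
    account="xy12345", user="qa_user", password="secret", database="EDW"
)
cur = conn.cursor()

source_count = fetch_count(cur, "STAGING.ORDERS_RAW")
target_count = fetch_count(cur, "MARTS.FCT_ORDERS")

assert source_count == target_count, (
    f"Row count mismatch: staging={source_count}, mart={target_count}"
)
print("Row count reconciliation passed:", target_count)
conn.close()
```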
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
India, Bengaluru
Work from Office
Senior Data Engineer - India, Bengaluru
Get to know Okta: Okta is The World's Identity Company. We free everyone to safely use any technology anywhere, on any device or app. Our Workforce and Customer Identity Clouds enable secure yet flexible access, authentication, and automation that transforms how people move through the digital world, putting Identity at the heart of business security and growth. At Okta, we celebrate a variety of perspectives and experiences. We are not looking for someone who checks every single box - we're looking for lifelong learners and people who can make us better with their unique experiences. Join our team! We're building a world where Identity belongs to you.
Senior Data Engineer - Enterprise Data Platform
Get to know Data Engineering: Okta's Business Operations team is on a mission to accelerate Okta's scale and growth. We bring world-class business acumen and technology expertise to every interaction. We also drive cross-functional collaboration and are focused on delivering measurable business outcomes. Business Operations strives to deliver amazing technology experiences for our employees and to ensure that our offices have all the technology needed for the future of work. The Data Engineering team is focused on building platforms and capabilities that are utilized across the organization by sales, marketing, engineering, finance, product, and operations. The ideal candidate will have a strong engineering background with the ability to tie engineering initiatives to business impact. You will be part of a team doing detailed technical designs, development, and implementation of applications using cutting-edge technology stacks.
The Senior Data Engineer Opportunity: A Senior Data Engineer is responsible for designing, building, and maintaining scalable solutions. This role involves collaborating with data engineers, analysts, scientists and other engineers to ensure data availability, integrity, and security. The ideal candidate will have a strong background in cloud platforms, data warehousing, infrastructure as code, and continuous integration/continuous deployment (CI/CD) practices.
What you'll be doing:
- Design, develop, and maintain scalable data platforms using AWS, Snowflake, dbt, and Databricks.
- Use Terraform to manage infrastructure as code, ensuring consistent and reproducible environments.
- Develop and maintain CI/CD pipelines for data platform applications using GitHub and GitLab.
- Troubleshoot and resolve issues related to data infrastructure and workflows.
- Containerize applications and services using Docker to ensure portability and scalability.
- Conduct vulnerability scans and apply necessary patches to ensure the security and integrity of the data platform.
- Work with data engineers to design and implement Secure Development Lifecycle practices and security tooling (DAST, SAST, SCA, secret scanning) in automated CI/CD pipelines.
- Ensure data security and compliance with industry standards and regulations.
- Stay updated with the latest trends and technologies in data engineering and cloud platforms.
What we are looking for:
- BS in Computer Science, Engineering or another quantitative field of study.
- 5+ years in a data engineering role.
- 5+ years of experience working with SQL and ETL tools such as Airflow and dbt, with relational and columnar MPP databases like Snowflake or Redshift, and hands-on experience with AWS (e.g., S3, Lambda, EMR, EC2, EKS); a minimal orchestration sketch follows this posting.
- 2+ years of experience managing CI/CD infrastructures, with strong proficiency in tools like GitHub Actions, Jenkins, ArgoCD, GitLab, or any CI/CD tool to streamline deployment pipelines and ensure efficient software delivery.
- 2+ years of experience with Java, Python, Go, or similar backend languages.
- Experience with Terraform for infrastructure as code.
- Experience with Docker and containerization technologies.
- Experience working with lakehouse architectures such as Databricks and file formats like Iceberg and Delta.
- Experience in designing, building, and managing complex deployment pipelines.
This role requires in-person onboarding and travel to our Bengaluru, IN office during the first week of employment.
What you can look forward to as a full-time Okta employee: amazing benefits, making social impact, and developing talent and fostering connection and community at Okta. Okta cultivates a dynamic work environment, providing the best tools, technology and benefits to empower our employees to work productively in a setting that best and uniquely suits their needs. Each organization is unique in the degree of flexibility and mobility in which they work, so that all employees are enabled to be their most creative and successful versions of themselves, regardless of where they live. Find your place at Okta today: https://www.okta.com/company/careers/
Okta is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, marital status, age, physical or mental disability, or status as a protected veteran. We also consider for employment qualified applicants with arrest and conviction records, consistent with applicable laws. If reasonable accommodation is needed to complete any part of the job application, interview process, or onboarding, please use this Form to request an accommodation. Okta is committed to complying with applicable data privacy and security laws and regulations. For more information, please see our Privacy Policy at https://www.okta.com/privacy-policy/
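As a hedged sketch only (this is not Okta's actual pipeline), the snippet below shows how an orchestrator such as Airflow, one of the ETL tools named above, can chain an ingestion step with dbt transformations. The DAG name, schedule, and commands are hypothetical.

```python
# Minimal Airflow 2.x DAG sketch: ingest, then run dbt models.
# DAG id, paths, and commands are hypothetical placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="enterprise_data_platform_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw_data",
        bash_command="python /opt/pipelines/ingest_to_s3.py --date {{ ds }}",
    )
    transform = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt --target prod",
    )
    ingest >> transform  # transformations run only after ingestion succeeds
```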
Okta - the foundation for secure connections between people and technology. Okta is the leading independent provider of identity for the enterprise. The Okta Identity Cloud enables organizations to securely connect the right people to the right technologies at the right time. With over 7,000 pre-built integrations to applications and infrastructure providers, Okta customers can easily and securely use the best technologies for their business. More than 19,300 organizations, including JetBlue, Nordstrom, Slack, T-Mobile, Takeda, Teach for America, and Twilio, trust Okta to help protect the identities of their workforces and customers.
Posted 3 weeks ago
2.0 - 7.0 years
11 - 15 Lacs
Bengaluru
Work from Office
WFM Lead Specialist
About us: As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.
Overview about TII: At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.
The Target Enterprise Services (TES) organization is close to the action when it comes to communication, whether with guests or Target team members. From guest service professionals and product designers to vendor managers and financial and workforce management analysts, TES comprises several key and high-visibility areas that elevate and nurture Target's distinctive reputation. We cultivate loyalty and satisfaction through exceptional service and support, and we foster a culture of responsive, knowledgeable and committed service from the inside out through enterprise services our people can count on. Beyond our world-class service centers, there are many important challenges to be met in other TES teams, like the TES Operations and Product Team, which plays at the intersections of process and technology, and Service Delivery Enablement, which develops comprehensive service delivery strategies for our service centers. TES Product Design manages and grows loyalty, frequency and other marketing programs for all payment cards (FPSC, Gift Card). Our Bank Program and its credit risk and compliance functions manage Merchandise Finance, Capital Finance, Expense Management and Financial Goals and Forecasts. The TES Controller heads up TES Accounting and Financial Operations, including Accounting and Control, Retail Bankcard Services, Target Card Services, and Fraud Prevention & Dispute Resolution.
As a WFM Lead Specialist, you will manage contacts and team member resources for Target service centers. Job duties may change at any time due to business needs.
Responsibilities:
- Use strong critical thinking and decision-making skills to ensure that service level and productivity metrics are achieved.
- Provide detailed communication, both written and verbal, throughout the day to Workforce Management business partners and Service Center leaders.
- Manage service level performance for all service centers by analyzing intra-day volume trends, agent skills, and schedule effectiveness to develop strategies that provide top-level service.
- Effectively execute contingency plans in the face of unexpected workflow changes or contact arrival patterns, and share Workforce Management system data with integrity and accuracy.
- Identify and routinely link with Target partners whose activities may impact volume so these can be factored into forecasts, avoiding unexpected volume spikes that result in poor guest service.
- Collect metrics on service and staffing across service centers, analyze the data to determine what trends exist, and share meaningful insights with service center partners.
Reporting / Working Relationships:
- Reports to the Manager, WFM TII
- Close partnerships with TES Service Center, HQ & TII
Shift Requirements:
- Able to work on holidays and weekends
- 45 hours/week with any two consecutive weekly offs
- Rotational shifts (24/7), primarily evening and night shifts
Minimum Requirements:
- Four-year college degree or equivalent with 2+ years of service center experience
- Strong critical thinking and decision-making skills
- Demonstrated ability to work independently, take initiative and handle multiple tasks
- Strong technical skills, ability to work within multiple systems, and proficiency in MS PowerPoint, advanced Excel, and data visualization tools
- Ability to prioritize responsibilities and work under pressure and within time constraints
Posted 3 weeks ago
5.0 - 8.0 years
2 - 5 Lacs
Bengaluru
Work from Office
Job Information: Job Opening ID: ZR_2335_JOB; Date Opened: 01/08/2024; Industry: IT Services; Work Experience: 5-8 years; Job Title: Snowflake Developer; City: Bangalore South; Province: Karnataka; Country: India; Postal Code: 560066; Number of Positions: 1; Contract duration: 6 months
Experience: 5+ years
Location: WFH (should have a good internet connection)
Requirements:
- Snowflake knowledge (must have)
- SQL knowledge (must have)
- Data modeling (must have)
- Data warehouse concepts and DW design best practices (must have)
- SAP knowledge (good to have)
- SAP functional knowledge (good to have)
- Informatica IDMC (good to have)
- Autonomous person
- Good communication skills, team player, self-motivated, strong work ethic
- Flexibility in working hours: 12pm Central time (overlap with US team)
- Confidence, proactiveness, and the ability to demonstrate alternatives to mitigate tools/expertise gaps (fast learner)
Posted 3 weeks ago
4.0 - 6.0 years
3 - 6 Lacs
Chennai
Work from Office
Job Information: Job Opening ID: ZR_1646_JOB; Date Opened: 14/12/2022; Industry: Technology; Work Experience: 4-6 years; Job Title: ETL Developer; City: Chennai; Province: Tamil Nadu; Country: India; Postal Code: 600001; Number of Positions: 4
Programming languages/tools: SQL, DataStage, Teradata
Responsibilities:
- Design complex ETL jobs in IBM DataStage to load data into the DWH as per business logic.
- Work as a developer on the Teradata database.
- Understand and analyse ERP reports and document the logic.
- Identify gaps in the existing solutions to accommodate new business processes introduced by the merger.
- Design TAS workflows to replicate data from SAP into the DWH.
- Prepare test cases and technical specifications for the new solutions.
- Interact with other upstream and downstream application teams and EI teams to build robust data transfer mechanisms between various systems.
Essential Skills Required:
- Sound interpersonal communication skills
- Coordinate with customers and Business Analysts to understand business and reporting requirements
- Support the development of business intelligence standards to meet business goals
- Ability to understand data warehousing concepts and implement reports based on user inputs
- Areas of expertise include Teradata SQL, DataStage, Teradata, and shell scripting
- Demonstrated focus on driving for results
- Ability to work with a cross-functional team
Employment Experience Required: minimum 3+ years of technical experience with data warehousing concepts and as an ETL developer.
Posted 3 weeks ago
6.0 - 10.0 years
2 - 5 Lacs
Chennai
Work from Office
Job Information: Job Opening ID: ZR_1999_JOB; Date Opened: 17/06/2023; Industry: Technology; Work Experience: 6-10 years; Job Title: ETL Tester; City: Chennai; Province: Tamil Nadu; Country: India; Postal Code: 600001; Number of Positions: 1
Responsibilities:
- Create test case documents/plans for testing the data pipelines.
- Check the mappings for the fields that support data staging and data marts, and the data type constraints of the fields present in Snowflake.
- Verify non-null fields are populated.
- Verify business requirements and confirm that the correct logic is implemented in the transformation layer of the ETL process.
- Verify stored procedure calculations and data mappings.
- Verify data transformations are correct based on the business rules (a minimal sketch of such checks follows this posting).
- Verify successful execution of data loading workflows.
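A minimal, hedged sketch of two of the checks listed above, expressed as SQL run against Snowflake via its Python connector: a non-null check and a simple transformation-rule check. Table and column names are hypothetical placeholders, not from the posting.

```python
# Hedged sketch of ETL validation checks run against Snowflake.
# Account details, tables, and columns are hypothetical.
import snowflake.connector

CHECKS = {
    # Verify a non-null field is populated in the data mart
    "customer_id_not_null":
        "SELECT COUNT(*) FROM MARTS.FCT_ORDERS WHERE CUSTOMER_ID IS NULL",
    # Verify a business transformation rule: net = gross - discount
    "net_amount_rule":
        "SELECT COUNT(*) FROM MARTS.FCT_ORDERS "
        "WHERE NET_AMOUNT <> GROSS_AMOUNT - DISCOUNT_AMOUNT",
}

conn = snowflake.connector.connect(account="xy12345", user="qa", password="secret")
cur = conn.cursor()
for name, sql in CHECKS.items():
    cur.execute(sql)
    violations = cur.fetchone()[0]
    status = "PASS" if violations == 0 else f"FAIL ({violations} rows)"
    print(f"{name}: {status}")
conn.close()
```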
Posted 3 weeks ago
5.0 - 8.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Information: Job Opening ID: ZR_2334_JOB; Date Opened: 01/08/2024; Industry: IT Services; Work Experience: 5-8 years; Job Title: Informatica ETL Developer; City: Bangalore South; Province: Karnataka; Country: India; Postal Code: 560066; Number of Positions: 1; Contract duration: 6 months
Experience: 5+ years
Location: WFH (should have a good internet connection)
Informatica ETL role requirements:
- Informatica IDMC (must have)
- SQL knowledge (must have)
- Data warehouse concepts and ETL design best practices (must have)
- Data modeling (must have)
- Snowflake knowledge (good to have)
- SAP knowledge (good to have)
- SAP functional knowledge (good to have)
- Good communication skills, team player, self-motivated, strong work ethic
- Flexibility in working hours: 12pm Central time (overlap with US team)
- Confidence, proactiveness, and the ability to demonstrate alternatives to mitigate tools/expertise gaps (fast learner)
Posted 3 weeks ago
6.0 - 10.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Information: Job Opening ID: ZR_2470_JOB; Date Opened: 03/05/2025; Industry: IT Services; Work Experience: 6-10 years; Job Title: Sr. Data Engineer; City: Bangalore South; Province: Karnataka; Country: India; Postal Code: 560050; Number of Positions: 1
We're looking for an experienced Senior Data Engineer to lead the design and development of scalable data solutions at our company. The ideal candidate will have extensive hands-on experience in data warehousing, ETL/ELT architecture, and cloud platforms like AWS, Azure, or GCP. You will work closely with both technical and business teams, mentoring engineers while driving data quality, security, and performance optimization.
Responsibilities:
- Lead the design of data warehouses, lakes, and ETL workflows.
- Collaborate with teams to gather requirements and build scalable solutions.
- Ensure data governance, security, and optimal performance of systems.
- Mentor junior engineers and drive end-to-end project delivery.
Requirements:
- 6+ years of experience in data engineering, including at least 2 full-cycle data warehouse projects.
- Strong skills in SQL, ETL tools (e.g., Pentaho, dbt), and cloud platforms.
- Expertise in big data tools (e.g., Apache Spark, Kafka).
- Excellent communication skills and leadership abilities.
Preferred: experience with workflow orchestration tools (e.g., Airflow), real-time data, and DataOps practices.
Posted 3 weeks ago
5.0 - 8.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Job Information: Job Opening ID: ZR_1835_JOB; Date Opened: 03/04/2023; Industry: Technology; Work Experience: 5-8 years; Job Title: SQL Database Developer; City: Hyderabad; Province: Telangana; Country: India; Postal Code: 500081; Number of Positions: 1
Requirements:
- Bachelor's degree plus at least 5-7 years of experience, with a minimum of 3+ years in SQL development
- Strong working knowledge of advanced SQL capabilities such as analytic and windowing functions (a short illustration follows this posting)
- 3+ years of working knowledge of at least one RDBMS database is a must
- Exposure to shell scripts for invoking SQL calls
- Exposure to ETL tools would be good to have
- Working knowledge of Snowflake is good to have
Location: Hyderabad, Pune, Bangalore
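As a short, self-contained illustration of the analytic/windowing functions mentioned above (not part of the posting), the snippet below ranks rows within groups using a window function. It uses SQLite purely so the example runs without a database server, and the data is made up.

```python
# Window-function illustration using SQLite (3.25+ supports window functions).
# The sales data is invented purely for demonstration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, rep TEXT, amount INT);
    INSERT INTO sales VALUES
        ('North', 'Asha', 120), ('North', 'Ravi', 90),
        ('South', 'Meena', 150), ('South', 'Kiran', 110);
""")

# Rank reps within each region by amount - a typical analytic/window query
rows = conn.execute("""
    SELECT region, rep, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
""").fetchall()

for region, rep, amount, rnk in rows:
    print(region, rep, amount, rnk)
conn.close()
```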
Posted 3 weeks ago
10.0 - 12.0 years
4 - 7 Lacs
Mumbai
Work from Office
Job Information: Job Opening ID: ZR_2020_JOB; Date Opened: 23/10/2023; Industry: Technology; Work Experience: 10-12 years; Job Title: SAP iOS; City: Mumbai; Province: Maharashtra; Country: India; Postal Code: 400008; Number of Positions: 1
Locations: Mumbai, Pune, Chennai, Bangalore (a remote opportunity is also available)
Required Technical Skills:
- Experience in SAP technical development with a focus on SAP MDK, Fiori, and ABAP
- Experience in SAP Asset Manager, maintaining and developing mobile apps in S/4HANA
- SAP BTP, Mobile Services, and SAP Mobile Platform (SMP)
- Native mobile app development for iOS and Android
- Java, JavaScript, HTML5, and CSS
Desirable Technical Skills:
- Experience provisioning custom app access to mobile devices and providing support
- Hands-on experience developing in-sync and offline functionalities for mobile apps
- Experience in customizing/enhancing standard mobile apps
- Good understanding of OData and the ability to consume complex deep entities and change sets in the front end
- Good understanding of and experience with APIs and how to use them effectively on the front end
- Hands-on experience in developing intuitive mobile apps
Posted 3 weeks ago
6.0 - 11.0 years
8 - 13 Lacs
Hyderabad
Work from Office
Must-have skillset:
- Pentaho ETL tool
- Expert in Unix shell scripting
- Python
- PL/SQL scripting
Added advantage:
- Hive/Hadoop knowledge
- Experience on migration projects
- Candidates with Java knowledge
Tooling: should have exposure to Control-M, GitHub, JIRA, Confluence, CI/CD pipelines, Jenkins, etc.
Posted 3 weeks ago
5.0 - 8.0 years
18 - 30 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
We are seeking a Senior Software Engineer with expertise in data integration to join our team focused on the Markit EDM platform. You will play a key role in developing, testing, and supporting EDM functionalities while collaborating closely with cross-functional teams. This role offers the opportunity to lead business-as-usual initiatives and contribute to the delivery of high-quality solutions and problem resolution. If you have strong experience with the Markit EDM platform and SQL and a commitment to supporting clients effectively, we encourage you to apply.
Responsibilities:
- Lead business-as-usual initiatives and deliver high-quality support to business and technology teams
- Manage a team of 8-10 members as the EDM lead
- Analyze and resolve complex issues with strong problem-solving skills
- Maintain strict adherence to issue tracking and incident management processes
- Support production and test environments to ensure stability and performance
- Communicate effectively with clients, business users, and internal teams to facilitate collaboration
- Deliver assigned work with a focus on quality and timely completion
- Provide regular project progress updates to management and project teams
Requirements:
- Proven experience as a software engineer with strong data integration skills using the Markit EDM platform and SQL
- Background in leading teams or initiatives in a technical environment
- Experience with Markit EDM and EDM implementations covering any of the following data sets: Market, Instrument, Party, Reference, Corporate Actions and Historical Time-Series
- Skills in application support, including first- and second-line problem resolution
- Competency in incident and problem management
- Strong communication skills with English proficiency at C1 level or above
- Demonstrated ability to work collaboratively with business users and cross-functional teams
Posted 3 weeks ago
5.0 - 10.0 years
18 - 30 Lacs
Hyderabad
Hybrid
We are seeking a skilled and experienced Collibra Developer to support and enhance our data governance and metadata management capabilities. The ideal candidate will be responsible for designing, developing, implementing, and maintaining Collibra solutions, integrating with various enterprise data systems, and ensuring alignment with data governance standards and business requirements.
Key Responsibilities:
- Design and configure Collibra Data Intelligence Cloud solutions (Data Catalog, Data Governance, Lineage, Privacy).
- Develop and maintain workflows using Collibra Workflow Designer (BPMN).
- Integrate Collibra with enterprise systems (ETL tools, BI tools, data lakes/warehouses) via APIs, JDBC, or other connectors.
- Define and maintain data domains, data dictionaries, business glossaries, and data stewardship roles.
Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5-7+ years of experience working with the Collibra Data Intelligence Platform.
- Hands-on experience in Collibra administration, workflow development (BPMN), and DGC configuration.
Interested candidates, share your CV at himani.girnar@alikethoughts.com with the details below:
- Candidate's name
- Email and alternate email ID
- Contact and alternate contact number
- Total experience
- Relevant experience
- Current organization
- Notice period
- CCTC
- ECTC
- Current location
- Preferred location
- PAN card number
Posted 3 weeks ago
15.0 - 20.0 years
10 - 14 Lacs
Gurugram
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum 15 years of experience is required
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the application development process and ensuring successful project delivery.
Roles & Responsibilities:
- Expected to be an SME with deep knowledge and experience.
- Should have influencing and advisory skills.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead the application development team in designing and implementing solutions.
- Provide technical guidance and mentorship to team members.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
Professional & Technical Skills:
- Must-have skills: proficiency in Informatica MDM.
- Strong understanding of data integration and master data management concepts.
- Experience in designing and implementing MDM solutions.
- Knowledge of data quality and governance best practices.
- Hands-on experience with Informatica MDM tools and technologies.
Additional Information:
- The candidate should have a minimum of 15 years of experience in Informatica MDM.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.
Posted 3 weeks ago
12.0 - 17.0 years
10 - 14 Lacs
Gurugram
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum 12 years of experience is required
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication among team members.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Lead the effort to design, build, and configure applications.
- Act as the primary point of contact.
- Ensure seamless communication among team members.
- Provide guidance and support to team members.
Professional & Technical Skills:
- Must-have skills: proficiency in Informatica MDM.
- Strong understanding of data management principles.
- Experience in data modeling and data integration.
- Knowledge of data quality and governance best practices.
- Hands-on experience in Informatica MDM implementation.
Additional Information:
- The candidate should have a minimum of 12 years of experience in Informatica MDM.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.
Posted 3 weeks ago
7.0 - 12.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: IBM InfoSphere DataStage
Good-to-have skills: NA
Minimum 7.5 years of experience is required
Educational Qualification: 15 years of full-time education
Role: Technology Support | Job Title: Senior DataStage Consultant | Career Level: 08
Must-have skills: Apache, Tomcat, IIS, IBM WebSphere administration, IBM DataStage, Linux shell scripting
Good-to-have skills: Teradata; RDBMS and SQL experience in Oracle, DB2
About the Role: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Job Summary: The ETL Developer is responsible for L2-L3 production support, administration of the DataStage application, and designing, building, deploying and maintaining ETL DataStage interfaces using the IBM InfoSphere DataStage ETL development tool.
Key Responsibilities:
1. Extend production support in an L2/L3 capacity; efficiently debug and troubleshoot production issues.
2. Evaluate existing data solutions, write scalable ETLs, develop documentation, and train/help team members.
3. Collaborate and work with business/development teams and infrastructure teams on L3 issues and follow tasks to completion.
4. Participate in and provide support for releases, risks, mitigation plans, and regular DR exercises for project roll-out.
5. Drive automation and permanent fixes to prevent issues from recurring.
6. Manage service level agreements.
7. Bring continuous improvements to reduce time-to-resolve for production incidents.
8. Perform root cause analysis and identify and implement corrective and preventive measures.
9. Document standards, processes and procedures relating to best practices, issues and resolutions.
10. Constantly upskill with tools and technologies to meet the organization's future needs.
11. Be available on call (on rotation) in a support role.
12. Effectively manage multiple, competing priorities.
Technical Responsibilities:
- Excellent understanding of technical concepts.
- Strong understanding of OS-related dependencies.
- Strong exposure to shell scripting.
- Expertise in any cloud and middleware technologies would be a great value-add.
Professional Attributes:
- Good verbal and written communication skills to connect with customers at varying levels of the organization.
- Ability to operate independently and make decisions with little direct supervision.
- Must be willing to cross-skill and upskill based on project and business requirements.
Education Qualification:
- A higher-level qualification in a technical subject is desirable.
- IBM DataStage certification.
Additional Information:
- Strong written and oral communication skills.
- Should be open to working in shifts.
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: Teradata BI
Minimum 5 years of experience is required
Educational Qualification: minimum 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead application development projects.
- Conduct code reviews and ensure coding standards are met.
Professional & Technical Skills:
- Must-have skills: proficiency in Google BigQuery.
- Strong understanding of data warehousing concepts.
- Experience with cloud-based data platforms.
- Hands-on experience in SQL and database management.
- Good-to-have skills: experience with Teradata BI.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery.
- This position is based at our Mumbai office.
- A minimum of 15 years of full-time education is required.
Posted 3 weeks ago