7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
• 7-9 years of experience with data analytics, data modeling, and database design.
• 3+ years of coding and scripting (Python, Java, Scala) and design experience.
• 3+ years of experience with the Spark framework.
• 5+ years of experience with ELT methodologies and tools.
• 5+ years of mastery in designing, developing, tuning, and troubleshooting SQL.
• Knowledge of Informatica PowerCenter and Informatica IDMC.
• Knowledge of distributed, column-oriented technologies used to build high-performance databases such as Vertica and Snowflake.
• Strong data analysis skills for extracting insights from financial data.
• Proficiency in reporting tools (e.g., Power BI, Tableau).
The Ideal Qualifications
Technical Skills:
• Domain knowledge of Investment Management operations including Security Masters, Securities Trade and Recon Operations, Reference Data Management, and Pricing.
• Familiarity with regulatory requirements and compliance standards in the investment management industry.
• Experience with IBORs such as BlackRock Aladdin, CRD, Eagle STAR (ABOR), Eagle Pace, and Eagle DataMart.
• Familiarity with investment data platforms such as GoldenSource, FINBOURNE, NeoXam, RIMES, and JPM Fusion.
Soft Skills:
• Strong analytical and problem-solving abilities.
• Exceptional communication and interpersonal skills.
• Ability to influence and motivate teams without direct authority.
• Excellent time management and organizational skills, with the ability to prioritize multiple initiatives.
What to Expect as Part of our Team
• Regular meetings with the Corporate Technology leadership team
• Focused one-on-one meetings with your manager
• Access to mentorship opportunities
• Access to learning content on Degreed and other informational platforms
Posted 4 days ago
4.0 - 8.0 years
0 Lacs
Rajasthan
On-site
You will be working as a Snowflake Database Administrator at the Mid Level, providing database and application administration and support for the Information Management Analytical Service. This role involves managing data integration, data warehouse, and business intelligence, including enterprise reporting, predictive analytics, data mining, and self-service solutions. You will collaborate with different teams to offer database and application administration, job scheduling/execution, and code deployment support. Your key responsibilities will include providing database support for Big Data tools, performing maintenance tasks, performance tuning, monitoring, developer support, and administrative support for the application toolset. You will participate in a 24/7 on-call rotation for enterprise job scheduler activities, follow ITIL processes, create/update technical documentation, install/upgrade/configure the application toolset, and ensure regular attendance. To qualify for this role, you are required to have a Bachelor's degree or equivalent experience, along with 5 years of work experience in IT. You should have experience in Cloud Database Administration, installing/configuring commercial applications at the OS level, and effective collaboration in a team environment. Preferred skills include scripting in Linux and Windows, experience with Terraform, and knowledge of the insurance and/or reinsurance industry. In terms of technical requirements, you should be proficient in databases such as Snowflake, Vertica, Impala, PostgreSQL, Oracle, and SQL Server; operating systems such as Unix, Linux, CentOS, and Windows; and reporting tools including SAP BusinessObjects, Tableau, and Power BI. This position falls under SOW#23 - Snowflake DBA and requires a minimum of 4 years to a maximum of 5 years of experience. Thank you for considering this opportunity.
Posted 4 days ago
5.0 - 10.0 years
0 - 2 Lacs
Hyderabad
Hybrid
Role & responsibilities
• 5+ years of extensive experience working with multiple databases, ETL, and BI testing.
• Working experience in the Investment Management and Capital Markets domain is preferred.
• Experience working with applications such as Eagle, Calypso, or Murex will be an added advantage.
• Experience in delivering large releases to the customer through direct and partner teams.
• Experience in testing data validation scenarios and data ingestion, pipelines, and transformation processes.
• Experience in Vertica, DataStage, Teradata, and Big Data environments for both data ingestion and consumption.
• Extensive knowledge of a Business Intelligence tool, preferably MicroStrategy or Tableau.
• Extensive experience in writing and troubleshooting complex SQL queries.
• Expert in providing QA solutions based on Data Warehousing and Dimensional Modelling design.
• Expert in drafting ETL source-to-target mapping document design.
• Identify data validation tools that will suit the ETL project conditions.
• Ensure all sign-offs on deliverables (overall test strategy, test plan, test cases, and test results) and that testing meets governance requirements.
• Establishing and driving automation capabilities.
• Collaborating with dev and architect teams to identify and prioritize opportunities for automation.
• Experience in ETL automation with open-source tools, service virtualization, and CI/CD.
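Since the role centres on validating ETL loads with SQL, here is a minimal, hedged sketch of a source-to-target row-count check in Python. It uses sqlite3 purely as a stand-in driver, and the table names (trades, trades_dim) are invented for illustration; a real test would connect to the actual source system and warehouse (e.g., Vertica or Teradata) named in the mapping document.

import sqlite3  # stand-in for the project's real DB drivers (e.g., Vertica, Teradata)

def compare_row_counts(src_conn, tgt_conn, src_table, tgt_table):
    # Compare row counts between a source table and its ETL target.
    src_count = src_conn.execute(f"SELECT COUNT(*) FROM {src_table}").fetchone()[0]
    tgt_count = tgt_conn.execute(f"SELECT COUNT(*) FROM {tgt_table}").fetchone()[0]
    status = "PASS" if src_count == tgt_count else "FAIL"
    print(f"{src_table} -> {tgt_table}: source={src_count}, target={tgt_count} [{status}]")
    return src_count == tgt_count

# Illustrative usage with in-memory databases; real tests would point at the
# source and target systems described in the source-to-target mapping.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
src.execute("CREATE TABLE trades (id INTEGER)")
src.executemany("INSERT INTO trades VALUES (?)", [(1,), (2,), (3,)])
tgt.execute("CREATE TABLE trades_dim (id INTEGER)")
tgt.executemany("INSERT INTO trades_dim VALUES (?)", [(1,), (2,), (3,)])
compare_row_counts(src, tgt, "trades", "trades_dim")

In practice a test suite would run one such check per mapped table, alongside column-level checksum and null-rate comparisons.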
Posted 6 days ago
5.0 - 9.0 years
14 - 24 Lacs
Hyderabad
Hybrid
Experience:
Required:
• Bachelor's degree in computer science or engineering.
• 7+ years of experience with data analytics, data modeling, and database design.
• 5+ years of experience with Vertica.
• 2+ years of coding and scripting (Python, Java, Scala) and design experience.
• 2+ years of experience with Airflow.
• Experience with ELT methodologies and tools.
• Experience with GitHub.
• Expertise in tuning and troubleshooting SQL.
• Strong data integrity, analytical, and multitasking skills.
• Excellent communication, problem-solving, organizational, and analytical skills.
• Able to work independently.
Additional/preferred skills:
• Familiar with the agile project delivery process.
• Knowledge of SQL and its use in data access and analysis.
• Ability to manage diverse projects impacting multiple roles and processes.
• Able to troubleshoot problem areas and identify data gaps and issues.
• Ability to adapt to a fast-changing environment.
• Experience designing and implementing automated ETL processes.
• Experience with the MicroStrategy reporting tool.
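As a rough illustration of the Airflow experience called for above, the sketch below defines a three-step ELT DAG. It assumes Airflow 2.4+ (for the schedule argument); the DAG id, task names, and placeholder callables are hypothetical and would be replaced by real extract/load/transform logic.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("extract: pull raw files into the landing zone")

def load():
    print("load: copy raw data into the warehouse (e.g., Vertica)")

def transform():
    print("transform: run in-database SQL transformations (the ELT pattern)")

with DAG(
    dag_id="daily_elt_example",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_extract >> t_load >> t_transform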
Posted 1 week ago
7.0 - 9.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: Data Analyst
Exp: 6-11 Yrs
Location: Hyderabad
Primary Skills: ETL, Informatica, Python, SQL, BI tools, and the Investment domain
Please share your resumes to rajamahender.n@technogenindia.com
Job Description:
The Minimum Qualifications
Education: Bachelor's or Master's degree in Data Science, Statistics, Mathematics, Computer Science, Actuarial Science, or a related field.
Experience:
• 7-9 years of experience as a Data Analyst, with at least 5 years supporting Finance within the insurance industry.
• Hands-on experience with Vertica/Teradata for querying, performance optimization, and large-scale data analysis.
• Advanced SQL skills; proficiency in Python is a strong plus.
• Proven ability to write detailed source-to-target mapping documents and collaborate with technical teams on data integration.
• Experience working in hybrid onshore-offshore team environments.
• Deep understanding of data modelling concepts and experience working with relational and dimensional models.
• Strong communication skills with the ability to clearly explain technical concepts to non-technical audiences.
• A strong understanding of statistical concepts, probability and accounting standards, financial statements (balance sheet, income statement, cash flow statement), and financial ratios.
• Strong understanding of life insurance products and business processes across the policy lifecycle.
• Investment Principles: Knowledge of different asset classes, investment strategies, and financial markets.
• Quantitative Finance: Understanding of financial modelling, risk management, and derivatives.
• Regulatory Framework: Awareness of relevant financial regulations and compliance requirements.
The Ideal Qualifications
Technical Skills:
• Proven track record of analytical and problem-solving skills.
• A solid understanding of financial accounting systems and knowledge of accounting principles, reporting, and budgeting.
• Strong data analysis skills for extracting insights from financial data.
• Proficiency in data visualization tools and reporting software is also important.
• Experience integrating financial systems with actuarial, policy administration, and claims platforms.
• Familiarity with actuarial processes, reinsurance, or regulatory reporting requirements.
• Experience with General Ledger systems such as SAP and forecasting tools like Anaplan.
Soft Skills:
• Exceptional communication and interpersonal skills.
• Ability to influence and motivate teams without direct authority.
• Excellent time management and organizational skills, with the ability to prioritize multiple initiatives.
What to Expect as Part of MassMutual and the Team
• Regular meetings with the Corporate Technology leadership team
• Focused one-on-one meetings with your manager
• Access to mentorship opportunities
• Access to learning content on Degreed and other informational platforms
• Your ethics and integrity will be valued by a company with a strong and stable ethical business and industry-leading pay and benefits.
Posted 1 week ago
0 years
0 Lacs
New Delhi, Delhi, India
On-site
The purpose of this role is to oversee the development of our database marketing solutions, using database technologies such as Microsoft SQL Server/Azure, Amazon Redshift, and Google BigQuery. The role is involved in design, development, troubleshooting, and issue resolution, and in upgrading, enhancing, and optimizing the technical solution. It covers continuous integration and continuous deployment of requirement changes in the business logic implementation, along with interactions with internal stakeholders and/or clients to explain technology solutions and build a clear understanding of the client's business requirements, through which to guide the optimal design/solution to meet their needs. The ability to communicate to both technical and non-technical audiences is key.
Job Description:
Must Have Skills:
• Database (SQL Server / Snowflake / Teradata / Redshift / Vertica / Oracle / BigQuery / Azure DW, etc.)
• ETL (Extract, Transform, Load) tools (Talend, Informatica, SSIS, DataStage, Matillion)
• Python, UNIX shell scripting
• Project & resource management
• Workflow orchestration (Tivoli, Tidal, Stonebranch)
• Client-facing skills
Good to have Skills:
• Experience in cloud computing (one or more of AWS, Azure, GCP); AWS preferred.
Key responsibilities:
• Understanding and practical knowledge of data warehouses, data marts, data modelling, data structures, databases, and data ingestion and transformation.
• Strong understanding of ETL processes as well as database skills and common IT offerings, e.g., storage, backups, and operating systems.
• Strong understanding of SQL and database programming languages.
• Strong knowledge of development methodologies and tools.
• Contributes to design and oversees code reviews for compliance with development standards.
• Designs and implements the technical vision for existing clients.
• Able to convert documented requirements into technical solutions and implement them within the given timeline without quality issues.
• Able to quickly identify solutions for production failures and fix them.
• Documents project architecture, explains the detailed design to the team, and creates low-level to high-level designs.
• Performs mid- to complex-level tasks independently.
• Supports clients, Data Scientists, and Analytical Consultants working on marketing solutions.
• Works with cross-functional internal teams and external clients.
• Strong project management and organizational skills.
• Ability to lead/work on 1-2 projects with a team size of 2-3 members.
• Code management systems, including code review and deployments.
Location: DGS India - Pune - Baner M-Agile
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Experience: 4-6 years | Location: Mumbai (Thane) | Only immediate joiners
Key Responsibilities
Database Engineering & Operations
• Own and manage critical components of the database infrastructure across production and non-production environments.
• Ensure performance, availability, scalability, and reliability of databases including PostgreSQL, MySQL, and MongoDB.
• Drive implementation of best practices in schema design, indexing, query optimization, and database tuning.
• Take initiative in root cause analysis and resolution of complex performance and availability issues.
• Implement and maintain backup, recovery, and disaster recovery procedures; contribute to testing and continuous improvement of these systems.
• Ensure system health through robust monitoring, alerting, and observability using tools such as Prometheus, Grafana, and CloudWatch.
• Implement and improve automation for provisioning, scaling, maintenance, and monitoring tasks using scripting (e.g., Python, Bash).
Database Security & Compliance
• Enforce database security best practices, including encryption at rest and in transit, IAM/RBAC, and audit logging.
• Support data governance and compliance efforts related to SOC 2, ISO 27001, or other regulatory standards.
• Collaborate with the security team on regular vulnerability assessments and hardening initiatives.
DevOps & Collaboration
• Partner with DevOps and Engineering teams to integrate database operations into CI/CD pipelines using tools like Liquibase, Flyway, or custom scripting.
• Participate in infrastructure-as-code workflows (e.g., Terraform) for consistent and scalable DB provisioning and configuration.
• Proactively contribute to cross-functional planning, deployments, and system design sessions with engineering and product teams.
Required Skills & Experience
• 4-6 years of production experience managing relational and NoSQL databases in cloud-native environments (AWS, GCP, or Azure).
• Proficiency in relational databases (PostgreSQL and/or MySQL) and NoSQL databases (MongoDB; exposure to Cassandra or DynamoDB is a plus).
• Deep hands-on experience in performance tuning, query optimization, and troubleshooting live systems.
• Strong scripting ability (e.g., Python, Bash) for automation of operational tasks.
• Experience in implementing monitoring and alerting for distributed systems using Grafana, Prometheus, or equivalent cloud-native tools.
• Understanding of security and compliance principles and how they apply to data systems.
• Ability to operate with autonomy while collaborating in fast-paced, cross-functional teams.
• Strong analytical, problem-solving, and communication skills.
Nice to Have (Bonus)
• Experience with Infrastructure as Code tools (Terraform, Pulumi, etc.) for managing database infrastructure.
• Familiarity with Kafka, Airflow, or other data pipeline tools.
• Experience working in multi-region or multi-cloud environments with high availability requirements.
• Exposure to analytics databases (e.g., Druid, ClickHouse, BigQuery, Vertica) or search platforms like Elasticsearch.
• Participation in on-call rotations and contribution to incident response processes.
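One example of the scripting/automation work this role describes is a health check for long-running queries. The sketch below assumes PostgreSQL with the psycopg2 driver; the DSN, threshold, and what you do with the results (push to Prometheus/Grafana, page someone) are illustrative choices, not part of the posting.

import psycopg2  # assumes a PostgreSQL target; other engines need their own drivers and views

LONG_RUNNING_SECONDS = 300  # illustrative threshold

def find_long_running_queries(dsn):
    # Return active queries that have exceeded the runtime threshold.
    sql = """
        SELECT pid, now() - query_start AS runtime, left(query, 80) AS query_text
        FROM pg_stat_activity
        WHERE state = 'active'
          AND now() - query_start > make_interval(secs => %s)
        ORDER BY runtime DESC;
    """
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(sql, (LONG_RUNNING_SECONDS,))
            return cur.fetchall()

if __name__ == "__main__":
    # Hypothetical DSN; in practice the results would feed monitoring or alerting.
    for pid, runtime, query_text in find_long_running_queries("dbname=app user=monitor"):
        print(f"pid={pid} runtime={runtime} query={query_text}")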
Posted 1 week ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: Data Engineer - Investment
Exp: 6-12 Yrs
Location: Hyderabad
Primary Skills: ETL, Informatica, SQL, Python, and the Investment domain
Please share your resumes to jyothsna.g@technogenindia.com
Job Description:
• 7-9 years of experience with data analytics, data modeling, and database design.
• 3+ years of coding and scripting (Python, Java, Scala) and design experience.
• 3+ years of experience with the Spark framework.
• 5+ years of experience with ELT methodologies and tools.
• 5+ years of mastery in designing, developing, tuning, and troubleshooting SQL.
• Knowledge of Informatica PowerCenter and Informatica IDMC.
• Knowledge of distributed, column-oriented technologies used to build high-performance databases such as Vertica and Snowflake.
• Strong data analysis skills for extracting insights from financial data.
• Proficiency in reporting tools (e.g., Power BI, Tableau).
The Ideal Qualifications
Technical Skills:
• Domain knowledge of Investment Management operations including Security Masters, Securities Trade and Recon Operations, Reference Data Management, and Pricing.
• Familiarity with regulatory requirements and compliance standards in the investment management industry.
• Experience with IBORs such as BlackRock Aladdin, CRD, Eagle STAR (ABOR), Eagle Pace, and Eagle DataMart.
• Familiarity with investment data platforms such as GoldenSource, FINBOURNE, NeoXam, RIMES, and JPM Fusion.
Soft Skills:
• Strong analytical and problem-solving abilities.
• Exceptional communication and interpersonal skills.
• Ability to influence and motivate teams without direct authority.
• Excellent time management and organizational skills, with the ability to prioritize multiple initiatives.
Posted 1 week ago
6.0 - 13.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Domain: Investment Banking (Asset management, Wealth management, Capital market, Equity, Fixed income)
Experience: 6-13 Years
JD:
• 7-9 years of experience as a Data Analyst, with at least 5 years supporting Investment within the industry.
• Hands-on experience with Vertica/Teradata for querying, performance optimization, and large-scale data analysis.
• Advanced SQL skills: proficiency in Python is a strong plus.
• Proven ability to write detailed source-to-target mapping documents and collaborate with technical teams on data integration.
• Experience working in hybrid onshore-offshore team environments.
• Deep understanding of data modeling concepts and experience working with relational and dimensional models.
• Strong communication skills with the ability to clearly explain technical concepts to non-technical audiences.
• A strong understanding of statistical concepts, probability and accounting standards, financial statements (balance sheet, income statement, cash flow statement), and financial ratios.
• Strong understanding of life insurance products and business processes across the policy lifecycle.
• Investment Principles: Knowledge of different asset classes, investment strategies, and financial markets.
• Quantitative Finance: Understanding of financial modeling, risk management, and derivatives.
• Regulatory Framework: Awareness of relevant financial regulations and compliance requirements.
Posted 1 week ago
5.0 years
1 - 10 Lacs
Hyderābād
On-site
AI-First. Future-Driven. Human-Centered. At OpenText, AI is at the heart of everything we do—powering innovation, transforming work, and empowering digital knowledge workers. We're hiring talent that AI can't replace to help us shape the future of information management. Join us.
YOUR IMPACT:
OpenText Vertica provides a state-of-the-art Big Data analytics platform that handles petabytes of data. It is a commercially successful, high-performance, distributed database. Every industry is finding ways to benefit from data analytics, and we continue to engineer our product to be flexible so that it supports all of them. Vertica is a recognized leader in analytics, powering some of the world's most data-driven organizations like Uber, Wayfair, Intuit, Cerner, and more. Our columnar, MPP, distributed database delivers unprecedented speed at petabyte scale, with analytics and machine learning functions built into the core.
WHAT THE ROLE OFFERS:
• Produce high quality code according to design specifications.
• Detailed technical design of highly complex software components.
• Analyse, troubleshoot, and fix highly complex code defects.
• Propose creative solutions or alternatives balancing risk, complexity, and effort to meet requirements.
• Lead software design/code reviews to ensure quality and adherence to company standards.
• Mentor other team members.
• Work across teams and functional roles to ensure interoperability among other products, including training and consultation.
• Participate in the software development process from design to release in an Agile development framework.
WHAT YOU WILL NEED TO SUCCEED:
• Bachelor's degree in computer science or a related field.
• 5+ years of product development experience.
• Strong proficiency in C++.
• Good knowledge of working with any database and SQL.
• Thorough knowledge of the standard library, STL containers, and algorithms.
• Good understanding of memory management.
• Good understanding of Linux-based OS and application development for the same.
• Hands-on experience with multithreading.
• Strong knowledge of building distributed applications.
• AWS and Kubernetes knowledge will be an advantage.
OpenText's efforts to build an inclusive work environment go beyond simply complying with applicable laws. Our Employment Equity and Diversity Policy provides direction on maintaining a working environment that is inclusive of everyone, regardless of culture, national origin, race, color, gender, gender identification, sexual orientation, family status, age, veteran status, disability, religion, or other basis protected by applicable laws. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please submit a ticket at Ask HR. Our proactive approach fosters collaboration, innovation, and personal growth, enriching OpenText's vibrant workplace.
Posted 1 week ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
The Impact:
• Design, build, and measure complex ELT jobs to process disparate data sources and form a high-integrity, high-quality, clean data asset.
• Execute and provide feedback for data modeling policies, procedures, processes, and standards.
• Assist with capturing and documenting system flow and other pertinent technical information about data, database design, and systems.
• Develop comprehensive data quality standards and implement effective tools to ensure data accuracy and reliability.
• Collaborate with various Investment Management departments to gain a better understanding of new data patterns.
• Collaborate with Data Analysts, Data Architects, and BI developers to ensure design and development of scalable data solutions aligning with business goals.
• Translate high-level business requirements into detailed technical specs.
The Minimum Qualifications
Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Experience:
• 7-9 years of experience with data analytics, data modeling, and database design.
• 3+ years of coding and scripting (Python, Java, Scala) and design experience.
• 3+ years of experience with the Spark framework.
• 5+ years of experience with ELT methodologies and tools.
• 5+ years of mastery in designing, developing, tuning, and troubleshooting SQL.
• Knowledge of Informatica PowerCenter and Informatica IDMC.
• Knowledge of distributed, column-oriented technologies used to build high-performance databases such as Vertica and Snowflake.
• Strong data analysis skills for extracting insights from financial data.
• Proficiency in reporting tools (e.g., Power BI, Tableau).
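To make the ELT and data-quality responsibilities above concrete, here is a simplified PySpark sketch that reads a raw extract, applies two illustrative quality rules, and writes a curated dataset. The file paths and column names (security_id, market_value) are assumptions; a production job would read from the firm's actual sources and load the warehouse (e.g., Vertica or Snowflake) through its bulk connector.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("elt_quality_check_example").getOrCreate()

# Hypothetical landing-zone extract.
raw = spark.read.option("header", True).csv("/data/landing/positions.csv")

# Illustrative quality rules: no null security IDs, no negative market values.
clean = (
    raw.withColumn("market_value", F.col("market_value").cast("double"))
       .filter(F.col("security_id").isNotNull())
       .filter(F.col("market_value") >= 0)
)

rejected = raw.count() - clean.count()
print(f"rows rejected by quality rules: {rejected}")

# Write the curated layer; the real target format/connector depends on the platform.
clean.write.mode("overwrite").parquet("/data/curated/positions")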
Posted 1 week ago
3.0 years
4 - 10 Lacs
Gurgaon
On-site
What You'll Do:
Criteo is in search of a passionate, highly motivated Data Analyst to join our Analytics team. You will turn business requests into data problems and tackle them in a scalable and efficient way, working together with analyst teams across Criteo locations. Aside from solving business challenges, this position also involves technically rigorous work, including the use of SQL, Excel, Hive, Python, and other leading-edge data tools. We are looking for a team player who is both business-driven and highly analytical. He or she will work with cross-functional business units to perform back-office data analysis and reporting that doesn't require market context or interaction with final customers. The ideal candidate will be able to take a recurrent business need and look for ways to address it in an automated and scalable way, both through process optimization and creation of dedicated tools. This role supports our EMEA business and work hours will be between 12.30pm IST and 9.30pm IST. This role is based in Gurgaon, India.
• Develop and share deep knowledge of Criteo's technology, products, and position in the marketplace.
• Provide actionable insights and create best practices to solve operational problems, and actively look for opportunities for scaling analysis and tools across different business units.
• Leverage Python and SQL to answer commercial requests.
• Own and maintain reports/tools in Tableau, Python, and other data tools.
• Conduct back-office ad-hoc analysis, problem-solving, and troubleshooting along with Root Cause Analysis.
• Automate persistent tasks to enhance efficiency and reduce delivery times.
• Collaborate with teams based in other countries to support their analytical needs.
Who You Are:
• Bachelor's degree or higher in a quantitative/business field (Mathematics, Statistics, Engineering, Economics, Business, Finance, etc.).
• At least 3+ years of work experience in a business/data analytics role, preferably from a consulting, product-tech, retail, or e-commerce background.
• Strong intellectual curiosity and ability to structure and solve difficult problems with minimal supervision.
• Excellent technical skills: strong SQL, basic Python, and visualization are a must.
• Effective business acumen and client-engaging skills to provide clear, actionable insights.
• Experience in any of the following is a plus: Excel, Tableau, Hive/Hadoop, Vertica, Git/Gerrit.
• Knowledge in agency or digital marketing is a plus.
We acknowledge that many candidates may not meet every single role requirement listed above. If your experience looks a little different from our requirements but you believe that you can still bring value to the role, we'd love to see your application!
Who We Are:
Criteo is the global commerce media company that enables marketers and media owners to deliver richer consumer experiences and drive better commerce outcomes through its industry leading Commerce Media Platform. At Criteo, our culture is as unique as it is diverse. From our offices around the world or from home, our incredible team of 3,600 Criteos collaborates to develop an open and inclusive environment.
We seek to ensure that all of our workers are treated equally, and we do not tolerate discrimination based on race, gender identity, gender, sexual orientation, color, national origin, religion, age, disability, political opinion, pregnancy, migrant status, ethnicity, marital or family status, or other protected characteristics at all stages of the employment lifecycle including how we attract and recruit, through promotions, pay decisions, benefits, career progression and development. We aim to ensure employment decisions and actions are based solely on business-related considerations and not on protected characteristics. As outlined in our Code of Business Conduct and Ethics, we strictly forbid any kind of discrimination, harassment, mistreatment or bullying towards colleagues, clients, suppliers, stakeholders, shareholders, or any visitors of Criteo. All of this supports us in our mission to power the world’s marketers with trusted and impactful advertising encouraging discovery, innovation and choice in an open internet. Why Join Us: At Criteo, we take pride in being a caring culture and are committed to providing our employees with valuable benefits that support their physical, emotional and financial wellbeing, their interests and the important life events. We aim to create a place where people can grow and learn from each other while having a meaningful impact. We want to set you up for success in your job, and an important part of that includes comprehensive perks & benefits. Benefits may vary depending on the country where you work and the nature of your employment with Criteo. When determining compensation, we carefully consider a wide range of job-related factors, including experience, knowledge, skills, education, and location. These factors can cause your compensation to vary.
Posted 1 week ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
This job is with Criteo, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly.
What You'll Do
Criteo is in search of a passionate, highly motivated Data Analyst to join our Analytics team. You will turn business requests into data problems and tackle them in a scalable and efficient way, working together with analyst teams across Criteo locations. Aside from solving business challenges, this position also involves technically rigorous work, including the use of SQL, Excel, Hive, Python, and other leading-edge data tools. We are looking for a team player who is both business-driven and highly analytical. He or she will work with cross-functional business units to perform back-office data analysis and reporting that doesn't require market context or interaction with final customers. The ideal candidate will be able to take a recurrent business need and look for ways to address it in an automated and scalable way, both through process optimization and creation of dedicated tools. This role supports our EMEA business and work hours will be between 12.30pm IST and 9.30pm IST. This role is based in Gurgaon, India.
• Develop and share deep knowledge of Criteo's technology, products, and position in the marketplace.
• Provide actionable insights and create best practices to solve operational problems, and actively look for opportunities for scaling analysis and tools across different business units.
• Leverage Python and SQL to answer commercial requests.
• Own and maintain reports/tools in Tableau, Python, and other data tools.
• Conduct back-office ad-hoc analysis, problem-solving, and troubleshooting along with Root Cause Analysis.
• Automate persistent tasks to enhance efficiency and reduce delivery times.
• Collaborate with teams based in other countries to support their analytical needs.
Who You Are
• Bachelor's degree or higher in a quantitative/business field (Mathematics, Statistics, Engineering, Economics, Business, Finance, etc.).
• At least 3+ years of work experience in a business/data analytics role, preferably from a consulting, product-tech, retail, or e-commerce background.
• Strong intellectual curiosity and ability to structure and solve difficult problems with minimal supervision.
• Excellent technical skills: strong SQL, basic Python, and visualization are a must.
• Effective business acumen and client-engaging skills to provide clear, actionable insights.
• Experience in any of the following is a plus: Excel, Tableau, Hive/Hadoop, Vertica, Git/Gerrit.
• Knowledge in agency or digital marketing is a plus.
We acknowledge that many candidates may not meet every single role requirement listed above. If your experience looks a little different from our requirements but you believe that you can still bring value to the role, we'd love to see your application!
Who We Are
Criteo is the global commerce media company that enables marketers and media owners to deliver richer consumer experiences and drive better commerce outcomes through its industry leading Commerce Media Platform. 🌟 At Criteo, our culture is as unique as it is diverse. From our offices around the world or from home, our incredible team of 3,600 Criteos collaborates to develop an open and inclusive environment.
We seek to ensure that all of our workers are treated equally, and we do not tolerate discrimination based on race, gender identity, gender, sexual orientation, color, national origin, religion, age, disability, political opinion, pregnancy, migrant status, ethnicity, marital or family status, or other protected characteristics at all stages of the employment lifecycle including how we attract and recruit, through promotions, pay decisions, benefits, career progression and development. We aim to ensure employment decisions and actions are based solely on business-related considerations and not on protected characteristics. As outlined in our Code of Business Conduct and Ethics, we strictly forbid any kind of discrimination, harassment, mistreatment or bullying towards colleagues, clients, suppliers, stakeholders, shareholders, or any visitors of Criteo. All of this supports us in our mission to power the world's marketers with trusted and impactful advertising encouraging discovery, innovation and choice in an open internet. Why Join Us At Criteo, we take pride in being a caring culture and are committed to providing our employees with valuable benefits that support their physical, emotional and financial wellbeing, their interests and the important life events. We aim to create a place where people can grow and learn from each other while having a meaningful impact. We want to set you up for success in your job, and an important part of that includes comprehensive perks & benefits. Benefits may vary depending on the country where you work and the nature of your employment with Criteo. When determining compensation, we carefully consider a wide range of job-related factors, including experience, knowledge, skills, education, and location. These factors can cause your compensation to vary.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Indore, Madhya Pradesh
On-site
You are a skilled Database Engineer responsible for designing, building, and maintaining reliable database systems to support applications and data infrastructure. Your expertise in database architecture, data modeling, and performance tuning, coupled with hands-on experience in SQL and NoSQL systems, is crucial for this role. Your primary responsibilities will include designing and implementing scalable and high-performing database architectures, optimizing complex queries, stored procedures, and indexing strategies, collaborating with backend engineers and data teams to model databases, performing data migrations, transformations, and integrations, and ensuring data consistency, integrity, and availability across distributed systems. You will also develop and maintain ETL pipelines, monitor database performance, automate repetitive tasks, deploy schema changes, and assist with database security practices. To excel in this role, you must have strong experience in relational databases such as PostgreSQL, MySQL, MS SQL Server, or Oracle, proficiency in writing optimized SQL queries, experience with NoSQL databases like MongoDB, Cassandra, DynamoDB, or Redis, a solid understanding of database design principles, and expertise in Oracle and GoldenGate. Additionally, hands-on experience with ETL pipelines, data transformation, scripting, version control systems, DevOps tools, cloud database services, data backup, disaster recovery, and high availability setups is essential. This is a full-time position located in Indore, requiring a minimum of 4 years of relevant experience. If you are passionate about database engineering, data management, and system performance optimization, we encourage you to apply and be part of our dynamic team. Please note that this job description is sourced from hirist.tech.
Posted 1 week ago
3.0 - 7.0 years
0 - 0 Lacs
Pune, Maharashtra
On-site
Job Description: As an ETL & BI Testing Specialist at IGS, you will play a crucial role in ensuring the accuracy and reliability of complex data processes and Business Intelligence systems. Your attention to detail and expertise in backend testing will contribute to maintaining data integrity and driving excellence in BI testing. Working in a collaborative and fast-paced environment, you will work closely with cross-functional teams to deliver high-quality results. Your responsibilities will include performing backend testing on intricate ETL workflows and data warehouse systems, validating BI reports and dashboards, and executing advanced SQL queries to support testing requirements. You will be tasked with identifying, troubleshooting, and documenting data issues within large datasets, utilizing tools like JIRA for test management and defect tracking. Your role will involve collaborating within Agile teams to ensure the delivery of high-quality outcomes.
Key Skills & Qualifications:
- Demonstrated expertise in SQL, with the ability to write and analyze complex queries.
- Hands-on experience in ETL/Data Warehouse testing and BI reporting validation.
- Proficiency in testing reports created with tools like Tableau or similar platforms.
- Familiarity with database systems such as Vertica, Oracle, or Teradata.
- Skillful use of test and defect management tools, particularly JIRA.
- Sound understanding of SDLC and Agile methodologies.
- Strong analytical, problem-solving, and communication skills.
What We Offer:
- Engage in real-world, large-scale data testing projects.
- Exposure to modern BI and ETL ecosystems.
- A collaborative culture that fosters innovation and precision.
- Opportunities for continuous learning and career growth in data quality and BI testing.
If you are passionate about ensuring data accuracy and enjoy unraveling complex data puzzles, we are excited to hear from you.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Support Engineer at Precisely, you will play a crucial role in driving solutions to complex issues and ensuring the success of our customers. Your technical expertise will be essential in supporting Precisely Data Integration investments, including various products and Sterling B2B Integrator. Your problem-solving skills, technical depth, effective communication, and ability to innovate will be key attributes for excelling in this role. In this position, your responsibilities will include providing top-notch technical support via phone, email, and remote desktop connections, meeting SLA requirements, updating stakeholders promptly, and documenting critical information. You will be tasked with swiftly resolving issues to guarantee customer satisfaction, and with investigating and solving complex problems across different platforms, software systems, and databases. Your understanding of enterprise systems will be pivotal in identifying the root cause of issues and recommending suitable solutions. Continuous learning and knowledge sharing are integral parts of this role. You will be expected to stay updated on new technologies, tools, and systems and share your insights with the team. Developing comprehensive internal and external Knowledge Base documentation will be essential for enhancing customer and team support. Additionally, you will contribute to debugging, suggesting solutions, and tools for product improvements. Requirements for this role include a Bachelor's or Master's degree in Computer Science or a related field, exceptional communication skills, strong analytical abilities, and a self-motivated approach to problem-solving. A keen interest in learning new technologies, understanding of software design principles, and proficiency in database management systems and networking design are essential. Experience with debugging, object-oriented languages, distributed computing, and various technologies will be advantageous. If you are enthusiastic about tackling challenging problems, working under tight deadlines, and providing excellent technical support, this role at Precisely offers a rewarding opportunity to grow and contribute to a leading organization in data integrity.
Posted 1 week ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
What You'll Do:
Criteo is in search of a passionate, highly motivated Data Analyst to join our Analytics team. You will turn business requests into data problems and tackle them in a scalable and efficient way, working together with analyst teams across Criteo locations. Aside from solving business challenges, this position also involves technically rigorous work, including the use of SQL, Excel, Hive, Python, and other leading-edge data tools. We are looking for a team player who is both business-driven and highly analytical. He or she will work with cross-functional business units to perform back-office data analysis and reporting that doesn't require market context or interaction with final customers. The ideal candidate will be able to take a recurrent business need and look for ways to address it in an automated and scalable way, both through process optimization and creation of dedicated tools. This role supports our EMEA business and work hours will be between 12.30pm IST and 9.30pm IST. This role is based in Gurgaon, India.
• Develop and share deep knowledge of Criteo's technology, products, and position in the marketplace.
• Provide actionable insights and create best practices to solve operational problems, and actively look for opportunities for scaling analysis and tools across different business units.
• Leverage Python and SQL to answer commercial requests.
• Own and maintain reports/tools in Tableau, Python, and other data tools.
• Conduct back-office ad-hoc analysis, problem-solving, and troubleshooting along with Root Cause Analysis.
• Automate persistent tasks to enhance efficiency and reduce delivery times.
• Collaborate with teams based in other countries to support their analytical needs.
Who You Are:
• Bachelor's degree or higher in a quantitative/business field (Mathematics, Statistics, Engineering, Economics, Business, Finance, etc.).
• At least 3+ years of work experience in a business/data analytics role, preferably from a consulting, product-tech, retail, or e-commerce background.
• Strong intellectual curiosity and ability to structure and solve difficult problems with minimal supervision.
• Excellent technical skills: strong SQL, basic Python, and visualization are a must.
• Effective business acumen and client-engaging skills to provide clear, actionable insights.
• Experience in any of the following is a plus: Excel, Tableau, Hive/Hadoop, Vertica, Git/Gerrit.
• Knowledge in agency or digital marketing is a plus.
We acknowledge that many candidates may not meet every single role requirement listed above. If your experience looks a little different from our requirements but you believe that you can still bring value to the role, we'd love to see your application!
Who We Are:
Criteo is the global commerce media company that enables marketers and media owners to deliver richer consumer experiences and drive better commerce outcomes through its industry leading Commerce Media Platform. 🌟 At Criteo, our culture is as unique as it is diverse. From our offices around the world or from home, our incredible team of 3,600 Criteos collaborates to develop an open and inclusive environment.
We seek to ensure that all of our workers are treated equally, and we do not tolerate discrimination based on race, gender identity, gender, sexual orientation, color, national origin, religion, age, disability, political opinion, pregnancy, migrant status, ethnicity, marital or family status, or other protected characteristics at all stages of the employment lifecycle including how we attract and recruit, through promotions, pay decisions, benefits, career progression and development. We aim to ensure employment decisions and actions are based solely on business-related considerations and not on protected characteristics. As outlined in our Code of Business Conduct and Ethics, we strictly forbid any kind of discrimination, harassment, mistreatment or bullying towards colleagues, clients, suppliers, stakeholders, shareholders, or any visitors of Criteo. All of this supports us in our mission to power the world’s marketers with trusted and impactful advertising encouraging discovery, innovation and choice in an open internet. Why Join Us: At Criteo, we take pride in being a caring culture and are committed to providing our employees with valuable benefits that support their physical, emotional and financial wellbeing, their interests and the important life events. We aim to create a place where people can grow and learn from each other while having a meaningful impact. We want to set you up for success in your job, and an important part of that includes comprehensive perks & benefits. Benefits may vary depending on the country where you work and the nature of your employment with Criteo. When determining compensation, we carefully consider a wide range of job-related factors, including experience, knowledge, skills, education, and location. These factors can cause your compensation to vary.
Posted 1 week ago
5.0 - 10.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Lead Software Engineer - Backend
We're seeking a Lead Software Engineer to join one of our Data Layer teams. As the name implies, the Data Layer is at the core of all things data at Zeta. Our responsibilities include:
• Developing and maintaining the Zeta Identity Graph platform, which collects billions of behavioural, demographic, environmental, and transactional signals to power people-based marketing.
• Ingesting vast amounts of identity and event data from our customers and partners.
• Facilitating data transfers across systems.
• Ensuring the integrity and health of our datasets.
• And much more.
As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations.
Essential Responsibilities:
As a Lead Software Engineer, your responsibilities will include:
• Building, refining, tuning, and maintaining our real-time and batch data infrastructure
• Daily use of technologies such as Python, Spark, Airflow, Snowflake, Hive, Scylla, Django, FastAPI, etc.
• Maintaining data quality and accuracy across production data systems
• Working with Data Engineers to optimize data models and workflows
• Working with Data Analysts to develop ETL processes for analysis and reporting
• Working with Product Managers to design and build data products
• Working with our DevOps team to scale and optimize our data infrastructure
• Participating in architecture discussions, influencing the road map, and taking ownership of and responsibility for new projects
• Participating in an on-call rotation in their respective time zone (be available by phone or email in case something goes wrong)
Desired Characteristics:
• Minimum 5 years of software engineering experience.
• Proven long-term experience with, and enthusiasm for, distributed data processing at scale; eagerness to learn new things.
• Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments.
• Exposure to the whole software development lifecycle from inception to production and monitoring.
• Fluency in Python or solid experience in Scala or Java.
• Proficiency with relational databases and advanced SQL.
• Expert in the usage of services like Spark and Hive.
• Experience with web frameworks such as Flask and Django.
• Experience with schedulers such as Apache Airflow, Apache Luigi, Chronos, etc.
• Experience using cloud services (AWS) at scale.
• Experience in agile software development processes.
• Excellent interpersonal and communication skills.
Nice to have:
• Experience with large-scale / multi-tenant distributed systems.
• Experience with columnar / NoSQL databases - Vertica, Snowflake, HBase, Scylla, Couchbase.
• Experience with real-time streaming frameworks - Flink, Storm.
• Experience with open table formats such as Iceberg, Hudi, or Delta Lake.
Posted 1 week ago
3.0 - 8.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Programmatic Support Engineer - Product Support
Job Description: You will join the support team responsible for providing technical assistance to Zeta clients and internal business functions. This is a customer-facing role and requires excellent prioritization, responsiveness, and customer service, along with excellent verbal communication skills.
• Answering questions from customers about the features and capabilities of our Zeta Application products.
• Developing customer-facing documentation on using certain features on an as-needed basis.
• Ensuring that end-to-end display campaigns are run effectively, including tagging, trafficking, and optimization.
• Becoming a subject matter expert on Programmatic topics such as platform functionality, campaign best practices, pixel implementation, creative troubleshooting, and more.
• Providing technical support for Programmatic platforms, campaign performance, and external DSP tools.
• Triaging support tickets with an issue summary, urgency, and next steps when input is needed from backend engineering teams.
Shift Timings: Night Shift (EST & PST)
Education: BSC / BTech / MCA / MSC
Must have Skills - Functional Skills and Experiences:
• At least 3+ years of experience in a 24/7 environment providing technical support.
• Extensive problem-solving and debugging skills.
• Excellent interpersonal and communication skills.
• Flexible in working outside of core business hours at short notice.
• Should have excellent written and verbal communication skills.
• Experience of managing customers across locations/geographies is preferred.
• Deep knowledge of the programmatic ecosystem.
• In-depth understanding of DSPs, programmatic advertising, real-time bidding, and ad operations.
• Demonstrated analytical ability.
• Experience with troubleshooting ad delivery issues, pixel/tag implementation, and bid optimization.
• Experience using DSPs including (but not limited to) DoubleClick Bid Manager, The Trade Desk, and AppNexus; Zeta DSP is a plus.
• Deep understanding of the Ad Tech industry and how Demand-Side Platforms (DSPs), Ad Servers, Attribution Platforms, etc. work in conjunction.
Technical Skills and Experiences:
• Strong MySQL/Oracle database skills, with a minimum of 2 years of work experience involving databases (MySQL, Vertica, Hive).
• Good knowledge of, and hands-on experience with, the Linux operating system.
• Web technologies and networking basics.
Good to Have: Certification in programmatic platforms (e.g., Google Marketing Platform, The Trade Desk).
Posted 1 week ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: Data Engineer - Investment
Exp: 6-10 Yrs
Location: Hyderabad
Primary Skills: ETL, Informatica, SQL, Python, and the Investment domain
Please share your resumes to jyothsna.g@technogenindia.com
Job Description:
• 7-9 years of experience with data analytics, data modeling, and database design.
• 3+ years of coding and scripting (Python, Java, Scala) and design experience.
• 3+ years of experience with the Spark framework.
• 5+ years of experience with ELT methodologies and tools.
• 5+ years of mastery in designing, developing, tuning, and troubleshooting SQL.
• Knowledge of Informatica PowerCenter and Informatica IDMC.
• Knowledge of distributed, column-oriented technologies used to build high-performance databases such as Vertica and Snowflake.
• Strong data analysis skills for extracting insights from financial data.
• Proficiency in reporting tools (e.g., Power BI, Tableau).
The Ideal Qualifications
Technical Skills:
• Domain knowledge of Investment Management operations including Security Masters, Securities Trade and Recon Operations, Reference Data Management, and Pricing.
• Familiarity with regulatory requirements and compliance standards in the investment management industry.
• Experience with IBORs such as BlackRock Aladdin, CRD, Eagle STAR (ABOR), Eagle Pace, and Eagle DataMart.
• Familiarity with investment data platforms such as GoldenSource, FINBOURNE, NeoXam, RIMES, and JPM Fusion.
Soft Skills:
• Strong analytical and problem-solving abilities.
• Exceptional communication and interpersonal skills.
• Ability to influence and motivate teams without direct authority.
• Excellent time management and organizational skills, with the ability to prioritize multiple initiatives.
Posted 1 week ago
4.0 - 7.0 years
7 - 11 Lacs
Noida
Work from Office
• Design and develop robust data pipelines and ETL processes
• Support data migration from traditional systems (Oracle, Vertica) to modern cloud platforms (Snowflake, AWS)
• Build and maintain scalable data ingestion and transformation frameworks
• Optimize and tune data queries for performance and efficiency
• Implement and manage cloud-based data solutions (Snowflake, AWS, Azure)
• Work closely with architects and senior engineers on solution design
• Participate in Agile development practices (Scrum/Kanban)
• Conduct unit testing, troubleshoot issues, and ensure data quality
• Document data models, processes, and best practices
Technical Skills:
• Cloud Platforms: AWS (preferred), Snowflake, Azure (nice to have)
• Databases: Oracle, SQL Server, Vertica, MongoDB
• Languages: Python, SQL, Shell scripting (Java is a plus)
• Tools: Spark (preferred), Tableau, Jenkins, Git, Docker (basic understanding)
• Frameworks: Familiarity with REST APIs and microservices architecture
• Data Skills: Data modeling, ETL, data migration, data quality
• Agile Methodologies: Experience working in Scrum/Kanban teams
Mandatory Competencies: Cloud - Snowflake; Beh - Communication; Database - SQL Server - SQL Packages
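A very rough sketch of the migration-style extract-and-load work mentioned above, using pandas with SQLAlchemy engines. The connection URLs, table name, and chunk size are placeholders, and a production Oracle/Vertica-to-Snowflake migration would normally rely on bulk loaders (staged files plus COPY) rather than row-wise inserts; this only illustrates the general shape of the task.

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection strings; real ones depend on the drivers and accounts in use.
source = create_engine("oracle+oracledb://user:pass@legacy-host:1521/?service_name=ORCL")
target = create_engine("snowflake://user:pass@account/db/schema")  # needs snowflake-sqlalchemy

# Stream the source table in chunks and append each chunk to the target table.
for chunk in pd.read_sql("SELECT * FROM sales_orders", source, chunksize=50_000):
    chunk.to_sql("sales_orders", target, if_exists="append", index=False)
    print(f"loaded {len(chunk)} rows")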
Posted 1 week ago
1.0 - 6.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Shift Timings: US (EST) Night Shifts
The Programmatic Analytics team at Zeta Global provides full reporting, actionable audience insights, and campaign strategy support to both external customers and internal teams. While working with external customers, our analysts partner with the Zeta Sales and Account Management teams to monitor Zeta's optimization platform across various verticals, design real-time Brand Lift surveys, and deliver actionable insight presentations for all Programmatic clients. Liaising with internal teams, our analysts also provide feedback to our Product and Engineering teams on Zeta's optimization platform, task automation, and A.I. model features to advance the reliability, reach, and effectiveness of programmatic campaigns and to help drive overall company revenue goals.
As an Analyst, you'll be responsible for a number of tasks, including (but not limited to) compiling campaign performance and audience insights for all executed media, setting up Zeta surveys, assisting in monitoring survey performance, helping with optimization efforts, and helping drive campaign performance. You'll work closely with the Sales and Customer Success teams to achieve client goals through brand and acquisition campaigns, campaign optimization, and/or online A/B testing strategy. While every day will offer a different challenge, day-to-day your role will include:
• Diving into large campaign data sets, uncovering insights, and providing impactful recommendations for clients through thoughtfully crafted storytelling. Deliverables include monthly campaign reporting and quarterly and campaign wrap reporting via PowerPoint.
• Setting up and monitoring Zeta client surveys and Zeta Marketing surveys.
• Working with the Zeta Sales and Account Management teams to ensure surveys are executed successfully.
• Working with internal teams to compile survey insights once surveys are completed.
• Assisting in pulling and setting up custom reports, as requested.
• Assisting in growing Zeta's Programmatic revenue quarter over quarter via campaign support, optimizations, and incremental.
Who you are:
• A great communicator, comfortable speaking with clients, team members, and C-level members alike, who can convey complex technical features in simple terms.
• Someone with an aptitude for media and strategy, able to contextually relay concepts to clients.
• Able to multitask and prioritize high-priority requests with specific SLAs.
• Have a high degree of creativity, self-motivation, and drive.
• Eager to work in a team environment that will be constantly changing day to day.
• Enthusiastic team player with a penchant for collaboration and knowledge sharing.
• Data-driven, technical, self-starting, and curious.
What you need:
• 1 year of working experience in the Programmatic, AdTech, or MarTech space; Programmatic Advertising knowledge a plus.
• Experience with SQL query language, Tableau, Hive, Python, Vertica, PowerPoint, and Excel/pivot tables.
• Excellent presentation/visualization/storytelling skills.
• Excellent troubleshooting and diagnostic skills.
• Professional oral and written communication skills.
• Bachelor's degree in Media, Business, Economics, Statistics, Marketing, and/or equivalent experience.
Bonus if you have:
• Experience in a digital media/Programmatic analytics role.
• Experience pulling data and putting together actionable audience insights reports across different programmatic channels.
• Experience with VBA/Excel Macros, Tableau, Python, or other data manipulation tools.
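As a small illustration of the reporting work described above, the pandas sketch below aggregates made-up campaign rows into a performance summary with a conversion-rate column. In practice the input would come from a warehouse query (e.g., Hive or Vertica via SQL) and the output would feed Tableau dashboards or PowerPoint wrap reports; the column names and figures are purely illustrative.

import pandas as pd

# Made-up sample rows standing in for warehouse query results.
data = pd.DataFrame({
    "campaign": ["A", "A", "B", "B"],
    "channel": ["display", "video", "display", "video"],
    "impressions": [120_000, 45_000, 80_000, 60_000],
    "conversions": [240, 90, 120, 150],
})

# Roll up to one row per campaign and derive a conversion-rate metric.
report = data.pivot_table(index="campaign", values=["impressions", "conversions"], aggfunc="sum")
report["cvr"] = report["conversions"] / report["impressions"]
print(report.round(4))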
Posted 1 week ago
7.0 - 9.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Domain: Insurance & Finance
Experience:
• 7-9 years of experience as a Data Analyst, with at least 5 years supporting Finance within the insurance industry.
• Hands-on experience with Vertica/Teradata for querying, performance optimization, and large-scale data analysis.
• Advanced SQL skills: proficiency in Python is a strong plus.
• Proven ability to write detailed source-to-target mapping documents and collaborate with technical teams on data integration.
• Experience working in hybrid onshore-offshore team environments.
• Deep understanding of data modeling concepts and experience working with relational and dimensional models.
• Strong communication skills with the ability to clearly explain technical concepts to non-technical audiences.
• A strong understanding of statistical concepts, probability and accounting standards, financial statements (balance sheet, income statement, cash flow statement), and financial ratios.
• Strong understanding of life insurance products and business processes across the policy lifecycle.
• Investment Principles: Knowledge of different asset classes, investment strategies, and financial markets.
• Quantitative Finance: Understanding of financial modeling, risk management, and derivatives.
• Regulatory Framework: Awareness of relevant financial regulations and compliance requirements.
Posted 1 week ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
The Senior Product Manager holds responsibility for designing, developing, and overseeing activities related to a specific product or group of products. This oversight encompasses everything from defining the product and planning its development to production and go-to-market strategies. Additionally, the Product Manager is tasked with crafting the product roadmap necessary to achieve the bookings, client NPS, and gross margin targets associated with their component. To facilitate organic growth, the Product Manager collaborates with internal stakeholders, clients, and prospects to identify new product capability requirements. They maintain close collaboration with their development teams to ensure the successful creation and introduction of these new capabilities to the market. Furthermore, the Product Manager takes charge of testing and implementing these new features with clients and actively promotes future growth to a broader audience of Clearwater clients and prospects.

Responsibilities:
• Team span: responsible for handling a team of 20-50 developers.
• Prioritizes decisions across products.
• Establishes alignment on the product roadmap among multiple development teams.
• Exerts influence on shaping the company's roadmap.
• Efficiently leads the development of cross-product capabilities.
• Contributes to the formulation of the department's development and training plan.
• Advocates for a culture of communication throughout the organization.
• Is recognized as an industry expert and frequently represents CW on industry forum panels.
• Proficiently evaluates opportunities in uncharted territory.
• Independently identifies, assesses, and potentially manages partnership relationships with external parties.
• Delivers leadership and expertise to our continually expanding workforce.

Required Skills:
• Domain knowledge: strong understanding of the alternative investments ecosystem, including (but not limited to) limited partnerships, mortgage loans, direct loans, private equity, and other non-traditional asset classes.
• AI/GenAI exposure (preferred): experience in AI or GenAI-based projects, particularly in building platforms or solutions using Generative AI technologies, will be considered a strong advantage.
• Proven track record as a Product Manager (ideal but not vital) who owns all aspects of a successful product throughout its lifecycle in a B2B environment.
• Knowledge of investments and investment accounting (very important).
• Exemplary interpersonal, communication, and project management skills.
• Excellent team and relationship building abilities, with both internal and external parties (engineers, business stakeholders, partners, etc.).
• Ability to work well under pressure, multitask, and maintain keen attention to detail.
• Strong leadership skills, including the ability to influence via diplomacy and tact.
• Experience working with cloud platforms (AWS/Azure/GCP).
• Ability to work with relational and NoSQL databases.
• Strong computer skills, including proficiency in Microsoft Office.
• Excellent attention to detail and strong documentation skills.
• Outstanding verbal and written communication skills.
• Strong organizational and interpersonal skills.
• Exceptional problem-solving abilities.

Education and Experience:
• Bachelor's/master's degree in engineering or a related field.
• 7+ years of relevant experience.
• Professional experience in building distributed software systems, specializing in big data and NoSQL database technologies (Hadoop, Spark, DynamoDB, HBase, Hive, Cassandra, Vertica).
• Experience working with indexing systems such as Elasticsearch and Solr/Lucene.
• Experience working with messaging systems such as Kafka/SQS/SNS.
Posted 1 week ago
4.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Job Summary
We are looking for a skilled Database Engineer to design, build, and maintain reliable database systems that support our applications and data infrastructure. The ideal candidate will have strong technical expertise in database architecture, data modeling, and performance tuning, along with hands-on experience in both SQL and NoSQL systems.

Location: Indore
Job Type: Full-Time
Experience: 4+ Years

Key Responsibilities
• Design and implement scalable and high-performing database architectures.
• Build and optimize complex queries, stored procedures, and indexing strategies (see the sketch after this posting).
• Collaborate with backend engineers and data teams to model and structure databases that meet application requirements.
• Perform data migrations, transformations, and integrations across environments.
• Ensure data consistency, integrity, and availability across distributed systems.
• Develop and maintain ETL pipelines and real-time data flows.
• Monitor database performance and implement tuning improvements.
• Automate repetitive database tasks and deploy schema changes.
• Assist with database security practices and access control policies.
• Support production databases and troubleshoot incidents or outages.

Required Skills And Qualifications
• Strong experience with relational databases such as PostgreSQL, MySQL, MS SQL Server, or Oracle.
• Proficiency in writing optimized SQL queries and performance tuning.
• Experience with NoSQL databases such as MongoDB, Cassandra, DynamoDB, or Redis.
• Solid understanding of database design principles, normalization, and data warehousing.
• Strong expertise in Oracle and GoldenGate.
• Experience with database platforms such as Vertica, Couchbase Capella, or CockroachDB.
• Hands-on experience with ETL pipelines, data transformation, and scripting (e.g., Python, Bash).
• Familiarity with version control systems (e.g., Git) and DevOps tools (e.g., Docker, Kubernetes, Jenkins).
• Knowledge of cloud database services (e.g., AWS RDS, Google Cloud SQL, Azure SQL Database).
• Experience with data backup, disaster recovery, and high availability setups.
(ref:hirist.tech)
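Illustrative only: a minimal, self-contained sketch of the indexing-strategy and query-tuning work this role describes. It uses SQLite from the Python standard library purely so it runs anywhere; the table, columns, and index name are assumptions, and the same idea applies to the PostgreSQL/MySQL/Oracle systems named above (where EXPLAIN/EXPLAIN PLAN would be used instead).

# Hypothetical sketch: compare query plans before and after adding a composite index.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, status TEXT, total REAL)")
cur.executemany(
    "INSERT INTO orders (customer_id, status, total) VALUES (?, ?, ?)",
    [(i % 500, "OPEN" if i % 3 else "CLOSED", i * 1.5) for i in range(10_000)],
)

query = "SELECT COUNT(*), SUM(total) FROM orders WHERE customer_id = 42 AND status = 'OPEN'"

# Plan before adding an index: a full table scan of orders.
print(cur.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# Composite index matching the WHERE clause; column order follows the predicates being filtered.
cur.execute("CREATE INDEX idx_orders_customer_status ON orders (customer_id, status)")

# Plan after: the optimizer should now report a search using idx_orders_customer_status.
print(cur.execute("EXPLAIN QUERY PLAN " + query).fetchall())
conn.close()

The design choice being illustrated is to derive indexes from real query predicates and verify them against the optimizer's plan, rather than indexing columns speculatively.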
Posted 2 weeks ago