12.0 years
0 Lacs
India
Remote
Job Title: Senior Solution Architect – Data & Cloud
Experience: 12+ Years
Location: Hybrid / Remote
Employment Type: Full-time

About Company:
We are a data and analytics firm that provides the strategies, tools, capability, and capacity that businesses need to turn their data into a competitive advantage. USEReady partners with cloud and data ecosystem leaders like Tableau, Salesforce, Snowflake, Starburst, and Amazon Web Services, and has been named Tableau Partner of the Year multiple times. Headquartered in NYC, the company has 450 employees across offices in the U.S., Canada, India, and Singapore and specializes in financial services. USEReady's deep analytics expertise, unique player/coach approach, and focus on fast results make the company a perfect partner for a cloud-first, digital world.

About the Role:
We are looking for a highly experienced Senior Solution Architect to join our Migration Works practice, specializing in modern data platforms and visualization tools. The ideal candidate will bring deep technical expertise in Tableau, Power BI, AWS, and Snowflake, along with strong client-facing skills and the ability to design scalable, high-impact data solutions. You will be at the forefront of driving our AI-driven migration and modernization initiatives, working closely with customers to understand their business needs and guiding delivery teams to success.

Key Responsibilities:
Solution Design & Architecture
- Lead the end-to-end design of cloud-native data architectures using the AWS, Snowflake, and Azure stacks.
- Translate complex business requirements into scalable and efficient technical solutions.
- Architect modernization strategies for moving legacy BI systems to cloud-native platforms.
Client Engagement
- Conduct technical discussions with enterprise clients and stakeholders to assess needs and define a roadmap.
- Act as a trusted advisor during pre-sales and delivery phases, showcasing technical leadership and a consultative approach.
Migration & Modernization
- Design frameworks for data platform migration (from on-premise to cloud), data warehousing, and analytics transformation.
- Support estimation, planning, and scoping of migration projects.
Team Leadership & Delivery Oversight
- Guide and mentor delivery teams across geographies, ensuring solution quality and alignment to client goals.
- Support delivery by providing architectural oversight and resolving design bottlenecks.
- Conduct technical reviews, define best practices, and uplift the team's capabilities.

Required Skills & Experience:
- 15+ years of progressive experience in data and analytics, with at least 5 years in solution architecture roles.
- Strong hands-on expertise in:
  - Tableau and Power BI – dashboard design, visualization architecture, and migration from legacy BI tools.
  - AWS – S3, Redshift, Glue, Lambda, and data pipeline components (see the sketch after this listing).
  - Snowflake – architecture, SnowConvert, data modeling, security, and performance optimization.
- Experience migrating legacy platforms (e.g., Cognos, BO, Qlik) to modern BI/cloud-native stacks such as Tableau and Power BI.
- Proven ability to interface with senior client stakeholders, understand business problems, and propose architectural solutions.
- Strong leadership, communication, and mentoring skills.
- Familiarity with data governance, security, and compliance in cloud environments.

Preferred Qualifications:
- AWS/Snowflake certifications are a strong plus.
- Exposure to data catalogs, lineage tools, and metadata management.
- Knowledge of ETL/ELT tools such as Talend, Informatica, or dbt.
- Prior experience working in consulting or fast-paced client services environments.

What We Offer:
- Opportunity to work on cutting-edge, AI-led cloud and data migration projects.
- A collaborative and high-growth environment with room to shape future strategy.
- Access to learning programs, certifications, and technical leadership exposure.
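To give a concrete flavor of the AWS-to-Snowflake pipeline work this role describes, here is a minimal, hedged sketch of staging and bulk-loading S3 data into Snowflake with the Python connector. All connection details and object names (MIGRATION_DB, RAW.SALES, sales_stage) are hypothetical placeholders, not details from this posting.

```python
# Illustrative only: load CSV files landed in S3 into Snowflake via an
# external stage. Every name and credential below is a placeholder.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="MIGRATION_DB",
    schema="RAW",
)
cur = conn.cursor()
try:
    # Stage pointing at the S3 bucket the pipeline writes to
    cur.execute("""
        CREATE STAGE IF NOT EXISTS sales_stage
        URL = 's3://example-bucket/sales/'
        CREDENTIALS = (AWS_KEY_ID='...' AWS_SECRET_KEY='...')
    """)
    # Bulk-load the staged files; bad rows are skipped and reported
    cur.execute("""
        COPY INTO RAW.SALES
        FROM @sales_stage
        FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
        ON_ERROR = 'CONTINUE'
    """)
finally:
    cur.close()
    conn.close()
```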
Posted 5 days ago
9.0 - 15.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Snowflake Data Architect
Experience: 9 to 15 Years
Location: Gurugram

Job Summary:
We are seeking a highly experienced and motivated Snowflake Data Architect & ETL Specialist to join our growing Data & Analytics team. The ideal candidate will be responsible for designing scalable Snowflake-based data architectures, developing robust ETL/ELT pipelines, and ensuring data quality, performance, and security across multiple data environments. You will work closely with business stakeholders, data engineers, and analysts to drive actionable insights and ensure data-driven decision-making.

Key Responsibilities:
- Design, develop, and implement scalable Snowflake-based data architectures.
- Build and maintain ETL/ELT pipelines using tools such as Informatica, Talend, Apache NiFi, Matillion, or custom Python/SQL scripts.
- Optimize Snowflake performance through clustering, partitioning, and caching strategies (a sketch follows this listing).
- Collaborate with cross-functional teams to gather data requirements and deliver business-ready solutions.
- Ensure data quality, governance, integrity, and security across all platforms.
- Migrate legacy data warehouses (e.g., Teradata, Oracle, SQL Server) to Snowflake.
- Automate data workflows and support CI/CD deployment practices.
- Implement data modeling techniques including dimensional modeling, star/snowflake schemas, and normalization/denormalization.
- Support and promote metadata management and data governance best practices.

Technical Skills (Hard Skills):
- Expertise in Snowflake: architecture design, performance tuning, cost optimization.
- Strong proficiency in SQL, Python, and scripting for data engineering tasks.
- Hands-on experience with ETL tools: Informatica, Talend, Apache NiFi, Matillion, or similar.
- Proficient in data modeling (dimensional, relational, star/snowflake schema).
- Good knowledge of cloud platforms: AWS, Azure, or GCP.
- Familiar with orchestration and workflow tools such as Apache Airflow, dbt, or DataOps frameworks.
- Experience with CI/CD tools and version control systems (e.g., Git).
- Knowledge of BI tools such as Tableau, Power BI, or Looker.

Certifications (Preferred/Required):
✅ Snowflake SnowPro Core Certification – required or highly preferred
✅ SnowPro Advanced Architect Certification – preferred
✅ Cloud certifications (e.g., AWS Certified Data Analytics – Specialty, Azure Data Engineer Associate) – preferred
✅ ETL tool certifications (e.g., Talend, Matillion) – optional but a plus

Soft Skills:
- Strong analytical and problem-solving capabilities.
- Excellent communication and collaboration skills.
- Ability to translate technical concepts into business-friendly language.
- Proactive, detail-oriented, and highly organized.
- Capable of multitasking in a fast-paced, dynamic environment.
- Passionate about continuous learning and adopting new technologies.

Why Join Us?
- Work on cutting-edge data platforms and cloud technologies
- Collaborate with industry leaders in analytics and digital transformation
- Be part of a data-first organization focused on innovation and impact
- Enjoy a flexible, inclusive, and collaborative work culture
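As an illustration of the Snowflake performance-tuning responsibility mentioned above, here is a minimal, hypothetical sketch of applying and checking a clustering key; the table and column names are invented for the example.

```python
# Hypothetical example: cluster a large fact table on commonly filtered
# columns, then inspect clustering quality. Object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="architect", password="***",
    warehouse="TUNING_WH", database="ANALYTICS", schema="FACTS",
)
cur = conn.cursor()
try:
    # Cluster the fact table on the columns most queries filter by
    cur.execute("""
        ALTER TABLE FACTS.ORDERS
        CLUSTER BY (ORDER_DATE, REGION_ID)
    """)
    # SYSTEM$CLUSTERING_INFORMATION reports depth/overlap statistics,
    # which indicate whether the clustering key is effective
    cur.execute("""
        SELECT SYSTEM$CLUSTERING_INFORMATION(
            'FACTS.ORDERS', '(ORDER_DATE, REGION_ID)')
    """)
    print(cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```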
Posted 5 days ago
0.0 - 3.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Location: Bengaluru, Karnataka, India
Job ID: R-232528
Date posted: 28/07/2025

Job Title: Analyst – Data Engineer

Introduction to role:
Are you ready to make a difference in the world of data science and advanced analytics? As a Data Engineer within the Commercial Strategic Data Management team, you'll play a pivotal role in transforming data science solutions for the Rare Disease Unit. Your mission will be to craft, develop, and deploy data science solutions that have a real impact on patients' lives. By leveraging cutting-edge tools and technology, you'll enhance delivery performance and data engineering capabilities, creating a seamless platform for the Data Science team and driving business growth. Collaborate closely with the Data Science and Advanced Analytics team, US Commercial leadership, Sales Field Team, and Field Operations to build data science capabilities that meet commercial needs. Are you ready to take on this exciting challenge?

Accountabilities:
- Collaborate with the Commercial multi-functional team to find opportunities for using internal and external data to enhance business solutions.
- Work closely with business and advanced data science teams on cross-functional projects, delivering complex data science solutions that contribute to the Commercial Organization.
- Manage platforms and processes for complex projects using a wide range of data engineering techniques in advanced analytics.
- Prioritize business and information needs with management; translate business logic into technical requirements, such as creating queries, stored procedures, and scripts.
- Interpret data, process it, analyze results, present findings, and provide ongoing reports.
- Develop and implement databases, data collection systems, data analytics, and strategies that optimize data efficiency and quality.
- Acquire data from primary or secondary sources and maintain databases/data systems.
- Identify and define new process improvement opportunities.
- Manage and support data solutions in BAU scenarios, including data profiling, designing data flows, creating business alerts for fields, and query optimization for ML models.
Essential Skills/Experience:
- BS/MS in a quantitative field (Computer Science, Data Science, Engineering, Information Systems, Economics)
- 5+ years of work experience with database and programming skills such as Python, SQL, Snowflake, Amazon Redshift, MongoDB, Apache Spark, Apache Airflow, AWS cloud and Amazon S3, Oracle, and Teradata
- Good experience in Apache Spark, Talend Administration Center, AWS Lambda, MongoDB, Informatica, or SQL Server Integration Services
- Experience in building ETL pipelines and data integration
- Efficient data management: extract, consolidate, and store large datasets with improved data quality and consistency
- Streamlined data transformation: convert raw data into usable formats at scale, automate tasks, and apply business rules
- Good written and verbal skills to communicate complex methods and results to diverse audiences; willing to work in a cross-cultural environment
- Analytical mind with a problem-solving inclination; proficiency in data manipulation, cleansing, and interpretation
- Experience in support and maintenance projects, including ticket handling and process improvement
- Workflow orchestration: schedule and manage data pipelines for smooth flow and automation (a minimal Airflow sketch follows this listing)
- Scalability and performance: handle large data volumes with optimized processing capabilities
- Experience with Git

Desirable Skills/Experience:
- Knowledge of distributed computing and big data technologies like Hive, Spark, Scala, HDFS; use of these technologies along with statistical tools like Python/R
- Experience working with HTTP requests/responses and REST API services
- Familiarity with data visualization tools like Tableau, Qlik, Power BI, Excel charts/reports
- Working knowledge of Salesforce/Veeva CRM, data governance, and data mining algorithms
- Hands-on experience with EHR, administrative claims, and laboratory data (e.g., Prognos, IQVIA, Komodo, Symphony claims data)
- Good experience in consulting, healthcare, or biopharmaceuticals

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace, and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.

At AstraZeneca's Alexion division, you'll find an environment where your work truly matters. Embrace the opportunity to grow and innovate within a rapidly expanding portfolio. Experience the entrepreneurial spirit of a leading biotech combined with the resources of a global pharma. You'll be part of an energizing culture where connections are built to explore new ideas. As a member of our commercial team, you'll meet the needs of under-served patients worldwide. With tailored development programs designed for skill enhancement and fostering empathy for patients' journeys, you'll align your growth with our mission. Supported by exceptional leaders and peers across marketing and compliance, you'll drive change with integrity in a culture celebrating diversity and innovation. Ready to make an impact? Apply now to join our team!

Date Posted: 29-Jul-2025
Closing Date: 04-Aug-2025

Alexion is proud to be an Equal Employment Opportunity and Affirmative Action employer. We are committed to fostering a culture of belonging where every single person can belong because of their uniqueness.
The Company will not make decisions about employment, training, compensation, promotion, and other terms and conditions of employment based on race, color, religion, creed or lack thereof, sex, sexual orientation, age, ancestry, national origin, ethnicity, citizenship status, marital status, pregnancy (including childbirth, breastfeeding, or related medical conditions), parental status (including adoption or surrogacy), military status, protected veteran status, disability, medical condition, gender identity or expression, genetic information, mental illness, or other characteristics protected by law. Alexion provides reasonable accommodations to meet the needs of candidates and employees. To begin an interactive dialogue with Alexion regarding an accommodation, please contact accommodations@Alexion.com. Alexion participates in E-Verify.
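The orchestration experience listed above centers on scheduling and managing data pipelines; a minimal, hypothetical Apache Airflow sketch of such a pipeline follows. The DAG id, schedule, and task logic are invented for illustration.

```python
# Hypothetical daily pipeline skeleton: extract, transform, and load as
# separate Airflow tasks. All names and step bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw claims extract from S3")  # placeholder step


def transform():
    print("cleanse and conform to the warehouse model")  # placeholder step


def load():
    print("load curated tables into the warehouse")  # placeholder step


with DAG(
    dag_id="commercial_data_pipeline",   # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # linear dependency chain
```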
Posted 5 days ago
5.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business.

Location: Open

Position: Data Engineer (GCP) – Technology

If you are an extraordinary developer who loves to push the boundaries to solve complex business problems using creative solutions, then we wish to talk with you. As an Analytics Technology Engineer, you will work on the Technology team that helps deliver our Data Engineering offerings at large scale to our Fortune clients worldwide. The role is responsible for innovating, building, and maintaining technology services.

Responsibilities:
- Be an integral part of large-scale client business development and delivery engagements
- Develop the software and systems needed for end-to-end execution on large projects
- Work across all phases of the SDLC, and use software engineering principles to build scaled solutions
- Build the knowledge base required to deliver increasingly complex technology projects

Qualifications & Experience:
- A bachelor's degree in Computer Science or a related field with 5 to 10 years of technology experience

Desired Technical Skills:
- Data engineering and analytics on Google Cloud Platform; basic cloud computing concepts
- BigQuery, Google Cloud Storage, Cloud SQL, Pub/Sub, Dataflow, Cloud Composer, GCP Data Transfer, gcloud CLI (a BigQuery sketch follows this listing)
- Python, Google Cloud Python SDK, SQL
- Experience working with any NoSQL/columnar/MPP database
- Experience working with any ETL tool (Informatica/DataStage/Talend/Pentaho etc.)
- Strong knowledge of database concepts, data modeling in RDBMS vs. NoSQL, OLTP vs. OLAP, and MPP architecture

Other Desired Skills:
- Excellent communication and coordination skills
- Problem understanding, articulation, and solutioning
- Quick learner, adaptable to new technologies
- Ability to research and solve technical issues

Responsibilities:
- Developing data pipelines (batch/streaming)
- Developing complex data transformations
- ETL orchestration
- Data migration
- Developing and maintaining data warehouses / data lakes

Good To Have:
- Experience working with Apache Spark / Kafka
- Machine learning concepts
- Google Cloud Professional Data Engineer certification

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us! Not the right fit? Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page, or create an account to set up email alerts as new job postings become available that meet your interest!
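To make the BigQuery-centered stack above concrete, here is a minimal, hypothetical sketch using the google-cloud-bigquery client; the project, dataset, table, and query are placeholders, not details from this posting.

```python
# Hypothetical example: run a parameterized BigQuery query and write the
# result into a destination table. All names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

job_config = bigquery.QueryJobConfig(
    destination="example-project.analytics.daily_orders",  # placeholder
    write_disposition="WRITE_TRUNCATE",  # rebuild the table on each run
    query_parameters=[
        bigquery.ScalarQueryParameter("run_date", "DATE", "2025-01-01"),
    ],
)

query = """
    SELECT order_id, customer_id, amount
    FROM `example-project.raw.orders`
    WHERE order_date = @run_date
"""

job = client.query(query, job_config=job_config)  # starts the job
rows = job.result()  # block until the query finishes
print(f"Query produced {rows.total_rows} rows")
```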
Posted 6 days ago
4.0 - 6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description

Position: Senior Software Engineer / Principal Software Engineer - ETL
Experience Expected: 4-6 years

Job Description:
Designing, developing, and deploying data transformations using the SQL portion of the data warehousing solution. Definition and implementation of database development standards and procedures.

Skills / Competencies:
- Ability to develop and debug complex and advanced SQL queries and stored procedures (must have)
- Hands-on experience with one or more ETL tools like Talend or Informatica (good to have)
- Hands-on experience with any one streaming tool like DMS, Qlik, GoldenGate, IICS, or Openflow
- Hands-on experience using Snowflake and Postgres databases
- Database optimization experience would be an added advantage (good to have)
- Excellent design, coding, testing, and debugging skills
- Experience with Agile methodologies; customer-facing experience will be an added advantage (good to have)
- Automation using Python, Java, or any other tool will be an added advantage (good to have)
Posted 6 days ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role Overview
We are looking for an experienced Solution Architect – AI/ML & Data Engineering to lead the design and delivery of advanced data and AI/ML solutions for our clients. The ideal candidate will have a strong background in end-to-end data architecture, AI lifecycle management, cloud technologies, and emerging Generative AI.

Responsibilities:
- Collaborate with clients to understand business requirements and design robust data solutions.
- Lead the development of end-to-end data pipelines including ingestion, storage, processing, and visualization.
- Architect scalable, secure, and compliant data systems following industry best practices.
- Guide data engineers, analysts, and cross-functional teams to ensure timely delivery of solutions.
- Participate in pre-sales efforts: solution design, proposal creation, and client presentations.
- Act as a technical liaison between clients and internal teams throughout the project lifecycle.
- Stay current with emerging technologies in AI/ML, data platforms, and cloud services.
- Foster long-term client relationships and identify opportunities for business expansion.
- Understand and architect across the full AI lifecycle, from ingestion to inference and operations.
- Provide hands-on guidance for containerization and deployment using Kubernetes.
- Ensure proper implementation of data governance, modeling, and warehousing.

Requirements:
- Bachelor's or master's degree in Computer Science, Data Science, or a related field.
- 10+ years of experience as a Data Solution Architect or in a similar role.
- Deep technical expertise in data architecture, engineering, and AI/ML systems.
- Strong experience with Hadoop-based platforms, ideally Cloudera Data Platform or Data Fabric.
- Proven pre-sales experience: technical presentations, solutioning, and RFP support.
- Proficiency in cloud platforms (Azure preferred; also AWS or GCP) and cloud-native data tools.
- Exposure to Generative AI frameworks and LLMs like OpenAI and Hugging Face.
- Experience in deploying and managing applications on Kubernetes (AKS, EKS, GKE).
- Familiarity with data governance, data modeling, and large-scale data warehousing.
- Excellent problem-solving, communication, and client-facing skills.

Technology Architecture & Engineering:
- Hadoop ecosystem: Cloudera Data Platform, Data Fabric, HDFS, Hive, Spark, HBase, Oozie.
- ETL & integration: Apache NiFi, Talend, Informatica, Azure Data Factory, AWS Glue.
- Warehousing: Azure Synapse, Redshift, BigQuery, Snowflake, Teradata, Vertica.
- Streaming: Apache Kafka, Azure Event Hubs, AWS streaming services.
- Cloud platforms: Azure (preferred), AWS, GCP.
- Data lakes: ADLS, AWS S3, Google Cloud Storage.
- Platforms: Data Fabric, AI Essentials, Unified Analytics, MLDM, MLDE.

AI/ML & GenAI:
- Lifecycle tools: MLflow, Kubeflow, Azure ML, SageMaker, Ray (an MLflow sketch follows this listing).
- Inference: TensorFlow Serving, KServe, Seldon.
- Generative AI: Hugging Face, LangChain, OpenAI API (GPT-4, etc.).

DevOps & Deployment:
- Kubernetes: AKS, EKS, GKE, open-source K8s, Helm.
- CI/CD: Jenkins, GitHub Actions, GitLab CI, Azure DevOps.

(ref:hirist.tech)
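Since the role spans the AI lifecycle from training to inference, a minimal, hypothetical MLflow tracking sketch is shown below; the experiment name, parameters, and metric values are invented for illustration.

```python
# Hypothetical example: track a training run's parameters and metrics
# with MLflow so the lifecycle (train -> compare -> promote) is auditable.
import mlflow

mlflow.set_experiment("churn-model")  # hypothetical experiment name

with mlflow.start_run(run_name="baseline"):
    # Log hyperparameters used for this run
    mlflow.log_param("max_depth", 6)
    mlflow.log_param("learning_rate", 0.1)

    # ... train the model here ...
    validation_auc = 0.87  # placeholder result

    # Log the evaluation metric for comparison across runs
    mlflow.log_metric("val_auc", validation_auc)
```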
Posted 6 days ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Payer Analytics Specialist

Position Summary
The Payer Analytics Specialist is responsible for driving insights and supporting decision-making by analyzing healthcare payer data, creating data pipelines, and managing complex analytics projects. This role involves collaborating with cross-functional teams (Operations, Product, IT, and external partners) to ensure robust data integration, reporting, and advanced analytics capabilities. The ideal candidate will have strong technical skills, payer domain expertise, and the ability to manage 3rd-party data sources effectively.

Key Responsibilities

Data Integration and ETL Pipelines:
- Develop, maintain, and optimize end-to-end data pipelines, including ingestion, transformation, and loading of internal and external data sources.
- Collaborate with IT and Data Engineering teams to design scalable, secure, and high-performing data workflows.
- Implement best practices in data governance, version control, data security, and documentation.

Analytics and Reporting:
- Data analysis: analyze CPT-level data to identify trends, patterns, and insights relevant to healthcare services and payer rates.
- Benchmarking: compare and benchmark rates provided by different health insurance payers within designated zip codes to assess competitive positioning.
- Build and maintain analytical models for cost, quality, and utilization metrics, leveraging tools such as Python, R, or SQL-based BI tools.
- Develop dashboards and reports to communicate findings to stakeholders across the organization.

3rd-Party Data Management:
- Ingest and preprocess 3rd-party data from multiple sources, transforming it into unified structures for analytics and reporting.
- Ensure compliance with transparency requirements and enable downstream analytics.
- Design automated workflows to update and validate data, working closely with external vendors and technical teams.
- Establish best practices for data quality checks (e.g., encounter completeness, claim-level validations) and troubleshooting (a validation sketch follows this listing).

Project Management and Stakeholder Collaboration:
- Manage analytics project lifecycles: requirement gathering, project scoping, resource planning, timeline monitoring, and delivery.
- Partner with key stakeholders (Finance, Operations, Population Health) to define KPIs, data needs, and reporting frameworks.
- Communicate technical concepts and results to non-technical audiences, providing clear insights and recommendations.

Quality Assurance and Compliance:
- Ensure data quality by implementing validation checks, audits, and anomaly detection frameworks.
- Maintain compliance with HIPAA, HITECH, and other relevant healthcare regulations and data privacy requirements.
- Participate in internal and external audits of data processes.

Continuous Improvement and Thought Leadership:
- Stay current with industry trends, analytics tools, and regulatory changes affecting payer analytics.
- Identify opportunities to enhance existing data processes, adopt new technologies, and promote a data-driven culture within the organization.
- Mentor junior analysts and share best practices in data analytics, reporting, and pipeline development.

Required Qualifications

Education & Experience:
- Bachelor's degree in Health Informatics, Data Science, Computer Science, Statistics, or a related field (Master's degree a plus).
- 3-5+ years of experience in healthcare analytics, payer operations, or related fields.
Technical Skills:
- Data integration & ETL: proficiency in building data pipelines using tools like SQL, Python, R, or ETL platforms (e.g., Talend, Airflow, or Data Factory).
- Databases & cloud: experience working with relational databases (SQL Server, PostgreSQL) and cloud environments (AWS, Azure, GCP).
- BI & visualization: familiarity with BI tools (Tableau, Power BI, Looker) for dashboard creation and data storytelling.
- MRF, all-claims, & Definitive Healthcare data: hands-on experience (or strong familiarity) with healthcare transparency data sets, claims data ingestion strategies, and provider/facility-level data from 3rd-party sources like Definitive Healthcare.

Healthcare Domain Expertise:
- Strong understanding of claims data structures (UB-04, CMS-1500), coding systems (ICD, CPT, HCPCS), and payer processes.
- Knowledge of healthcare regulations (HIPAA, HITECH, transparency rules) and how they impact data sharing and management.

Analytical & Problem-Solving Skills:
- Proven ability to synthesize large datasets, pinpoint issues, and recommend data-driven solutions.
- Comfort with statistical analysis and predictive modeling using Python or R.

Soft Skills:
- Excellent communication and presentation skills, with the ability to convey technical concepts to non-technical stakeholders.
- Strong project management and organizational skills, with the ability to handle multiple tasks and meet deadlines.
- Collaborative mindset and willingness to work cross-functionally to achieve shared objectives.

Preferred/Additional Qualifications:
- Advanced degree (MBA, MPH, MS in Analytics, or similar).
- Experience with healthcare cost transparency regulations and handling MRF data specifically for compliance.
- Familiarity with DataOps or DevOps practices to automate and streamline data pipelines.
- Certification in BI or data engineering (e.g., Microsoft Certified: Azure Data Engineer, AWS Data Analytics Specialty).
- Experience establishing data stewardship programs and leading data governance initiatives.

Why Join Us
- Impactful work: play a key role in leveraging payer data to reduce costs, improve quality, and shape population health strategies.
- Innovation: collaborate on advanced analytics projects using state-of-the-art tools and platforms.
- Growth opportunity: be part of an expanding analytics team where you can lead initiatives, mentor others, and deepen your healthcare data expertise.
- Supportive culture: work in an environment that values open communication, knowledge sharing, and continuous learning.

(ref:hirist.tech)
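As a hedged illustration of the claim-level validation checks described above, here is a small pandas sketch; the file, column names, and rules are hypothetical and would need to match the actual claims layout.

```python
# Hypothetical claim-level validation: flag missing CPT codes, duplicate
# claim lines, and negative billed amounts. Column names are placeholders.
import pandas as pd

claims = pd.read_csv("claims_extract.csv")  # hypothetical input file

issues = {
    # Every claim line should carry a CPT procedure code
    "missing_cpt": claims["cpt_code"].isna(),
    # Claim ID plus line number should be unique
    "duplicate_claim": claims.duplicated(subset=["claim_id", "line_no"]),
    # Billed amounts should never be negative
    "negative_amount": claims["billed_amount"] < 0,
}

report = pd.DataFrame({name: mask.sum() for name, mask in issues.items()},
                      index=["violations"]).T
print(report)

# Keep only rows that pass every rule for downstream benchmarking
clean = claims[~pd.concat(issues, axis=1).any(axis=1)]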
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
As a System Analyst – Data Warehouse at our company, you will be responsible for collaborating with stakeholders to understand business requirements and translate them into data warehouse design specifications. Your role will involve developing and maintaining data warehouse architecture, including data models, ETL processes, and data integration strategies.

You will create, optimize, and manage ETL processes to extract, transform, and load data from various source systems into the data warehouse. Ensuring data quality and accuracy during the ETL process by implementing data cleansing and validation procedures will be a key part of your responsibilities.

Designing and maintaining data models, schemas, and hierarchies to support efficient data retrieval and reporting will be crucial. You will implement best practices for data modeling, including star schemas, snowflake schemas, and dimension tables. Integrating data from multiple sources, both structured and unstructured, into the data warehouse will be part of your daily tasks. You will work with API endpoints, databases, and flat files to collect and process data efficiently.

Monitoring and optimizing the performance of the data warehouse, identifying and resolving bottlenecks and performance issues, will be essential. You will implement indexing, partitioning, and caching strategies for improved query performance. Enforcing data governance policies and security measures to protect sensitive data within the data warehouse will be a priority, and you will ensure compliance with data privacy regulations such as GDPR or HIPAA.

Collaborating with business intelligence teams to provide support for reporting and analytics initiatives will also be part of your role. You will assist in the creation of data marts and dashboards for end-users. Maintaining comprehensive documentation of data warehouse processes, data models, and ETL workflows will be crucial. Additionally, you will train and mentor junior data analysts and team members.

To qualify for this role, you should have a Bachelor's degree in computer science, information technology, or a related field. A minimum of 3 years of experience as a Data Warehouse Systems Analyst is required. Strong expertise in data warehousing concepts, methodologies, and tools, as well as proficiency in SQL, ETL tools (e.g., Informatica, Talend, SSIS), and data modeling techniques, are essential. Knowledge of data governance, data security, and compliance best practices is necessary. Excellent problem-solving and analytical skills, along with strong communication and interpersonal skills for effective collaboration with cross-functional teams, will be beneficial in this role. Immediate joiners are preferred.

If you meet the qualifications and are looking to join a dynamic team in Mumbai, we encourage you to apply.
Posted 6 days ago
4.0 - 7.0 years
15 - 25 Lacs
Noida
Work from Office
Role & responsibilities:
- Collaborate with customers' business and IT teams to understand integration requirements in the B2B/Cloud/API/Data/ETL/EAI integration space and implement solutions using the Adeptia platform.
- Design, develop, and configure complex integration solutions, ensuring scalability, performance, and maintainability.
- Take ownership of assigned modules and lead the implementation lifecycle from requirement gathering to production deployment.
- Troubleshoot issues during implementation and deployment, ensuring smooth system performance.
- Guide team members in addressing complex integration challenges; promote best practices, including performance best practices.
- Collaborate with offshore/onshore and internal teams to ensure timely execution and coordination of project deliverables.
- Write efficient, well-documented, and maintainable code, adhering to established coding standards.
- Review code and designs of team members, providing constructive feedback to improve quality.
- Participate in Agile processes, including sprint planning, daily standups, and retrospectives, ensuring effective task management and delivery.
- Stay updated with emerging technologies to continuously enhance technical expertise and team skills.

Preferred candidate profile:
- 5-7 years of hands-on experience in designing and implementing integration solutions across B2B, ETL, EAI, Cloud, API & data integration environments using leading platforms such as Adeptia, Talend, MuleSoft, or equivalent enterprise-grade tools.
- Proficiency in designing and implementing integration solutions, including integration processes, data pipelines, and data mappings, to facilitate the movement of data between applications and platforms.
- Proficiency in applying data transformation and data cleansing as needed to ensure data quality and consistency across different data sources and destinations.
- Good experience in performing thorough testing and validation of data integration processes to ensure accuracy, reliability, and data integrity.
- Proficiency in working with SOA, RESTful APIs, and SOAP web services with all security policies.
- Good understanding and implementation experience with security concepts, best practices, and standards/protocols such as OAuth, SSL/TLS, SSO, SAML, and IdP (Identity Provider) integration (a sketch follows this listing).
- Strong understanding of XML, XSD, XSLT, and JSON.
- Good understanding of RDBMS/NoSQL technologies (MSSQL, Oracle, MySQL).
- Proficiency with transport protocols (HTTPS, SFTP, JDBC) and experience with messaging systems such as Kafka, ASB (Azure Service Bus), or RabbitMQ.
- Hands-on experience in Core Java and exposure to commonly used Java frameworks.
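A hedged sketch of the REST-plus-OAuth pattern referenced in the profile above: obtain a token via the client-credentials flow, then call a protected API. The endpoints, credentials, and payload are hypothetical placeholders.

```python
# Hypothetical OAuth 2.0 client-credentials flow followed by a REST call.
# All URLs, credentials, and fields are placeholders.
import requests

TOKEN_URL = "https://idp.example.com/oauth2/token"   # hypothetical IdP
API_URL = "https://api.example.com/v1/orders"        # hypothetical API

# Step 1: exchange client credentials for an access token
token_resp = requests.post(
    TOKEN_URL,
    data={"grant_type": "client_credentials"},
    auth=("my-client-id", "my-client-secret"),
    timeout=30,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# Step 2: call the protected endpoint with the bearer token
api_resp = requests.get(
    API_URL,
    headers={"Authorization": f"Bearer {access_token}",
             "Accept": "application/json"},
    timeout=30,
)
api_resp.raise_for_status()
print(api_resp.json())
```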
Posted 6 days ago
8.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
LSEG (London Stock Exchange Group) is more than a diversified global financial markets infrastructure and data business. We are open-access partners with a dedication to excellence in delivering the services our customers expect from us. With extensive experience, deep knowledge, and worldwide presence across financial markets, we enable businesses and economies around the world to fund innovation, manage risk, and create jobs. It's how we've contributed to supporting the financial stability and growth of communities and economies globally for more than 300 years.

Through a comprehensive suite of trusted financial market infrastructure services – and our open-access model – we provide the flexibility, stability and trust that enable our customers to pursue their ambitions with confidence and clarity.

People are at the heart of what we do and drive the success of our business. Our culture of connecting, creating opportunity and delivering excellence shapes how we think, how we do things and how we help our people fulfil their potential. We embrace diversity and actively seek to attract individuals with unique backgrounds and perspectives. We break down barriers and encourage teamwork, enabling innovation and rapid development of solutions that make a difference. Our workplace generates an enriching and rewarding experience for our people and customers alike. Our vision is to build an inclusive culture in which everyone feels encouraged to fulfil their potential. We know that real personal growth cannot be achieved by simply climbing a career ladder – which is why we encourage and enable a wealth of avenues and interesting opportunities for everyone to broaden and deepen their skills and expertise. As a global organisation spanning 70 countries and one rooted in a culture of growth, opportunity, diversity and innovation, LSEG is a place where everyone can grow, develop and fulfil their potential with meaningful careers.

Data Platforms is an exciting collective of teams spanning product and engineering. LSEG Data Platforms provides a range of tools and solutions for managing, accessing, and distributing financial data, offering various platforms including the LSEG Data Platform, LSEG DataScope Select, and LSEG DataScope Warehouse. These platforms enable users to access both LSEG's own data and third-party data, with options for real-time, delayed, and historical data delivery.

Key Responsibilities:
- Design, develop, and maintain automated test scripts for ETL pipelines and data warehouse solutions.
- Develop comprehensive test strategies and plans for data validation, data quality, end-to-end testing, and performance testing.
- Build and implement effective testing strategies aligned with business and technical requirements.
- Plan, execute, and analyze performance testing to ensure scalability and reliability of data solutions.
- Collaborate with data engineers, product managers, and stakeholders to understand data requirements and business rules.
- Perform in-depth data testing, including source-to-target mapping, data transformation, and data integrity checks (a reconciliation sketch follows this listing).
- Implement and maintain test automation frameworks using industry-standard tools (e.g., Python, SQL, Selenium, Informatica, Talend, etc.).
- Analyze test results, identify data anomalies, and work with development teams to resolve issues.
- Ensure compliance with financial industry standards and regulatory requirements.
- Report on test progress, quality metrics, and team performance to management.
Required Skills & Experience:
- 8-10 years of experience in test automation, with a focus on ETL and data warehouse testing.
- Strong expertise in building testing strategies, planning performance testing, and executing end-to-end testing.
- Strong expertise in SQL, data profiling, and data validation techniques.
- Hands-on experience with automation tools and frameworks for ETL/data testing.
- In-depth understanding of data warehouse concepts, data modeling, and data integration.
- Exposure to financial industry concepts, products, and regulatory requirements.
- Proven experience managing and mentoring test teams.
- Excellent analytical, problem-solving, and communication skills.

Preferred Qualifications:
- Knowledge of scripting languages (Python, Shell, etc.).
- ISTQB or equivalent testing certification.

LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership, Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions.

Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. However, we will value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity.

LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer. This means that we do not discriminate on the basis of anyone's race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. Conforming with applicable law, we can reasonably accommodate applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs.

Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) (we) may hold about you, what it's used for, and how it's obtained, your rights and how to contact us as a data subject. If you are submitting as a Recruitment Agency Partner, it is essential and your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice.
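One way to picture the source-to-target checks listed above is a small pytest-style reconciliation test. This sketch assumes hypothetical query helpers wired to the source system and the warehouse; all table names are placeholders.

```python
# Hypothetical ETL reconciliation test: compare row counts and a column
# total between a source system and the warehouse target. The helpers
# below are assumed stubs returning pandas DataFrames.
import pandas as pd


def run_source_query(sql: str) -> pd.DataFrame:
    raise NotImplementedError("wire this to the source database")


def run_target_query(sql: str) -> pd.DataFrame:
    raise NotImplementedError("wire this to the warehouse")


def test_trades_row_count_matches():
    src = run_source_query("SELECT COUNT(*) AS n FROM trades")
    tgt = run_target_query("SELECT COUNT(*) AS n FROM dw.fact_trades")
    assert src["n"].iloc[0] == tgt["n"].iloc[0]


def test_trades_notional_total_matches():
    src = run_source_query("SELECT SUM(notional) AS total FROM trades")
    tgt = run_target_query("SELECT SUM(notional) AS total FROM dw.fact_trades")
    # Allow a tiny tolerance for decimal/float conversions during ETL
    assert abs(src["total"].iloc[0] - tgt["total"].iloc[0]) < 0.01
```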
Posted 6 days ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Persistent
We are an AI-led, platform-driven Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative global companies, 60% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem.

Our disruptor's mindset, commitment to client success, and agility to thrive in the dynamic environment have enabled us to sustain our growth momentum by reporting $1,409.1M revenue in FY25, delivering 18.8% Y-o-Y growth. Our 23,900+ global team members, located in 19 countries, have been instrumental in helping the market leaders transform their industries. We are also pleased to share that Persistent won in four categories at the prestigious 2024 ISG Star of Excellence™ Awards, including the Overall Award based on the voice of the customer. We were included in the Dow Jones Sustainability World Index, setting high standards in sustainability and corporate responsibility. We were awarded for our state-of-the-art learning and development initiatives at the 16th TISS LeapVault CLO Awards. In addition, we were cited as the fastest-growing IT services brand in the 2024 Brand Finance India 100 Report. Throughout our market-leading growth, we've maintained a strong employee satisfaction score of 8.2/10.

At Persistent, we embrace diversity to unlock everyone's potential. Our programs empower our workforce by harnessing varied backgrounds for creative, innovative problem-solving. Our inclusive environment fosters belonging, encouraging employees to unleash their full potential. For more details, please visit www.persistent.com.

About The Position
We are looking for a Big Data Lead who will be responsible for the management of data sets that are too big for traditional database systems to handle. You will create, design, and implement data processing jobs in order to transform the data into a more usable format. You will also ensure that the data is secure and complies with industry standards to protect the company's information.
What You'll Do
- Manage customers' priorities of projects and requests
- Assess customer needs utilizing a structured requirements process (gathering, analyzing, documenting, and managing changes) to prioritize immediate business needs and advise on options, risks, and cost
- Design and implement software products (Big Data related), including data models and visualizations
- Demonstrate participation with the teams you work in
- Deliver good solutions against tight timescales
- Be proactive, suggest new approaches, and develop your capabilities
- Share what you are good at while learning from others to improve the team overall
- Show a solid level of understanding across a range of technical skills, attitudes, and behaviors
- Deliver great solutions
- Be focused on driving value back into the business

Expertise You'll Bring
- 6 years' experience in designing and developing enterprise application solutions for distributed systems
- Understanding of Big Data Hadoop ecosystem components (Sqoop, Hive, Pig, Flume)
- Additional experience working with Hadoop, HDFS, cluster management, Hive, Pig and MapReduce, and Hadoop ecosystem frameworks: HBase, Talend, NoSQL databases
- Apache Spark or other streaming Big Data processing, preferred (a PySpark sketch follows this listing)
- Java or Big Data technologies, will be a plus

Benefits
- Competitive salary and benefits package
- Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds.

Inclusive Environment
We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.

Our company fosters a values-driven and people-centric work environment that enables our employees to:
- Accelerate growth, both professionally and personally
- Impact the world in powerful, positive ways, using the latest technologies
- Enjoy collaborative innovation, with diversity and work-life wellbeing at the core
- Unlock global opportunities to work and learn with the industry's best

Let's unleash your full potential at Persistent - persistent.com/careers
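As a rough illustration of the Spark-based processing this role involves, here is a minimal, hypothetical PySpark batch job; the paths, columns, and aggregation are placeholders invented for the example.

```python
# Hypothetical PySpark batch job: read raw JSON events, aggregate them,
# and write the result as partitioned Parquet. Paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_event_rollup").getOrCreate()

# Read a day's worth of raw JSON events from HDFS (hypothetical path)
events = spark.read.json("hdfs:///data/raw/events/2025-01-01/")

# Aggregate events per user and event type
rollup = (
    events
    .groupBy("user_id", "event_type")
    .agg(F.count("*").alias("event_count"),
         F.max("event_ts").alias("last_seen"))
)

# Write the rollup partitioned by event type for downstream consumers
rollup.write.mode("overwrite").partitionBy("event_type").parquet(
    "hdfs:///data/curated/event_rollup/2025-01-01/")

spark.stop()
```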
Posted 6 days ago
4.0 years
3 - 6 Lacs
Hyderābād
On-site
CDP ETL & Database Engineer

The CDP ETL & Database Engineer will specialize in architecting, designing, and implementing solutions that are sustainable and scalable. The ideal candidate will understand CRM methodologies, have an analytical mindset, and bring a background in relational modeling in a hybrid architecture. The candidate will help drive the business towards specific technical initiatives and will work closely with the Solutions Management, Delivery, and Product Engineering teams. The candidate will join a team of developers across the US, India & Costa Rica.

Responsibilities:
- ETL Development – The CDP ETL & Database Engineer will be responsible for building pipelines to feed downstream data systems. They will be able to analyze data, interpret business requirements, and establish relationships between data sets. The ideal candidate will be familiar with different encoding formats and file layouts such as JSON and XML (a flattening sketch follows this listing).
- Implementations & Onboarding – Will work with the team to onboard new clients onto the ZMP/CDP+ platform. The candidate will solidify business requirements, perform ETL file validation, establish users, perform complex aggregations, and syndicate data across platforms. The hands-on engineer will take a test-driven approach towards development and will be able to document processes and workflows.
- Incremental Change Requests – The CDP ETL & Database Engineer will be responsible for analyzing change requests and determining the best approach towards implementation and execution of each request. This requires the engineer to have a deep understanding of the platform's overall architecture. Change requests will be implemented and tested in a development environment to ensure their introduction will not negatively impact downstream processes.
- Change Data Management – The candidate will adhere to change data management procedures and actively participate in CAB meetings where change requests will be presented and reviewed. Prior to introducing change, the engineer will ensure that processes are running in a development environment. The engineer will be asked to do peer-to-peer code reviews and solution reviews before production code deployment.
- Collaboration & Process Improvement – The engineer will be asked to participate in knowledge-share sessions where they will engage with peers to discuss solutions, best practices, and overall approach. The candidate will be able to look for opportunities to streamline processes with an eye towards building a repeatable model to reduce implementation duration.

Job Requirements:
The CDP ETL & Database Engineer will be well versed in the following areas:
- Relational data modeling
- ETL and FTP concepts
- Advanced analytics using SQL functions
- Cloud technologies – AWS, Snowflake
- Able to decipher requirements, provide recommendations, and implement solutions within predefined timelines
- The ability to work independently, but at the same time contribute in a team setting
- Able to confidently communicate status, raise exceptions, and voice concerns to their direct manager
- Participate in internal client project status meetings with Solution/Delivery management; when required, collaborate with the Business Solutions Analyst (BSA) to solidify requirements
- Ability to work in a fast-paced, agile environment; the individual will be able to work with a sense of urgency when escalated issues arise
- Strong communication and interpersonal skills; ability to multitask and prioritize workload based on client demand
- Familiarity with Jira for workflow and time allocation
- Familiarity with the Scrum framework: backlog, planning, sprints, story points, retrospectives

Required Skills:
- ETL – ETL tools such as Talend (preferred, not required); DMExpress (nice to have); Informatica (nice to have)
- Database – hands-on experience with the following database technologies: Snowflake (required); MySQL/PostgreSQL (nice to have); familiarity with NoSQL DB methodologies (nice to have)
- Programming languages – can demonstrate knowledge of any of the following: PL/SQL; JavaScript (strong plus); Python (strong plus); Scala (nice to have)
- AWS – knowledge of the following AWS services: S3, EMR (concepts), EC2 (concepts), Systems Manager / Parameter Store
- Understands JSON data structures and key-value pairs
- Working knowledge of code repositories such as Git and WinCVS; workflow management tools such as Apache Airflow, Kafka, Automic/Appworx; Jira

Minimum Qualifications:
- Bachelor's degree or equivalent
- 4+ years' experience
- Excellent verbal & written communication skills
- Self-starter, highly motivated
- Analytical mindset

Company Summary:
Zeta Global is a NYSE-listed data-powered marketing technology company with a heritage of innovation and industry leadership. Founded in 2007 by entrepreneur David A. Steinberg and John Sculley, former CEO of Apple Inc and Pepsi-Cola, the Company combines the industry's 3rd largest proprietary data set (2.4B+ identities) with Artificial Intelligence to unlock consumer intent, personalize experiences and help our clients drive business growth. Our technology runs on the Zeta Marketing Platform, which powers 'end to end' marketing programs for some of the world's leading brands. With expertise encompassing all digital marketing channels – Email, Display, Social, Search and Mobile – Zeta orchestrates acquisition and engagement programs that deliver results that are scalable, repeatable and sustainable.

Zeta Global is an Equal Opportunity/Affirmative Action employer and does not discriminate on the basis of race, gender, ancestry, color, religion, sex, age, marital status, sexual orientation, gender identity, national origin, medical condition, disability, veterans status, or any other basis protected by law.

Zeta Global Recognized in Enterprise Marketing Software and Cross-Channel Campaign Management Reports by Independent Research Firm
https://www.forbes.com/sites/shelleykohan/2024/06/1G/amazon-partners-with-zeta-global-to-deliver- gen-ai-marketing-automation/
https://www.cnbc.com/video/2024/05/06/zeta-global-ceo-david-steinberg-talks-ai-in-focus-at-milken- conference.html
https://www.businesswire.com/news/home/20240G04622808/en/Zeta-Increases-3Q%E2%80%GG24- Guidance
https://www.prnewswire.com/news-releases/zeta-global-opens-ai-data-labs-in-san-francisco-and-nyc- 300S45353.html
https://www.prnewswire.com/news-releases/zeta-global-recognized-in-enterprise-marketing-software-and- cross-channel-campaign-management-reports-by-independent-research-firm-300S38241.html
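As a hedged illustration of the JSON file-layout work named in the ETL Development responsibility above, here is a small Python sketch that flattens a nested record into key-value pairs; the record shape is invented for the example.

```python
# Hypothetical example: flatten a nested JSON customer record into flat
# dotted key/value pairs suitable for an ETL load. Input shape is invented.
import json

record = json.loads("""
{
  "customer_id": "C123",
  "attributes": {"city": "Austin", "segment": "premium"},
  "events": [{"type": "open", "ts": "2025-01-01T10:00:00Z"}]
}
""")


def flatten(obj, prefix=""):
    """Recursively flatten dicts/lists into dotted key-value pairs."""
    rows = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            rows.update(flatten(value, f"{prefix}{key}."))
    elif isinstance(obj, list):
        for i, value in enumerate(obj):
            rows.update(flatten(value, f"{prefix}{i}."))
    else:
        rows[prefix.rstrip(".")] = obj
    return rows


print(flatten(record))
# {'customer_id': 'C123', 'attributes.city': 'Austin',
#  'attributes.segment': 'premium', 'events.0.type': 'open',
#  'events.0.ts': '2025-01-01T10:00:00Z'}
```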
Posted 6 days ago
4.0 - 7.0 years
6 - 7 Lacs
Chennai
On-site
Designation: Senior Analyst
Level: L2
Experience: 4 to 7 years
Location: Chennai

Job Description:
We are seeking a highly skilled and motivated Senior Data Quality Analyst (DQA) who is responsible for ensuring the accuracy, completeness, and reliability of an organization's data, enabling informed decision-making. The ideal candidate works with various business stakeholders to understand business requirements and define data quality standards, developing and enforcing data validation procedures to ensure compliance with the company's data standards.

Responsibilities:

Data Quality Monitoring & Validation (40% of time):
- Profile data: identify anomalies (missing values, duplicates, outliers)
- Run data quality checks: validate against business rules (a minimal sketch follows this listing)
- Automate checks: schedule scripts (SQL/Python) to flag issues in real time

Issue Resolution & Root Cause Analysis (30% of time):
- Triage errors: work with IT/data engineers to fix corrupt data
- Track defects: log issues in Jira/Snowflake and prioritize fixes
- Root cause analysis: determine if issues stem from ETL bugs, user input, or system failures

Governance & Documentation (20% of time):
- Ensure compliance with data governance frameworks
- Metadata management: document data lineage
- Compliance audits: ensure adherence to GDPR, HIPAA, or internal policies
- Implement data quality standards and policies

Stakeholder Collaboration (10% of time):
- Train teams: educate data citizens, data owners, and data stewards on data quality best practices
- Monitor and report on data quality metrics, including reports to leadership

Skills:

Technical Skills:
- Knowledge of data quality tools and data profiling techniques (e.g., Talend, Informatica, Ataccama, DQOps, open-source tools)
- Familiarity with database management systems and data governance initiatives
- Proficiency in SQL and data management principles
- Experience with data integration and ETL tools
- Understanding of data visualization tools and techniques
- Knowledge of data governance and metadata management
- Familiarity with Python/R for automation and scripting

Analytical Skills:
- Strong analytical and problem-solving skills
- Ability to identify data patterns and trends
- Understanding of statistical analysis and data quality metrics
- Experience with data cleansing and data validation techniques, including data remediation
- Ability to assess data quality and identify areas needing improvement
- Experience conducting data audits and implementing data quality processes
- Ability to document data quality rules and procedures

Job Snapshot
Updated Date: 25-07-2025
Job ID: J_3911
Location: Chennai, Tamil Nadu, India
Experience: 4 - 7 Years
Employee Type: Permanent
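As referenced in the validation bullet above, here is a hedged sketch of rule-based data quality checks in Python; the dataset, rules, and alert threshold are hypothetical.

```python
# Hypothetical rule-driven data quality run: each rule returns the count
# of violating rows, and any count above the threshold raises an alert.
import pandas as pd

df = pd.read_csv("customers.csv")  # hypothetical dataset

rules = [
    ("email_not_null",  lambda d: d["email"].isna().sum()),
    ("unique_customer", lambda d: d.duplicated("customer_id").sum()),
    ("age_in_range",    lambda d: (~d["age"].between(0, 120)).sum()),
]

ALERT_THRESHOLD = 0  # any violation triggers an alert in this sketch

for name, check in rules:
    violations = int(check(df))
    status = "ALERT" if violations > ALERT_THRESHOLD else "ok"
    print(f"{name:20s} violations={violations:5d} [{status}]")
```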
Posted 6 days ago
5.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
- 5 to 7 years of experience as a Data Engineer.
- Talend: designing and developing technical architecture and data pipelines, using Talend data integration tools for performance scaling and to ensure data quality in a big data environment.
- Very strong on PL/SQL: queries, procedures, JOINs.
- Snowflake SQL: writing SQL queries against Snowflake and developing scripts (Unix, Python, etc.) to extract, load, and transform data.
- Good to have Talend knowledge and hands-on experience; candidates who have worked in PROD support would be preferred.
- Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, the query optimizer, Metadata Manager, data sharing, and stored procedures (a Streams/Tasks sketch follows this listing).
- Perform data analysis, troubleshoot data issues, and provide technical support to end-users.
- Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
- Complex problem-solving capability and a continuous-improvement approach.
- Talend / Snowflake certification is desirable.
- Excellent SQL coding skills.
- Excellent communication and documentation skills.
- Familiar with the Agile delivery process.
- Must be analytical, creative, and self-motivated.
- Work effectively within a global team environment.
- Good to have: production support experience.
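A hedged sketch of the Snowflake Streams and Tasks pattern named above: a stream captures changes on a staging table, and a scheduled task drains them into a target table. All object names, credentials, and the schedule are hypothetical placeholders.

```python
# Hypothetical change-capture setup: a stream tracks changes on a staging
# table, and a task periodically moves them to the curated table.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="DW", schema="STAGING",
)
cur = conn.cursor()
try:
    # Stream records row-level changes on STAGING.ORDERS_RAW
    cur.execute("""
        CREATE OR REPLACE STREAM orders_stream ON TABLE STAGING.ORDERS_RAW
    """)
    # Task wakes every 5 minutes and drains the stream when it has data
    cur.execute("""
        CREATE OR REPLACE TASK drain_orders
          WAREHOUSE = ETL_WH
          SCHEDULE = '5 MINUTE'
        WHEN SYSTEM$STREAM_HAS_DATA('orders_stream')
        AS
          INSERT INTO CURATED.ORDERS
          SELECT * EXCLUDE (METADATA$ACTION, METADATA$ISUPDATE, METADATA$ROW_ID)
          FROM orders_stream
    """)
    # Tasks are created suspended; resume to start the schedule
    cur.execute("ALTER TASK drain_orders RESUME")
finally:
    cur.close()
    conn.close()
```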
Posted 6 days ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Purpose of the Position:
As a Talend Data Engineer at Infocepts, you will be responsible for designing, implementing, and managing scalable data pipelines.

Location: Pune/Nagpur/Bangalore/Chennai

Key Result Areas and Activities:
- ETL/ELT solutions development: design, develop, and deploy ETL/ELT solutions both on-premise and in the cloud.
- Data transformation: perform data transformation using stored procedures to ensure data is accurately processed and formatted.
- Report development: develop comprehensive reports using tools like MicroStrategy and Power BI to support business decision-making.
- Documentation and data management: create and maintain detailed documentation for data pipelines, configurations, and processes, ensuring data quality and integrity through effective management practices.
- Performance monitoring and issue resolution: monitor and optimize the performance of data pipelines, and troubleshoot and resolve any data-related issues promptly.

Essential Skills:
- Good experience with the Talend suite (Talend Data Integration, Talend Cloud)
- Good experience with Snowflake and stored procedures
- Experience with ETL/ELT processes, data warehousing, and data modelling
- Experience with data quality frameworks, monitoring tools, and job scheduling
- Knowledge of data formats like JSON, XML, CSV, and Parquet (a format-conversion sketch follows this listing)
- Agile methodology and tools like JIRA

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 4+ years of experience in Talend

Qualities:
- Can influence and implement change; demonstrates confidence, strength of conviction, and sound decisions
- Believes in dealing with a problem head-on; approaches it in a logical and systematic manner; is persistent and patient; can independently tackle the problem; is not over-critical of the factors that led to a problem and is practical about it; follows up with developers on related issues
- Able to consult, write, and present persuasively
- Able to work in a self-organized and cross-functional team
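As a small, hedged illustration of the data-format handling listed above, this pandas sketch converts CSV and JSON inputs to Parquet; all paths are placeholders, and to_parquet assumes pyarrow (or fastparquet) is installed.

```python
# Hypothetical format-conversion step: read CSV and JSON-lines extracts,
# then persist both as Parquet for efficient downstream reads.
import pandas as pd

orders = pd.read_csv("landing/orders.csv")            # flat CSV extract
customers = pd.read_json("landing/customers.json",    # JSON-lines extract
                         lines=True)

# Parquet preserves column types and compresses well versus CSV/JSON
orders.to_parquet("curated/orders.parquet", index=False)
customers.to_parquet("curated/customers.parquet", index=False)

print(len(orders), "orders and", len(customers), "customers converted")
```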
Posted 6 days ago
8.0 - 12.0 years
0 Lacs
Andhra Pradesh
On-site
ABOUT EVERNORTH: Evernorth℠ exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable and simple health care, we solve the problems others don't, won't or can't. Our innovation hub in India will allow us to work with the right talent, expand our global footprint, improve our competitive stance, and better deliver on our promises to stakeholders. We are passionate about making healthcare better by delivering world-class solutions that make a real difference. We are always looking upward. And that starts with finding the right talent to help us get there.
Position Overview: Excited to grow your career? This position's primary responsibility will be to translate software requirements into working functions using Mainframe, ETL, and data engineering technologies, with expertise in Databricks and database platforms. The role offers the opportunity to work on modernizing legacy systems, contribute to cloud infrastructure automation, and support production systems in a fast-paced, agile environment. You will work across multiple teams and technologies to ensure reliable, high-performance data solutions that align with business goals. As a Mainframe & ETL Engineer, you will be responsible for the end-to-end development and support of data processing solutions using tools such as Talend, Ab Initio, AWS Glue, and PySpark, with significant work on Databricks and modern cloud data platforms. You will support infrastructure provisioning using Terraform, assist in modernizing legacy systems including mainframe migration, and contribute to performance tuning of complex SQL queries across multiple database platforms including Teradata, Oracle, Postgres, and DB2. You will also be involved in CI/CD practices.
Responsibilities:
Support, maintain, and participate in the development of software utilizing technologies such as COBOL, DB2, CICS, and JCL.
Support, maintain, and participate in ETL development utilizing technologies such as Talend, Ab Initio, Python, and PySpark on Databricks.
Work with Databricks to design and manage scalable data processing solutions.
Implement and support data integration workflows across cloud (AWS) and on-premises environments.
Support cloud infrastructure deployment and management using Terraform.
Participate in the modernization of legacy systems, including mainframe migration.
Perform complex SQL queries and performance tuning on large datasets.
Contribute to CI/CD pipelines, version control, and infrastructure automation.
Provide expertise, tools, and assistance to operations, development, and support teams for critical production issues and maintenance.
Troubleshoot production issues, diagnose the problem, and implement a solution; serve as the first line of defense in finding the root cause.
Work cross-functionally with the support, development, and business teams to efficiently address customer issues.
Be an active member of a high-performance software development and support team in an agile environment.
Engage in fostering and improving organizational culture.
Qualifications - Required Skills:
Strong analytical and technical skills.
Proficiency in Databricks, including notebook development, Delta Lake, and Spark-based processing.
Experience with mainframe modernization or migrating legacy systems to modern data platforms.
Strong programming skills, particularly in PySpark for data processing.
Familiarity with data warehousing concepts and cloud-native architecture.
Solid understanding of Terraform for managing infrastructure as code on AWS.
Familiarity with CI/CD practices and tools (e.g., Git, Jenkins).
Strong SQL knowledge of OLAP platforms (Teradata, Snowflake) and OLTP platforms (Oracle, DB2, Postgres, SingleStore).
Strong experience with Teradata, Oracle, Postgres, and DB2 SQL and their utilities.
Ability to develop high-quality database solutions.
Ability to perform extensive analysis of complex SQL processes, with strong design skills.
Ability to analyze existing SQL queries for performance improvements.
Experience across software development phases including design, configuration, testing, debugging, implementation, and support of large-scale, business-centric, process-based applications.
Proven experience working with diverse teams of technical architects, business users, and IT areas across all phases of the software development life cycle.
Exceptional analytical and problem-solving skills.
Structured, methodical approach to systems development and troubleshooting.
Ability to ramp up fast on a system architecture.
Experience in designing and developing process-based solutions or BPM (business process management).
Strong written and verbal communication skills with the ability to interact with all levels of the organization.
Strong interpersonal and relationship management skills.
Strong time and project management skills.
Familiarity with agile methodology, including SCRUM team leadership.
Familiarity with modern delivery practices such as continuous integration, behavior/test-driven development, and specification by example.
Desire to work in the application support space, with a passion for learning and exploring all areas of IT.
Required Experience & Education:
Minimum of 8-12 years of experience in an application development role.
Bachelor's degree equivalent in Information Technology, Business Information Systems, Technology Management, or a related field of study.
Location & Hours of Work: Hyderabad, Hybrid (1:00 PM IST to 10:00 PM IST)
Equal Opportunity Statement: Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.
About Evernorth Health Services: Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
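For the Databricks/PySpark side of the role, here is a minimal sketch of a read-transform-write pipeline landing a Delta table; the paths and column names are placeholders, not Evernorth's actual pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_etl").getOrCreate()

# Read a raw CSV landing zone (placeholder path), clean it, and write Delta
raw = spark.read.option("header", True).csv("/mnt/raw/claims.csv")
curated = (
    raw.filter(F.col("claim_amount").isNotNull())
       .withColumn("claim_amount", F.col("claim_amount").cast("double"))
)
# On Databricks the delta format is available out of the box
curated.write.format("delta").mode("overwrite").save("/mnt/curated/claims")
```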
Posted 6 days ago
3.0 - 7.0 years
0 - 2 Lacs
Chennai, Coimbatore, Bengaluru
Work from Office
Required Skill Set
Talend:
Hands-on experience with Talend Studio and Talend Management Console (TMC)
Strong understanding of Joblets, PreJobs, PostJobs, SubJobs, and the overall Talend job design flow
Proficiency in Talend components such as S3, Redshift, tDBInput, tMap, and Java-based components
PySpark:
Solid knowledge of PySpark
Ability to analyze, compare, and validate migrated PySpark code against Talend job definitions to ensure an accurate migration (a validation sketch follows this posting)
Additional Skills:
AWS ecosystem: S3, Glue, CloudWatch, SSM, IAM, etc.
Databases: Redshift, Aurora, Teradata
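The PySpark requirement above centers on validating migrated code against the original Talend jobs. A minimal reconciliation sketch follows, under the assumption that both jobs write Parquet outputs; the S3 paths are placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("migration_check").getOrCreate()

talend_out = spark.read.parquet("s3://bucket/talend_output/")    # placeholder path
pyspark_out = spark.read.parquet("s3://bucket/pyspark_output/")  # placeholder path

print("talend rows:", talend_out.count(), "pyspark rows:", pyspark_out.count())

# exceptAll keeps duplicates, so it also catches cardinality drift
only_in_talend = talend_out.exceptAll(pyspark_out)
only_in_pyspark = pyspark_out.exceptAll(talend_out)
assert only_in_talend.count() == 0 and only_in_pyspark.count() == 0, "outputs diverge"
```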
Posted 6 days ago
6.0 - 11.0 years
22 - 27 Lacs
Pune, Bengaluru
Work from Office
Build ETL jobs using Fivetran and dbt for our internal projects and for customers on platforms such as Azure, Salesforce, and AWS. Build out data lineage artifacts to ensure all current and future systems are properly documented.
Required candidate profile: Strong proficiency in SQL query development; experience developing ETL routines that manipulate and transfer large volumes of data and perform quality checks; experience in the healthcare industry working with PHI/PII.
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary: We are seeking experienced ETL and MDM Testers with a strong background in GxP validation to join our QA team. The ideal candidates should have hands-on experience in testing data pipelines and master data management systems, and in ensuring compliance with regulatory requirements in life sciences or pharmaceutical environments. This role requires working out of the office with immediate availability.
Key Responsibilities:
Design, develop, and execute test cases for ETL and MDM solutions.
Validate data transformations, data loads, and data quality across various layers of the data pipeline.
Perform source-to-target mapping verification and test data lineage and integrity (a reconciliation sketch follows this posting).
Conduct functional, integration, and regression testing of MDM systems.
Review and validate data models and business rules used in MDM processes.
Execute testing and documentation as per GxP compliance and 21 CFR Part 11 requirements.
Ensure traceability between requirements, test cases, test execution, and defects.
Participate in risk-based validation activities and author/review validation deliverables (Test Plans, Protocols, Traceability Matrix, Summary Reports).
Collaborate with cross-functional teams including Business Analysts, Developers, and Validation Leads.
Log, track, and manage defects using tools such as JIRA or HP ALM.
Must-Have Skills:
3-5 years of hands-on experience in ETL and MDM testing.
Good understanding of data warehouses, data lakes, and MDM architecture.
Strong SQL skills for data validation and troubleshooting.
Hands-on experience in GxP validation projects (CSV lifecycle, IQ/OQ/PQ).
Working knowledge of regulatory compliance (21 CFR Part 11, GAMP 5, ALCOA+ principles).
Experience with tools such as Informatica, Talend, IBM MDM, or equivalent.
Familiarity with defect tracking and test management tools (e.g., JIRA, HP ALM, TestRail).
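Source-to-target mapping verification, as described above, often reduces to comparing row counts and aggregates between source and target. A minimal, self-contained sketch follows, using sqlite3 as a stand-in for the real source and target databases; the table and column names are illustrative.

```python
import sqlite3

# Two in-memory databases stand in for the real source and target systems
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for db in (src, tgt):
    db.execute("CREATE TABLE batch (id INTEGER, amount REAL)")
    db.executemany("INSERT INTO batch VALUES (?, ?)", [(1, 10.5), (2, 20.0)])

def snapshot(db):
    """Row count plus a column aggregate, a cheap proxy for data equivalence."""
    count, total = db.execute("SELECT COUNT(*), SUM(amount) FROM batch").fetchone()
    return count, round(total, 2)

assert snapshot(src) == snapshot(tgt), "source and target diverge"
print("source-to-target check passed:", snapshot(src))
```

In a real GxP context the same pattern would run against the actual databases, with the query, expected result, and outcome recorded in the validation deliverables.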
Posted 1 week ago
10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Company: They balance innovation with an open, friendly culture and the backing of a long-established parent company, known for its ethical reputation. We guide customers from what's now to what's next by unlocking the value of their data and applications to solve their digital challenges, achieving outcomes that benefit both business and society.
About Client: Our client is a global digital solutions and technology consulting company headquartered in Mumbai, India. The company generates annual revenue of over $4.29 billion (₹35,517 crore), reflecting a 4.4% year-over-year growth in USD terms. It has a workforce of around 86,000 professionals operating in more than 40 countries and serves a global client base of over 700 organizations. Our client operates across several major industry sectors, including Banking, Financial Services & Insurance (BFSI), Technology, Media & Telecommunications (TMT), Healthcare & Life Sciences, and Manufacturing & Consumer. In the past year, the company achieved a net profit of $553.4 million (₹4,584.6 crore), marking a 1.4% increase from the previous year. It also recorded a strong order inflow of $5.6 billion, up 15.7% year-over-year, highlighting growing demand across its service lines. Key focus areas include Digital Transformation, Enterprise AI, Data & Analytics, and Product Engineering, reflecting its strategic commitment to driving innovation and value for clients across industries.
Job Title: Senior Data Engineer / ETL Developer (Talend + Snowflake)
Experience: 10+ Years
Location: Chennai
Employment Type: Contract
Who are we looking for? A seasoned IT professional with 10+ years of overall experience in software development and data engineering, possessing strong communication skills and the ability to analyze, develop, and present solutions clearly to senior management.
Key Responsibilities:
Analyze business and technical requirements to design and develop data pipelines using Talend ETL.
Design and implement scalable solutions on the Snowflake Cloud Data Warehouse.
Manage and maintain infrastructure on AWS Cloud.
Work in an Agile environment, contribute to sprints, and participate in daily stand-ups and retrospectives.
Prepare and peer-review Low-Level Design (LLD) documentation.
Design and execute test plans, develop test cases, and ensure high-quality code through rigorous testing.
Share knowledge and mentor junior developers, helping improve team efficiency and agile practices.
Ensure adherence to the defined software development life cycle (SDLC) processes.
Must-Have Technical Skills:
Strong experience in Talend ETL development.
Hands-on experience with the Snowflake Cloud Data Warehouse.
Proficiency in AWS Cloud services and deployment.
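One recurring task in a Talend + Snowflake + AWS stack like the one above is landing S3 files into Snowflake. Here is a minimal sketch issuing a COPY INTO through the Python connector; the stage, table, and credentials are placeholders, not from the posting.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",  # placeholders
    warehouse="load_wh", database="analytics", schema="staging",
)
try:
    conn.cursor().execute(
        """
        COPY INTO staging.orders
        FROM @my_s3_stage/orders/            -- external stage over the S3 bucket
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
        """
    )
finally:
    conn.close()
```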
Posted 1 week ago
10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Company: They balance innovation with an open, friendly culture and the backing of a long-established parent company, known for its ethical reputation. We guide customers from what's now to what's next by unlocking the value of their data and applications to solve their digital challenges, achieving outcomes that benefit both business and society.
About Client: Our client is a global digital solutions and technology consulting company headquartered in Mumbai, India. The company generates annual revenue of over $4.29 billion (₹35,517 crore), reflecting a 4.4% year-over-year growth in USD terms. It has a workforce of around 86,000 professionals operating in more than 40 countries and serves a global client base of over 700 organizations. Our client operates across several major industry sectors, including Banking, Financial Services & Insurance (BFSI), Technology, Media & Telecommunications (TMT), Healthcare & Life Sciences, and Manufacturing & Consumer. In the past year, the company achieved a net profit of $553.4 million (₹4,584.6 crore), marking a 1.4% increase from the previous year. It also recorded a strong order inflow of $5.6 billion, up 15.7% year-over-year, highlighting growing demand across its service lines. Key focus areas include Digital Transformation, Enterprise AI, Data & Analytics, and Product Engineering, reflecting its strategic commitment to driving innovation and value for clients across industries.
Job Description
Role: Senior Talend Developer
Location: Offshore/India (Hybrid work model)
Who are we looking for? Overall 10+ years of IT development experience with exceptional communication skills, including the ability to develop and present clear, concise analysis and recommendations to senior management; will play the role of an architect.
Technical Skills - Must have:
Experience with the Talend ETL integration suite (at least 7 years of relevant experience)
Experience with Snowflake Cloud Data Warehouse
Experience with AWS Cloud (any cloud experience is also fine)
Hands-on experience in Python or Core Java
Good to have:
RDBMS databases: Oracle
Experience with CI/CD and DevOps development
Working knowledge of Unix/Linux and shell scripting
Process Skills:
Capable of analyzing requirements and developing software per the project's defined software process.
Develop and peer-review LLDs (initiate/participate in peer reviews).
Work on agile improvements by sharing experiences and knowledge with the team.
Ability to execute test plans, create test cases and test data, and contribute to the appropriate software development life cycle methodology, applying specialized business and technical knowledge.
Behavioral Skills:
Quick learner, passionate about learning new technologies and products.
Participates as a team member and engages in teamwork with other applications in the portfolio.
Collaborates and communicates effectively with stakeholders and ensures client satisfaction.
Approaches problems as challenges, deals with them constructively, and promotes this approach among other team members.
Qualification: At least 10 years of development experience.
Education qualification: Any degree from a reputed college.
Posted 1 week ago
0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Primary skills: Technology -> Data Management - Data Integration -> Talend
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Knowledge of more than one technology
Basics of architecture and design fundamentals
Knowledge of testing tools
Knowledge of agile methodologies
Understanding of project life cycle activities on development and maintenance projects
Understanding of one or more estimation methodologies; knowledge of quality processes
Basics of the business domain to understand the business requirements
Analytical abilities, strong technical skills, good communication skills
Good understanding of the technology and domain
Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods
Awareness of the latest technologies and trends
Excellent problem solving, analytical and debugging skills
Posted 1 week ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Role: The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.
Responsibilities:
Developing and supporting scalable, extensible, and highly available data solutions
Delivering on critical business priorities while ensuring alignment with the wider architectural vision
Identifying and helping address potential risks in the data supply chain
Following and contributing to technical standards
Designing and developing analytical data models
Required Qualifications & Work Experience:
First-class degree in Engineering/Technology (4-year graduate course)
4-6 years' experience implementing data-intensive solutions using agile methodologies
Experience with relational databases and using SQL for data querying, transformation and manipulation
Experience modelling data for analytical consumers
Ability to automate and streamline the build, test and deployment of data pipelines (a test sketch follows this posting)
Experience in cloud-native technologies and patterns
A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
Excellent communication and problem-solving skills
Technical Skills (Must Have):
ETL: Hands-on experience building data pipelines; proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
Big Data: Experience with 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
Data Warehousing & Database Management: Understanding of data warehousing concepts, relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java or Scala
DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management
Technical Skills (Valuable):
Ab Initio: Experience developing Co>Op graphs and ability to tune for performance; demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc., with a demonstrable understanding of underlying architectures and trade-offs
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
Containerization: Fair understanding of containerization platforms like Docker and Kubernetes
File Formats: Exposure to event/file/table formats such as Avro, Parquet, Protobuf, Iceberg and Delta
Others: Basics of job schedulers like Autosys; basics of entitlement management
Certification on any of the above topics would be an advantage.
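The automation requirement above is often met by expressing data expectations as unit tests that run in CI. A minimal pytest-style sketch follows; the loader function and the two rules are illustrative assumptions, not the Bank's actual checks.

```python
import pandas as pd

def load_curated_accounts() -> pd.DataFrame:
    # Stand-in for the real pipeline output (e.g., a warehouse table or Parquet read)
    return pd.DataFrame({"account_id": [1, 2, 3], "balance": [10.0, 0.0, 5.5]})

def test_primary_key_is_unique():
    assert load_curated_accounts()["account_id"].is_unique

def test_no_negative_balances():
    assert (load_curated_accounts()["balance"] >= 0).all()
```

Running `pytest` against a file like this in a CI pipeline turns data expectations into a repeatable deployment gate.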
Job Family Group: Technology
Job Family: Digital Software Engineering
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Design and build data cleansing and imputation, map to a standard data model, transform to satisfy business rules and statistical computations, and validate data content.
Develop, modify, and maintain Python and Unix scripts and complex SQL.
Tune existing code for performance, avoiding bottlenecks and improving throughput.
Build end-to-end data flows from sources to fully curated and enhanced data sets.
Develop automated Python jobs for ingesting data from various source systems.
Provide technical expertise in architecture, design, and implementation.
Work with team members to create useful reports and dashboards that provide insight, improve or automate processes, or otherwise add value to the team.
Write SQL queries for data validation.
Design, develop, and maintain ETL processes to extract, transform, and load data from various sources into the data warehouse.
Collaborate with data architects, analysts, and other stakeholders to understand data requirements and ensure quality.
Optimize and tune ETL processes for performance and scalability.
Develop and maintain documentation for ETL processes, data flows, and data mappings.
Monitor and troubleshoot ETL processes to ensure data accuracy and availability.
Implement data validation and error-handling mechanisms.
Work with large data sets and ensure data integrity and consistency.
Skills: Python; ETL tools such as Informatica, Talend, SSIS, or similar; SQL and MySQL, with expertise in Oracle, SQL Server, and Teradata; DevOps and GitLab; experience in AWS Glue or Azure Data Factory.
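Here is a minimal sketch of the automated ingest-validate-and-fail-loudly pattern the posting describes, using sqlite3 as a stand-in for the real warehouse; the table and the validation rule are illustrative.

```python
import logging
import sqlite3

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl_job")

def run_job():
    db = sqlite3.connect(":memory:")  # stand-in for the target warehouse
    db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT)")
    db.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "a@example.com"), (2, None)])
    # SQL validation: no customer may be missing an email
    bad = db.execute("SELECT COUNT(*) FROM customers WHERE email IS NULL").fetchone()[0]
    if bad:
        raise ValueError(f"validation failed: {bad} rows with NULL email")
    log.info("load validated OK")

try:
    run_job()
except ValueError as exc:
    log.error("job aborted: %s", exc)  # error handling: log and surface the failure
```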
Posted 1 week ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
The CDP ETL & Database Engineer will specialize in architecting, designing, and implementing solutions that are sustainable and scalable. The ideal candidate will understand CRM methodologies, have an analytical mindset, and have a background in relational modeling in a hybrid architecture. The candidate will help drive the business toward specific technical initiatives and will work closely with the Solutions Management, Delivery, and Product Engineering teams, joining a team of developers across the US, India, and Costa Rica.
Responsibilities:
ETL Development: The CDP ETL & Database Engineer will be responsible for building pipelines to feed downstream data processes. They will be able to analyze data, interpret business requirements, and establish relationships between data sets. The ideal candidate will be familiar with different encoding formats and file layouts such as JSON and XML (a flattening sketch follows this posting).
Implementations & Onboarding: Will work with the team to onboard new clients onto the ZMP/CDP+ platform. The candidate will solidify business requirements, perform ETL file validation, establish users, perform complex aggregations, and syndicate data across platforms. The hands-on engineer will take a test-driven approach toward development and will be able to document processes and workflows.
Incremental Change Requests: The CDP ETL & Database Engineer will be responsible for analyzing change requests and determining the best approach toward their implementation and execution. This requires a deep understanding of the platform's overall architecture. Change requests will be implemented and tested in a development environment to ensure their introduction will not negatively impact downstream processes.
Change Data Management: The candidate will adhere to change data management procedures and actively participate in CAB meetings where change requests are presented and reviewed. Prior to introducing change, the engineer will ensure that processes are running in a development environment. The engineer will be asked to do peer-to-peer code reviews and solution reviews before production code deployment.
Collaboration & Process Improvement: The engineer will participate in knowledge-share sessions, engaging with peers to discuss solutions, best practices, and overall approach. The candidate will look for opportunities to streamline processes, with an eye toward building a repeatable model that reduces implementation duration.
Job Requirements: The CDP ETL & Database Engineer will be well versed in the following areas:
Relational data modeling.
ETL and FTP concepts.
Advanced analytics using SQL functions.
Cloud technologies: AWS, Snowflake.
Able to decipher requirements, provide recommendations, and implement solutions within predefined constraints.
The ability to work independently, while also contributing in a team setting; the engineer will confidently communicate status, raise exceptions, and voice concerns to their direct manager.
Participate in internal client project status meetings with Solution/Delivery management; when required, collaborate with the Business Solutions Analyst (BSA) to solidify requirements.
Ability to work in a fast-paced, agile environment, with a sense of urgency when escalated issues arise.
Strong communication and interpersonal skills; able to multitask and prioritize workload based on client demand.
Familiarity with Jira for workflow management and time allocation.
Familiarity with the Scrum framework: backlog, planning, sprints, story points, retrospectives.
Required Skills:
ETL: ETL tools such as Talend (preferred, not required); DMExpress and Informatica are nice to have.
Database: hands-on experience with Snowflake (required); MySQL/PostgreSQL nice to have; familiarity with NoSQL database methodologies nice to have.
Programming languages: can demonstrate knowledge of any of PL/SQL, JavaScript (strong plus), Python (strong plus), or Scala (nice to have).
AWS: knowledge of S3, EMR (concepts), EC2 (concepts), and Systems Manager / Parameter Store.
Understands JSON data structures and key-value layouts.
Working knowledge of code repositories such as Git and WinCVS; workflow management tools such as Apache Airflow, Kafka, Automic/Appworx; Jira.
Minimum Qualifications:
Bachelor's degree or equivalent.
4+ years' experience.
Excellent verbal & written communication skills.
Self-starter, highly motivated, with an analytical mindset.
Company Summary: Zeta Global is a NYSE-listed, data-powered marketing technology company with a heritage of innovation and industry leadership. Founded in 2007 by entrepreneur David A. Steinberg and John Sculley, former CEO of Apple Inc and Pepsi-Cola, the company combines the industry's third-largest proprietary data set (2.4B+ identities) with artificial intelligence to unlock consumer intent, personalize experiences, and help our clients drive business growth. Our technology runs on the Zeta Marketing Platform, which powers end-to-end marketing programs for some of the world's leading brands. With expertise encompassing all digital marketing channels (email, display, social, search, and mobile), Zeta orchestrates acquisition and engagement programs that deliver results that are scalable, repeatable, and sustainable.
Zeta Global is an Equal Opportunity/Affirmative Action employer and does not discriminate on the basis of race, gender, ancestry, color, religion, sex, age, marital status, sexual orientation, gender identity, national origin, medical condition, disability, veterans status, or any other basis protected by law.
Zeta Global Recognized in Enterprise Marketing Software and Cross-Channel Campaign Management Reports by Independent Research Firm:
https://www.forbes.com/sites/shelleykohan/2024/06/1G/amazon-partners-with-zeta-global-to-deliver-gen-ai-marketing-automation/
https://www.cnbc.com/video/2024/05/06/zeta-global-ceo-david-steinberg-talks-ai-in-focus-at-milken-conference.html
https://www.businesswire.com/news/home/20240G04622808/en/Zeta-Increases-3Q%E2%80%GG24-Guidance
https://www.prnewswire.com/news-releases/zeta-global-opens-ai--data-labs-in-san-francisco-and-nyc-300S45353.html
https://www.prnewswire.com/news-releases/zeta-global-recognized-in-enterprise-marketing-software-and-cross-channel-campaign-management-reports-by-independent-research-firm-300S38241.html
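The JSON skills called out above usually mean flattening nested payloads into relational rows before loading. A minimal sketch follows; the payload shape and output columns are illustrative, not a real CDP schema.

```python
import json

# Hypothetical inbound payload: one customer with a list of events
payload = '''{"customer": {"id": 42, "email": "a@example.com"},
              "events": [{"type": "open", "ts": "2025-07-01"},
                         {"type": "click", "ts": "2025-07-02"}]}'''

doc = json.loads(payload)
rows = [
    (doc["customer"]["id"], e["type"], e["ts"])  # one row per event, keyed to the customer
    for e in doc["events"]
]
print(rows)  # [(42, 'open', '2025-07-01'), (42, 'click', '2025-07-02')]
```

Rows in this shape can then be bulk-inserted into a staging table for the kind of aggregation and syndication work the posting describes.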
Posted 1 week ago