
607 Dataflow Jobs - Page 8

JobPe aggregates listings so they are easy to find, but applications are submitted directly on the original job portal.

8.0 - 13.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


We are seeking an experienced Lead Data Engineer to join our dynamic team. As a Lead Data Engineer, you will be responsible for designing, developing, and maintaining data integration solutions for our clients, and you will lead a team of engineers to ensure the delivery of high-quality, scalable, and performant solutions. This is an exciting opportunity for a seasoned data integration professional who is passionate about technology and thrives in a fast-paced, dynamic environment.

Responsibilities
Design, develop, and maintain data integration solutions for clients.
Lead a team of engineers to ensure the delivery of high-quality, scalable, and performant data integration solutions.
Collaborate with cross-functional teams to understand business requirements and design data integration solutions that meet them.
Ensure data integration solutions are secure, reliable, and performant.
Develop and maintain documentation, including technical specifications, data flow diagrams, and data mappings.
Continuously learn and stay up to date with the latest data integration approaches and tools.

Requirements
Bachelor's degree in Computer Science, Information Systems, or a related field.
8-13 years of experience in data engineering, data integration, or a related field.
Experience with cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow.
Strong knowledge of SQL for querying and manipulating data.
Experience with Snowflake for cloud data warehousing.
Experience with at least one cloud platform such as AWS, Azure, or GCP.
Experience leading a team of engineers on data integration projects.
Good verbal and written communication skills in English at a B2 level.

Nice to have
Experience with ETL using Python.
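For illustration, the cloud-native ETL experience this posting asks for often amounts to pipelines like the following minimal Apache Beam sketch in Python, which could run locally or on GCP Dataflow; the bucket paths, delimiter, and column count are hypothetical placeholders, not details from the posting.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# DirectRunner for local testing; switch to DataflowRunner (plus project/region
# options) to execute on GCP Dataflow.
options = PipelineOptions(runner="DirectRunner")

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/raw/orders/*.csv")
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "FilterValid" >> beam.Filter(lambda row: len(row) == 3)   # drop malformed rows
        | "Format" >> beam.Map(lambda row: ",".join(row))
        | "Write" >> beam.io.WriteToText("gs://example-bucket/clean/orders")
    )
```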

Posted 1 week ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description

Responsibilities:
Implement data pipelines that meet the design and are efficient, scalable, and maintainable.
Follow best practices, including proper use of source control, participation in code reviews, data validation, and testing.
Deliver on time while working on projects.
Act as an advisor/mentor and help junior data engineers with their deliverables.

Must Have Skills:
At least 4 years of experience in data engineering.
Strong experience designing, implementing, and fine-tuning big data processing pipelines in production environments.
Experience with big data tools such as Hadoop, Spark, Kafka, Hive, and Databricks.
Programming experience in at least one of Python, Java, Scala, or shell scripting.
Experience with relational SQL and NoSQL databases such as PostgreSQL, MySQL, and Cassandra.
Experience with a data visualization tool (Plotly, Tableau, Power BI, Google Data Studio, QuickSight, etc.).

Good To Have Skills:
Basic knowledge of CI/CD pipelines.
Experience with at least one cloud (AWS, Azure, or GCP).
For AWS: experience with services such as EC2, S3, EMR, RDS, Athena, Glue, and Lambda.
For Azure: experience with services such as Azure Blob/Data Lake Gen2, Delta Lake, Databricks, Azure SQL, Azure DevOps, Azure Data Factory, and Power BI.
For GCP: experience with services such as BigQuery, Cloud Storage, Dataproc, Dataflow, Pub/Sub, Cloud Functions, and Data Studio.
Sound familiarity with versioning tools (Git, SVN, etc.).
Experience mentoring students is desirable.
Knowledge of the latest developments in machine learning, deep learning, and optimization in the automotive domain.
Open-minded approach to exploring multiple algorithms to design an optimal solution.
History of contributing to articles, blogs, or whitepapers in analytics.
History of contributing to open source.

Required Skills: Data Engineering, Hadoop, Kafka, CI/CD, Cloud
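As a rough illustration of the kind of production pipeline work described above, here is a minimal PySpark batch job; the storage paths, column names, and aggregation are hypothetical and only sketch the read-validate-transform-write pattern.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-cleanup").getOrCreate()

# Read raw events (path and schema are placeholders)
orders = spark.read.option("header", True).csv("s3://raw-zone/orders/")

# Basic validation and transformation
clean = (
    orders
    .filter(F.col("order_id").isNotNull())
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("customer_id")
    .agg(F.sum("amount").alias("total_spend"))
)

# Write results in a columnar format for downstream consumers
clean.write.mode("overwrite").parquet("s3://curated-zone/customer_spend/")

spark.stop()
```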

Posted 1 week ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Minimum qualifications:
Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience.
8 years of experience with software development in one or more programming languages (e.g., Python, C, C++, Java, JavaScript).
Experience in one or more disciplines: Machine Learning, Recommendation Systems, Natural Language Processing, Computer Vision, Pattern Recognition, or Artificial Intelligence.

Preferred qualifications:
Understanding of agentic experience/AIML and Large Language Models (LLMs), and strong coding skills.

About The Job
Like Google's own ambitions, the work of a Software Engineer goes beyond just Search. Software Engineering Managers have not only the technical expertise to take on and provide technical leadership to major projects, but also manage a team of Engineers. You not only optimize your own code but make sure Engineers are able to optimize theirs. As a Software Engineering Manager you manage your project goals, contribute to product strategy and help develop your team. Teams work all across the company, in areas such as information retrieval, artificial intelligence, natural language processing, distributed computing, large-scale system design, networking, security, data compression, and user interface design; the list goes on and is growing every day. Operating with scale and speed, our exceptional software engineers are just getting started, and as a manager, you guide the way. With technical and leadership expertise, you manage engineers across multiple teams and locations, a large product budget, and oversee the deployment of large-scale projects across multiple sites internationally.

At Corp Eng, we build world-leading business solutions that scale a more helpful Google for everyone. As Google's IT organization, we provide end-to-end solutions for organizations across Google. We deliver the right tools, platforms, and experiences for all Googlers as they create more helpful products and services for everyone. In the simplest terms, we are Google for Googlers.

Responsibilities
Lead and manage a team of AI Software Engineers, fostering a collaborative and high-performing environment. This includes hiring, mentoring, performance management, and career development.
Drive the design, development, and deployment of scalable and reliable AI/ML systems and infrastructure relevant to HR applications (e.g., talent acquisition, performance management, employee engagement, workforce planning).
Collaborate with Product Managers and HR stakeholders to understand business needs, define product requirements, and translate them into technical specifications and project plans.
Oversee the architecture and implementation of robust data pipelines using Google's data processing infrastructure (e.g., Beam, Dataflow) to support AI/ML initiatives.
Stay abreast of the latest advancements in AI/ML and related technologies, evaluating their potential application within Human Resources and guiding the team's adoption of relevant innovations.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 1 week ago

Apply

4.0 - 12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Position: We are looking for candidates with experience in Spring Boot, Kafka, and microservices.

Role: Java Developer
Location: Hyderabad
Experience: 4 to 12 years
Job Type: Full-Time Employment

What You'll Do:
Design, develop, test, and deploy scalable and resilient microservices using Java, Spring Boot, Spring Cloud, and Dataflow.
Collaborate with other developers, architects, and product owners to deliver high-quality software solutions that meet business requirements and follow best practices.
Contribute to code reviews, documentation, testing, and continuous integration and delivery processes.
Troubleshoot and resolve issues in development, testing, and production environments.
Stay updated with the latest trends and technologies in the Java ecosystem and cloud computing.

Expertise You'll Bring:
Hands-on experience with Spring, Spring Boot (annotations), microservices, and JUnit.
Familiarity with coding standards and the ability to solve a complex Java coding problem (data structures, OOP, multithreading, sync/async calls, Java 8 features).
Experience leading a team technically and performing code reviews with appropriate tools.
Experience with at least one messaging system (Kafka, RabbitMQ).
Any cloud experience.

Benefits:
Competitive salary and benefits package.
Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications.
Opportunity to work with cutting-edge technologies.
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards.
Annual health check-ups.
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents.

Inclusive Environment: Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with a disability and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.

Our company fosters a values-driven and people-centric work environment that enables our employees to:
Accelerate growth, both professionally and personally.
Impact the world in powerful, positive ways, using the latest technologies.
Enjoy collaborative innovation, with diversity and work-life wellbeing at the core.
Unlock global opportunities to work and learn with the industry's best.

Let's unleash your full potential at Persistent. Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind.

Posted 1 week ago

Apply

8.0 - 13.0 years

0 Lacs

Pune, Maharashtra, India

On-site


We are searching for a skilled Lead Data Engineer to enhance our energetic team. In the role of Lead Data Engineer, you will take charge of creating, building, and sustaining data integration solutions for our clientele. Your leadership will guide a team of engineers towards achieving high-quality, scalable, and efficient data integration solutions. This role presents an exciting challenge for an experienced data integration expert who is enthusiastic about technology and excels in a fast-moving, evolving setting.

Responsibilities
Design, build, and sustain data integration solutions for clients.
Guide a team of engineers to guarantee high-quality, scalable, and efficient data integration solutions.
Collaborate with multidisciplinary teams to grasp business needs and devise appropriate data integration solutions.
Ensure the security, reliability, and efficiency of data integration solutions.
Create and update documentation, such as technical specifications, data flow diagrams, and data mappings.
Continuously update knowledge and skills related to the latest data integration methods and tools.

Requirements
Bachelor's degree in Computer Science, Information Systems, or a related field.
8-13 years of experience in data engineering, data integration, or a related field.
Proficiency in cloud-native or Spark-based ETL tools like AWS Glue, Azure Data Factory, or GCP Dataflow.
Strong knowledge of SQL for data querying and manipulation.
Background in Snowflake for cloud data warehousing.
Familiarity with at least one cloud platform such as AWS, Azure, or GCP.
Experience in leading a team of engineers on data integration projects.
Good verbal and written communication skills in English at a B2 level.

Nice to have
Background in ETL using Python.

Posted 1 week ago

Apply

5.0 - 8.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Our organization is seeking a skilled Senior Data Engineer to become an integral part of our team. In this role, you will focus on projects related to data integration and ETL on cloud-based platforms. You will take charge of creating and executing sophisticated data solutions, ensuring data accuracy, dependability, and accessibility.

Responsibilities
Create and execute sophisticated data solutions on cloud-based platforms.
Build ETL processes utilizing SQL, Python, and other applicable technologies.
Maintain data accuracy, reliability, and accessibility for all stakeholders.
Work with cross-functional teams to understand data integration needs and specifications.
Produce and maintain documentation, including technical specifications, data flow diagrams, and data mappings.
Enhance and tune data integration processes for optimal performance and efficiency, guaranteeing data accuracy and integrity.

Requirements
Bachelor's degree in Computer Science, Electrical Engineering, or a related field.
5-8 years of experience in data engineering.
Proficiency in cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow.
Strong knowledge of SQL for data querying and manipulation.
Qualifications in Snowflake for data warehousing.
Familiarity with cloud platforms like AWS, GCP, or Azure for data storage and processing.
Excellent problem-solving skills and attention to detail.
Good verbal and written communication skills in English at a B2 level.

Nice to have
Background in ETL using Python.
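To make the SQL/Python ETL plus Snowflake combination above concrete, here is a small, hedged sketch using pandas and the snowflake-connector-python package; the file path, credentials, and table names are placeholders, and a production pipeline would handle secrets and incremental loads differently.

```python
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Extract: pull a CSV export from a source system (path is hypothetical)
df = pd.read_csv("exports/daily_sales.csv", parse_dates=["sold_at"])

# Transform: basic cleansing and derivation
df = df.dropna(subset=["order_id"])
df["sale_date"] = df["sold_at"].dt.date

# Load: write into a Snowflake table (connection details are placeholders)
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="SALES",
)
write_pandas(conn, df, "DAILY_SALES", auto_create_table=True)
conn.close()
```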

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Minimum qualifications:
Bachelor's degree or equivalent practical experience.
5 years of experience with software development in one or more programming languages, and with data structures/algorithms.
3 years of experience testing, maintaining, or launching software products, and 1 year of experience with software design and architecture.
3 years of experience with one or more of the following: speech/audio (e.g., technology duplicating and responding to the human voice), reinforcement learning (e.g., sequential decision making), ML infrastructure, or specialization in another ML field.
3 years of experience with ML infrastructure (e.g., model deployment, model evaluation, optimization, data processing, debugging).

Preferred qualifications:
7 years of experience in software development with a focus on design, development, and deployment of large-scale AI/ML applications.
Experience in Python and with AI/ML libraries and frameworks (e.g., TensorFlow, JAX, scikit-learn, Pandas, NumPy).
Experience architecting and deploying machine learning models in a cloud environment, with experience on Google Cloud Platform.
Experience designing and implementing data processing pipelines using large-scale data engineering tools.
Understanding of machine learning algorithms, statistical modeling techniques, and data analysis methodologies.

About The Job
Google's software engineers develop the next-generation technologies that change how billions of users connect, explore, and interact with information and one another. Our products need to handle information at massive scale, and extend well beyond web search. We're looking for engineers who bring fresh ideas from all areas, including information retrieval, distributed computing, large-scale system design, networking and data storage, security, artificial intelligence, natural language processing, UI design and mobile; the list goes on and is growing every day. As a software engineer, you will work on a specific project critical to Google's needs with opportunities to switch teams and projects as you and our fast-paced business grow and evolve. We need our engineers to be versatile, display leadership qualities and be enthusiastic to take on new problems across the full stack as we continue to push technology forward.

We are seeking an experienced Senior AI Software Engineer to join the Human Resources Engineering (HRE) team in Hyderabad. You will be a technical lead, driving the design, development, and deployment of AI-powered solutions that impact Google's employees. This role offers the opportunity to work on Google-scale infrastructure, mentor a team of engineers, collaborate with cross-functional partners, and contribute to the future of HR at Google through AI applications. You will be responsible for translating business challenges and objectives into AI/ML systems.

At Corp Eng, we build world-leading business solutions that scale a more helpful Google for everyone. As Google's IT organization, we provide end-to-end solutions for organizations across Google. We deliver the right tools, platforms, and experiences for all Googlers as they create more helpful products and services for everyone. In the simplest terms, we are Google for Googlers.

Responsibilities
Lead the technical design and architecture of AI/ML systems and infrastructure within the HRE domain, ensuring scalability, reliability, and performance.
Drive the development and implementation of advanced AI/ML models and algorithms relevant to HR applications (e.g., talent acquisition, performance management, employee engagement, workforce planning) using various programming languages (e.g., Python) and Google's internal AI/ML platforms and frameworks (built on technologies like TensorFlow and JAX).
Architect and oversee the development of data pipelines using Google's data processing infrastructure (e.g., Beam, Dataflow) for large-scale data ingestion, cleaning, transformation, and feature engineering.
Take ownership of deployment, monitoring, and optimization of AI/ML models in Google's production environments, establishing standard procedures for ML Operations within the team.
Provide technical leadership and mentorship to other engineers on the team, fostering a culture of technical excellence.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
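As a concrete, if simplified, picture of the model-building work this role describes, the sketch below trains and evaluates a baseline classifier with scikit-learn (one of the libraries named above); the synthetic data stands in for real HR features, which are not specified in the posting.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for tabular features; a real system would pull these
# from an upstream data pipeline.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# A simple, inspectable baseline model before reaching for larger architectures
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)

# Evaluate with a threshold-independent metric
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Held-out ROC AUC: {auc:.3f}")
```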

Posted 1 week ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


About Traya Health: Traya is an Indian direct-to-consumer hair care brand providing holistic treatment for consumers dealing with hair loss. The company provides personalised consultations that help determine the root cause of hair fall among individuals, along with a range of hair care products curated from a combination of Ayurveda, Allopathy, and Nutrition. Traya's secret lies in the power of diagnosis. Our unique platform diagnoses the patient's hair and health history to identify the root cause behind hair fall and delivers customized hair kits right to their doorstep. We have a strong adherence system in place via medically trained hair coaches and proprietary tech, through which we guide customers across their hair growth journey and help them stay on track. Traya was founded by Saloni Anand, a techie-turned-marketer, and Altaf Saiyed, a Stanford Business School alumnus.

Our Vision: Traya was created with a global vision to create awareness around hair loss and de-stigmatise it, while empathising with customers about its emotional and psychological impact. Most importantly, we combine three different sciences (Ayurveda, Allopathy, and Nutrition) to create a holistic solution for hair loss patients.

Role Overview: As a Senior Data Engineer, you will architect, build, and maintain the data infrastructure that powers critical business decisions. You will work closely with data scientists, analysts, and product teams to design and implement scalable solutions for data processing, storage, and retrieval. Your work will directly impact our ability to leverage data for business intelligence, machine learning initiatives, and customer insights.

Key Responsibilities:
● Design, build, and maintain our end-to-end data infrastructure on AWS and GCP cloud platforms
● Develop and optimize ETL/ELT pipelines to process large volumes of data from multiple sources
● Build and support data pipelines for reporting, analytics, and machine learning applications
● Implement and manage streaming data solutions using Kafka and other technologies
● Design and optimize database schemas and data models in ClickHouse and other databases
● Develop and maintain data workflows using Apache Airflow and similar orchestration tools
● Write efficient, maintainable, and scalable code using PySpark and other data processing frameworks
● Collaborate with data scientists to implement ML infrastructure for model training and deployment
● Ensure data quality, reliability, and security across all data platforms
● Monitor data pipelines and implement proactive alerting systems
● Troubleshoot and resolve data infrastructure issues
● Document data flows, architectures, and processes
● Mentor junior data engineers and contribute to establishing best practices
● Stay current with industry trends and emerging technologies in data engineering

Qualifications

Required
● Bachelor's degree in Computer Science, Engineering, or a related technical field (Master's preferred)
● 5+ years of experience in data engineering roles
● Strong expertise in AWS and/or GCP cloud platforms and services
● Proficiency in building data pipelines using modern ETL/ELT tools and frameworks
● Experience with stream processing technologies such as Kafka
● Hands-on experience with ClickHouse or similar analytical databases
● Strong programming skills in Python and experience with PySpark
● Experience with workflow orchestration tools like Apache Airflow
● Solid understanding of data modeling, data warehousing concepts, and dimensional modeling
● Knowledge of SQL and NoSQL databases
● Strong problem-solving skills and attention to detail
● Excellent communication skills and ability to work in cross-functional teams

Preferred
● Experience in D2C, e-commerce, or retail industries
● Knowledge of data visualization tools (Tableau, Looker, Power BI)
● Experience with real-time analytics solutions
● Familiarity with CI/CD practices for data pipelines
● Experience with containerization technologies (Docker, Kubernetes)
● Understanding of data governance and compliance requirements
● Experience with MLOps or ML engineering

Technologies
● Cloud Platforms: AWS (S3, Redshift, EMR, Lambda), GCP (BigQuery, Dataflow, Dataproc)
● Data Processing: Apache Spark, PySpark, Python, SQL
● Streaming: Apache Kafka, Kinesis
● Data Storage: ClickHouse, S3, BigQuery, PostgreSQL, MongoDB
● Orchestration: Apache Airflow
● Version Control: Git
● Containerization: Docker, Kubernetes (optional)

What We Offer
● Competitive salary and comprehensive benefits package
● Opportunity to work with cutting-edge data technologies
● Professional development and learning opportunities
● Modern office in Mumbai with great amenities
● Collaborative and innovation-driven culture
● Opportunity to make a significant impact on company growth
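Since the role leans heavily on Apache Airflow for orchestration, here is a minimal DAG sketch showing how a daily extract-and-load workflow might be wired up; the task bodies are placeholders, and the DAG id and schedule are assumptions for illustration only.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_orders(**context):
    # Placeholder: pull the day's orders from the source system
    print("extracting orders for", context["ds"])

def load_to_warehouse(**context):
    # Placeholder: load the transformed batch into the analytics database
    print("loading batch for", context["ds"])

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
    extract >> load  # load runs only after extract succeeds
```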

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


About Impetus
Impetus Technologies is a digital engineering company focused on delivering expert services and products to help enterprises achieve their transformation goals. We solve the analytics, AI, and cloud puzzle, enabling businesses to drive unmatched innovation and growth. Founded in 1991, we are cloud and data engineering leaders providing solutions to Fortune 100 enterprises, headquartered in Los Gatos, California, with development centers in Noida, Indore, Gurugram, Bengaluru, Pune, and Hyderabad and over 3,000 global team members. We also have offices in Canada and collaborate with a number of established companies, including American Express, Bank of America, Capital One, Toyota, United Airlines, and Verizon.

Experience: 3-8 years
Location: Gurgaon and Bangalore

Job Description
You should have extensive production experience with GCP; other cloud experience would be a strong bonus. A strong background in data engineering is required, with 2-3 years of experience in big data technologies including Hadoop, NoSQL, Spark, and Kafka. Exposure to enterprise application development is a must.

Roles & Responsibilities
Able to effectively use GCP managed services, e.g., Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, and GCS (at least 4 of these services).
Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, and Cloud Functions.
Strong experience in big data technologies: Hadoop, Sqoop, Hive, and Spark, including DevOps.
Good hands-on expertise in either Python or Java programming.
Good understanding of GCP core services like Google Cloud Storage, Google Compute Engine, Cloud SQL, and Cloud IAM.
Good to have knowledge of GCP services like App Engine, GKE, Cloud Run, Cloud Build, and Anthos.
Ability to drive the deployment of customers' workloads into GCP and provide guidance, a cloud adoption model, service integrations, appropriate recommendations to overcome blockers, and technical roadmaps for GCP cloud implementations.
Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.
Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
Act as a subject-matter expert or developer around GCP and become a trusted advisor to multiple teams.
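As a small illustration of the GCP managed services named above, the following sketch publishes messages to a Pub/Sub topic with the google-cloud-pubsub client; the project and topic ids are placeholders, not values from the posting.

```python
from google.cloud import pubsub_v1

project_id = "my-gcp-project"    # placeholder
topic_id = "clickstream-events"  # placeholder

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project_id, topic_id)

# Publish a small batch of messages; publish() returns a future per message,
# and message attributes (here "source") must be strings.
futures = [
    publisher.publish(topic_path, data=f"event-{i}".encode("utf-8"), source="web")
    for i in range(3)
]
for f in futures:
    print("published message id:", f.result())
```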

Posted 1 week ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description
We are looking for a Data Solution Architect to join the FC India IT Architecture team. In this role, you will define analytics solutions and guide engineering teams in implementing big data solutions on the cloud. The work involves migrating data from legacy on-prem warehouses to the Google Cloud data platform. This role will provide architecture assistance to data engineering teams in India, with the key responsibility of supporting applications globally. This role will also drive business adoption of the new platform and the sunset of legacy platforms.

Responsibilities
Utilize Google Cloud Platform and its data services to modernize legacy applications.
Understand technical business requirements and define architecture solutions that align with Ford Motor & Credit Companies' patterns and standards.
Collaborate with global architecture teams to define the analytics cloud platform strategy and build cloud analytics solutions within the enterprise data factory.
Provide architecture leadership in the design and delivery of the new unified data platform on GCP.
Understand complex data structures in the analytics space as well as interfacing application systems.
Develop and maintain conceptual, logical, and physical data models.
Design and guide product teams on subject areas and data marts to deliver integrated data solutions.
Provide architectural guidance for optimal solutions considering regional regulatory needs.
Provide architecture assessments of technical solutions and make recommendations that meet business needs and align with architectural governance and standards.
Guide teams through the enterprise architecture processes and advise them on cloud-based design, development, and data mesh architecture.
Provide advisory and technical consulting across all initiatives, including PoCs, product evaluations and recommendations, security, architecture assessments, integration considerations, etc.
Leverage cloud AI/ML platforms to deliver business and technical requirements.

Qualifications
Google Professional Cloud Architect certification.
8+ years of relevant work experience in analytics application and data architecture, with a deep understanding of cloud hosting concepts and implementations.
5+ years of experience in data and solution architecture in the analytics space.
Solid knowledge of cloud data architecture and data modelling principles, and expertise in data modelling tools.
Experience migrating legacy analytics applications to cloud platforms and driving business adoption of these platforms to build insights and dashboards, with deep knowledge of traditional and cloud data lake, warehouse, and mart concepts.
Good understanding of domain-driven design and data mesh principles.
Experience designing, building, and deploying ML models to solve business challenges using Python/BQML/Vertex AI on GCP.
Knowledge of enterprise frameworks and technologies.
Strong in architecture design patterns, with experience in secure interoperability standards and methods, architecture tools, and processes.
Deep understanding of traditional and cloud data warehouse environments, with hands-on programming experience building data pipelines on the cloud in a highly distributed and fault-tolerant manner.
Experience using Dataflow, Pub/Sub, Kafka, Cloud Run, Cloud Functions, BigQuery, Dataform, Dataplex, etc.
Strong understanding of DevOps principles and practices, including continuous integration and deployment (CI/CD) and automated testing and deployment pipelines.
Good understanding of cloud security best practices and familiarity with different security tools and techniques, such as Identity and Access Management (IAM), encryption, and network security.
Strong understanding of microservices architecture.

Nice to Have
Bachelor's degree in computer science/engineering, data science, or a related field.
Strong leadership, communication, interpersonal, organizing, and problem-solving skills.
Good presentation skills, with the ability to communicate architectural proposals to diverse audiences (user groups, stakeholders, and senior management).
Experience in the Banking and Financial Regulatory Reporting space.
Ability to work on multiple projects in a fast-paced and dynamic environment.
Exposure to multiple, diverse technologies, platforms, and processing environments.
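The qualification about building models with Python/BQML on GCP can be illustrated with a short, hedged sketch that trains and evaluates a BigQuery ML model through the Python client; the project, dataset, table, and column names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # placeholder project

# Train a simple logistic-regression model directly in BigQuery ML
# (dataset, table, and column names are illustrative only)
train_sql = """
CREATE OR REPLACE MODEL `analytics.churn_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
SELECT tenure_months, monthly_spend, support_tickets, churned
FROM `analytics.customer_features`
"""
client.query(train_sql).result()  # blocks until the training job finishes

# Evaluate the trained model
eval_sql = "SELECT * FROM ML.EVALUATE(MODEL `analytics.churn_model`)"
for row in client.query(eval_sql).result():
    print(dict(row))
```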

Posted 1 week ago

Apply

10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You'll Do
Lead a team of software engineers to design, develop, and operate high-scale applications across the full engineering stack.
Design, develop, test, deploy, maintain, and improve software.
Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure as code, etc.).
Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
Participate in a tight-knit, globally distributed engineering team.
Triage product or system issues and debug, track, and resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
Manage project priorities, deadlines, and deliverables.
Research, create, and develop software applications to extend and improve Equifax solutions.
Collaborate on scalability issues involving access to data and information.
Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What Experience You Need
Bachelor's degree or equivalent experience.
10+ years of software engineering experience.
5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, and CSS.
5+ years of experience with cloud technology: GCP, AWS, or Azure.
5+ years of experience designing and developing cloud-native solutions.
5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes.
5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs.

What Could Set You Apart
A self-starter who identifies and responds to priority shifts with minimal supervision.
Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others.
UI development (e.g., HTML, JavaScript, Angular, and Bootstrap).
Experience with backend technologies such as Java/J2EE, Spring Boot, SOA, and microservices.
Source code control management systems (e.g., SVN/Git, GitHub) and build tools like Maven and Gradle.
Agile environments (e.g., Scrum, XP).
Relational databases (e.g., SQL Server, MySQL).
Atlassian tooling (e.g., Jira, Confluence, and GitHub).
Developing with a modern JDK (v1.7+).
Automated testing: JUnit, Selenium, LoadRunner, SoapUI.

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax?
At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life's pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 1 week ago

Apply

5.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


We are seeking an experienced Senior Power Platform Engineer with strong expertise in Microsoft Copilot Studio, Power BI development, and Microsoft Fabric to join our digital transformation team. In this role, you will design, develop, and implement intelligent analytics and automation solutions that drive data-informed decision making and enhance business efficiency across our organization.

Key Responsibilities
Lead the design and implementation of advanced AI-powered solutions using Microsoft Copilot Studio and the broader Power Platform ecosystem.
Create sophisticated Power BI reports, dashboards, and data models that deliver actionable business insights.
Architect end-to-end data analytics solutions using Microsoft Fabric, integrating data engineering, data science, and business intelligence workflows.
Develop data pipelines and ETL processes using Microsoft Fabric's Dataflow and Data Factory capabilities.
Implement semantic models and establish data governance frameworks within Power BI and Microsoft Fabric environments.
Design and build conversational AI experiences that can surface data insights through natural language queries.
Create integrated solutions that connect Power BI analytics with Copilot Studio conversational experiences.
Mentor junior developers on best practices for data visualization and Power BI development.
Optimize performance of existing Power BI reports and datasets for large-scale enterprise deployment.

Qualifications
5-7 years of experience with the Microsoft Power Platform, with a strong focus on Power BI development.
2-3 years working with Microsoft Copilot Studio/Power Virtual Agents.
Demonstrated experience with Microsoft Fabric (OneLake, Synapse Analytics, Data Factory).
Expertise in DAX, the M language, and Power Query for advanced data modeling.
Strong understanding of data warehousing concepts and dimensional modeling.
Experience implementing row-level security and managing workspaces in Power BI.
Knowledge of integration patterns between Power BI and other Power Platform components.
Experience with Dataverse data modeling and integration.
Microsoft certifications in Power BI (PL-300), Data Analytics (DP-500), or Power Platform (PL-200) preferred.

Technical Skills
Microsoft Power BI (advanced): DAX, Power Query, report design, paginated reports.
Microsoft Fabric (advanced): OneLake, Data Factory, Synapse Analytics.
Microsoft Copilot Studio (advanced).
Power Automate (intermediate to advanced).
Power Apps (intermediate).
Dataverse/Common Data Service.
AI Builder and Azure OpenAI integration.
SQL and data modeling expertise.
Experience with large dataset optimization techniques.
Knowledge of DirectQuery, Import, and Composite models.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Responsibilities

Job Description
Manage and configure roles and permissions in GCP IAM, following the principle of least-privilege access.
Manage the BigQuery service by optimizing slot assignments and SQL queries, adopting FinOps practices for cost control, and troubleshooting and resolving critical data queries.
Collaborate with teams such as Data Engineering, Data Warehousing, Cloud Platform Engineering, and SRE for efficient data management and operational practices in GCP.
Create automations and monitoring mechanisms for GCP data-related services, processes, and tasks.
Work with development teams to design the GCP-specific cloud architecture.
Provision and de-provision GCP accounts and resources for internal projects.
Manage and operate multiple GCP subscriptions.
Keep technical documentation up to date.
Proactively stay up to date on GCP announcements, services, and developments.

Requirements
Must have 8+ years of work experience provisioning, operating, and maintaining systems in GCP.
Must have a valid certification as either a GCP Associate Cloud Engineer or a GCP Professional Cloud Architect.
Must have hands-on experience with GCP services such as Identity and Access Management (IAM), BigQuery, and Google Kubernetes Engine (GKE).
Must be capable of providing support and guidance on GCP operations and services, depending on enterprise needs.
Must have a working knowledge of Docker containers and Kubernetes.
Must have strong communication skills and the ability to work both independently and in a collaborative environment.
Fast learner and achiever who sets high personal goals.
Must be able to work on multiple projects and consistently meet project deadlines.
Must be willing to work on a shift basis based on project requirements.

Good To Have
Experience in Terraform automation for GCP infrastructure provisioning.
Experience with Cloud Composer, Dataproc, Dataflow, and Storage and Monitoring services.
Experience building and supporting any form of data pipeline.
Multi-cloud experience with AWS.
New Relic monitoring.

Perks
Day off on the 3rd Friday of every month (one long weekend each month).
Monthly Wellness Reimbursement Program to promote health and well-being.
Paid paternity and maternity leaves.

Qualifications
Must have a valid certification as either a GCP Associate Cloud Engineer or a GCP Professional Cloud Architect.
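To illustrate the BigQuery FinOps practices the posting mentions, here is a small sketch that first dry-runs a query to estimate scanned bytes and then enforces a byte cap so runaway queries fail fast; the table name and cap value are illustrative assumptions.

```python
from google.cloud import bigquery

client = bigquery.Client()

sql = "SELECT user_id, COUNT(*) AS events FROM `analytics.raw_events` GROUP BY user_id"

# Dry run first: estimate scanned bytes without incurring query cost
dry_cfg = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
estimate = client.query(sql, job_config=dry_cfg)
print(f"Estimated bytes scanned: {estimate.total_bytes_processed:,}")

# Enforce a hard cap so an unexpectedly large query fails instead of overspending
capped_cfg = bigquery.QueryJobConfig(maximum_bytes_billed=10 * 1024**3)  # 10 GiB cap
rows = client.query(sql, job_config=capped_cfg).result()
print("Row count:", rows.total_rows)
```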

Posted 1 week ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


About Us
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources and exceptional customer service - all backed by TELUS, our multi-billion dollar telecommunications parent.

Required Skills
Design, develop, and support data pipelines and related data products and platforms.
Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
Perform application impact assessments and requirements reviews, and develop work estimates.
Develop test strategies and site reliability engineering measures for data products and solutions.
Participate in agile development "scrums" and solution reviews.
Mentor junior data engineers.
Lead the resolution of critical operations issues, including post-implementation reviews.
Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
Demonstrate SQL and database proficiency in various data engineering tasks.
Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect.
Develop Unix scripts to support various data operations.
Model data to support business intelligence and analytics initiatives.
Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).

Qualifications
Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
4+ years of data engineering experience.
2 years of data solution architecture and design experience.
GCP Certified Data Engineer (preferred).
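Because the role calls out DAG-based workflow automation in tools such as Airflow and Prefect, here is a minimal sketch of a Prefect flow (assuming Prefect 2.x); the task bodies are placeholders rather than a real extract or load.

```python
from prefect import flow, task

@task(retries=2)
def extract() -> list[dict]:
    # Placeholder: pull records from a source API or database
    return [{"id": 1, "amount": "42.5"}, {"id": 2, "amount": "13.0"}]

@task
def transform(records: list[dict]) -> list[dict]:
    # Cast types and drop malformed rows
    return [{**r, "amount": float(r["amount"])} for r in records if r.get("id")]

@task
def load(records: list[dict]) -> None:
    # Placeholder: write to the warehouse; here we just report the batch size
    print(f"loaded {len(records)} records")

@flow(name="daily-etl")
def daily_etl():
    load(transform(extract()))

if __name__ == "__main__":
    daily_etl()
```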

Posted 1 week ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


About Us
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources and exceptional customer service - all backed by TELUS, our multi-billion dollar telecommunications parent.

Required Skills
Minimum 6 years of experience in the architecture, design, and building of data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
Perform application impact assessments and requirements reviews, and develop work estimates.
Develop test strategies and site reliability engineering measures for data products and solutions.
Lead agile development "scrums" and solution reviews.
Mentor junior Data Engineering Specialists.
Lead the resolution of critical operations issues, including post-implementation reviews.
Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
Demonstrate expertise in SQL and database proficiency in various data engineering tasks.
Automate complex data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect.
Develop and manage Unix scripts for data engineering tasks.
Intermediate proficiency in infrastructure-as-code tools like Terraform, Puppet, and Ansible to automate infrastructure deployment.
Proficiency in data modeling to support analytics and business intelligence.
Working knowledge of MLOps to integrate machine learning workflows with data pipelines.
Extensive expertise in GCP technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, Dataproc (good to have), and Bigtable.
Advanced proficiency in programming languages (Python).

Qualifications
Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
Analytics certification in BI or AI/ML.
6+ years of data engineering experience.
4 years of data platform solution architecture and design experience.
GCP Certified Data Engineer (preferred).

Posted 1 week ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


About Us
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources and exceptional customer service - all backed by TELUS, our multi-billion dollar telecommunications parent.

Required Skills
5+ years of industry experience in data engineering, business intelligence, or a related field, with experience manipulating, processing, and extracting value from datasets.
Expertise in architecting, designing, building, and deploying internal applications to support technology life cycle management, service delivery management, data, and business intelligence.
Experience developing modular code for versatile pipelines or complex ingestion frameworks aimed at loading data into Cloud SQL and managing data migration from multiple on-premises sources.
Strong collaboration with analysts and business process owners to translate business requirements into technical solutions.
Proficiency in coding with scripting languages (shell scripting, Python, SQL).
Deep understanding and hands-on experience with Google Cloud Platform (GCP) technologies, especially in data migration and warehousing, including Database Migration Service (DMS), Cloud SQL, BigQuery, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage (GCS), IAM, Compute Engine, Cloud Data Fusion, and optionally Dataproc.
Adherence to best development practices, including technical design, solution development, systems configuration, test documentation and execution, issue identification and resolution, and writing clean, modular, self-sustaining code.
Familiarity with CI/CD processes using GitHub, Cloud Build, and the Google Cloud SDK.

Qualifications
Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
GCP Certified Data Engineer (preferred).
Excellent verbal and written communication skills, with the ability to effectively advocate technical solutions to research scientists, engineering teams, and business audiences.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Responsibilities

Job Description
Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely.
Work with databases of varying scales, including small-scale databases and databases involving big data processing.
Work on data security and compliance by implementing access controls, encryption, and compliance standards.
Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture.
Migrate data from spreadsheets or other sources to a relational database system (e.g., PostgreSQL, MySQL) or cloud-based solutions like Google BigQuery.
Develop import workflows and scripts to automate the data import process and ensure data accuracy and consistency.
Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.
Work with the team to ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.
Monitor database health and identify and resolve issues.
Collaborate with the full-stack web developer on the team to support the implementation of efficient data access and retrieval mechanisms.
Implement data security measures to protect sensitive information and comply with relevant regulations.
Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows.
Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices.
Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
Familiarize yourself with tools and technologies used in the team's workflow, such as Knime for data integration and analysis.
Use Python for tasks such as data manipulation, automation, and scripting.
Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines.
Assume accountability for achieving development milestones.
Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
Collaborate with and assist fellow members of the Data Research Engineering team as required.
Perform tasks with precision and build reliable systems.
Leverage online resources effectively (e.g., Stack Overflow, ChatGPT, Bard), while considering their capabilities and limitations.

Skills And Experience
Bachelor's degree in Computer Science, Information Systems, or a related field is desirable but not essential.
Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting, aligning with the team's data presentation goals.
Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
Knowledge of SQL and an understanding of database design principles, normalization, and indexing.
Knowledge of data migration, ETL (Extract, Transform, Load) processes, and integrating data from various sources.
Knowledge of cloud-based databases, such as AWS RDS and Google BigQuery.
Eagerness to develop import workflows and scripts to automate data import processes.
Knowledge of data security best practices, including access controls, encryption, and compliance standards.
Strong problem-solving and analytical skills with attention to detail.
Creative and critical thinking.
Strong willingness to learn and expand knowledge in data engineering.
Familiarity with Agile development methodologies is a plus.
Experience with version control systems, such as Git, for collaborative development.
Ability to thrive in a fast-paced environment with rapidly changing priorities.
Ability to work collaboratively in a team environment.
Good and effective communication skills.
Comfortable with autonomy and able to work independently.

Qualifications
5+ years of experience in database engineering.

Additional Information

Perks
Day off on the 3rd Friday of every month (one long weekend each month).
Monthly Wellness Reimbursement Program to promote health and well-being.
Monthly Office Commutation Reimbursement Program.
Paid paternity and maternity leaves.
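As a hedged example of the spreadsheet-to-relational-database import workflows described above, the sketch below loads an Excel export into a PostgreSQL staging table with pandas and SQLAlchemy; the connection string, file path, and column names are placeholders.

```python
import pandas as pd
from sqlalchemy import create_engine

# Connection string and file path are placeholders
engine = create_engine("postgresql+psycopg2://etl_user:***@localhost:5432/research")

# Read a spreadsheet export and normalize column names
df = pd.read_excel("imports/site_measurements.xlsx")
df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]

# Basic validation before loading
df = df.dropna(subset=["site_id"]).drop_duplicates(subset=["site_id", "measured_at"])

# Load into a staging table; downstream SQL can merge into curated tables
df.to_sql("stg_site_measurements", engine, if_exists="replace", index=False)
print(f"Loaded {len(df)} rows into stg_site_measurements")
```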

Posted 1 week ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. JD for L&A Business Consultant Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following: Proficient in Individual and Group Life Insurance concepts, different type of Annuity products etc. Proficient in different insurance plans - Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP Solid knowledge on the Policy Life cycle Illustrations/Quote/Rating New Business & Underwriting Policy Servicing and Administration Billing & Payment Claims Processing Disbursement (Systematic withdrawals, RMD, Surrenders) Regulatory Changes & Taxation Understanding of business rules of Pay-out Demonstrated ability of Insurance Company Operations like Nonforfeiture option/ Face amount increase, decrease/ CVAT or GPT calculations /Dollar cost averaging and perform their respective transactions. Understanding on upstream and downstream interfaces for policy lifecycle Consulting Skills – Experience in creating business process map for future state architecture, creating WBS for overall conversion strategy, requirement refinement process in multi-vendor engagement. Worked on multiple Business transformation and modernization programs. Conducted multiple Due-Diligence and Assessment projects as part of Transformation roadmaps to evaluate current state maturity, gaps in functionalities and COTs solution features. Requirements Gathering, Elicitation –writing BRDs, FSDs. Conducting JAD sessions and Workshops to capture requirements and working close with Product Owner. Work with the client to define the most optimal future state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests but simultaneously ensuring that client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with product design development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle. Technology Skills - Proficient in technology solution architecture, with a focus on designing innovative and effective solutions. Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security. Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making. Strong understanding of data governance principles and best practices, ensuring data quality and compliance. Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions. Industry certifications (AAPA/LOMA) will be added advantage. Experience on these COTS product is preferrable. FAST ALIP OIPA wmA We expect you to work effectively as a team member and build good relationships with the client. 
You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

10.0 - 19.0 years

8 - 9 Lacs

Thiruvananthapuram

On-site

10 - 19 Years 10 Openings Trivandrum Role description Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required. Outcomes: Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reusing proven solutions. Support the Project Manager in day-to-day project execution and account for the developmental activities of others. Interpret requirements, create optimal architecture, and design solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code using best standards, debug, and test solutions to ensure best-in-class quality. Tune performance of code and align it with the appropriate infrastructure, understanding cost implications of licenses and infrastructure. Create data schemas and models effectively. Develop and manage data storage solutions including relational databases, NoSQL databases, Delta Lakes, and data lakes. Validate results with user representatives, integrating the overall solution. Influence and enhance customer satisfaction and employee engagement within project teams. Measures of Outcomes: Team's adherence to engineering processes and standards; adherence to schedule/timelines; adherence to SLAs where applicable; number of defects post delivery; number of non-compliance issues; reduction of recurrence of known defects; quick turnaround of production bugs; completion of applicable technical/domain certifications; completion of all mandatory training requirements; efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times); average time to detect, respond to, and resolve pipeline failures or data issues; number of data security incidents or compliance breaches. Outputs Expected: Code: Develop data processing code with guidance, ensuring performance and scalability requirements are met. Define coding standards, templates, and checklists. Review code for team and peers. Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents including design documents, architecture documents, infra costing, business requirements, source-target mappings, test cases, and results. Configure: Define and govern the configuration management plan. Ensure compliance from the team. Test: Review/create unit test cases, scenarios, and execution. Review test plans and strategies created by the testing team. Provide clarifications to the testing team. Domain Relevance: Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs. Learn more about the customer domain and identify opportunities to add value. Complete relevant domain certifications.
Manage Project: Support the Project Manager with project inputs. Provide inputs on project plans or sprints as needed. Manage the delivery of modules. Manage Defects: Perform defect root cause analysis (RCA) and mitigation. Identify defect trends and implement proactive measures to improve quality. Estimate: Create and provide input for effort and size estimation and plan resources for projects. Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team. Release: Execute and monitor the release process. Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components, and data models. Interface with Customer: Clarify requirements and provide guidance to the Development Team. Present design options to customers. Conduct product demos. Collaborate closely with customer architects to finalize designs. Manage Team: Set FAST goals and provide feedback. Understand team members' aspirations and provide guidance and opportunities. Ensure team members are upskilled. Engage the team in projects. Proactively identify attrition risks and collaborate with BSE on retention measures. Certifications: Obtain relevant domain and technology certifications. Skill Examples: Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning. Experience in data warehouse design and cost improvements. Apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Communicate and explain design/development aspects to customers. Estimate time and resource requirements for developing/debugging features/components. Participate in RFP responses and solutioning. Mentor team members and guide them in relevant upskilling and certification. Knowledge Examples: Knowledge of various ETL services used by cloud providers including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF. Proficient in SQL for analytics and windowing functions. Understanding of data schemas and models. Familiarity with domain-related data. Knowledge of data warehouse optimization techniques. Understanding of data security concepts. Awareness of patterns, frameworks, and automation practices.
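The skill and knowledge examples above call out PySpark pipelines, batch ETL, and SQL-style windowing functions. As a rough, hypothetical sketch only (the paths and column names below are invented, not taken from the listing), such a pipeline might look like this in PySpark:

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("orders_batch_etl_example").getOrCreate()

# Hypothetical lake paths and columns -- illustrative only, not from the listing
orders = spark.read.parquet("s3a://example-bucket/raw/orders/")

# Keep only the latest record per order_id: the "windowing function" style of logic
# the skill examples refer to
latest_first = Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())
deduped = (
    orders
    .withColumn("rn", F.row_number().over(latest_first))
    .filter(F.col("rn") == 1)
    .drop("rn")
)

# Simple transform + daily aggregate before loading into a curated/warehouse layer
daily_revenue = (
    deduped
    .withColumn("order_date", F.to_date("updated_at"))
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.countDistinct("order_id").alias("order_count"),
    )
)

daily_revenue.write.mode("overwrite").parquet("s3a://example-bucket/curated/daily_revenue/")
```

In practice the final write would usually target the warehouse named in the listing (Snowflake, BigQuery, or a Delta Lake table) rather than plain Parquet.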
Skills: Scala, Python, PySpark. About UST: UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.

Posted 1 week ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. JD for L&A Business Consultant Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following: Proficient in Individual and Group Life Insurance concepts, different types of annuity products, etc. Proficient in different insurance plans - Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP. Solid knowledge of the policy life cycle: Illustrations/Quote/Rating, New Business & Underwriting, Policy Servicing and Administration, Billing & Payment, Claims Processing, Disbursement (Systematic withdrawals, RMD, Surrenders), Regulatory Changes & Taxation. Understanding of business rules for pay-outs. Demonstrated understanding of insurance company operations such as nonforfeiture options, face amount increases/decreases, CVAT or GPT calculations, and dollar cost averaging, and the ability to perform their respective transactions. Understanding of upstream and downstream interfaces for the policy lifecycle. Consulting Skills – Experience in creating business process maps for future state architecture, creating the WBS for the overall conversion strategy, and refining requirements in multi-vendor engagements. Worked on multiple business transformation and modernization programs. Conducted multiple due-diligence and assessment projects as part of transformation roadmaps to evaluate current state maturity, gaps in functionality, and COTS solution features. Requirements gathering and elicitation – writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner. Work with the client to define the optimal future state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests, while simultaneously ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design and development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle. Technology Skills - Proficient in technology solution architecture, with a focus on designing innovative and effective solutions. Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security. Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making. Strong understanding of data governance principles and best practices, ensuring data quality and compliance. Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions. Industry certifications (AAPA/LOMA) will be an added advantage. Experience with these COTS products is preferable: FAST, ALIP, OIPA, wmA. We expect you to work effectively as a team member and build good relationships with the client.
You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

0 years

0 Lacs

Delhi

On-site

Job requisition ID :: 78129 Date: Jun 4, 2025 Location: Delhi Designation: Consultant Entity: Your potential, unleashed. India’s impact on the global economy has increased at an exponential rate and Deloitte presents an opportunity to unleash and realize your potential amongst cutting edge leaders, and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters. The Team Deloitte’s Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning. Your work profile: As an Analyst/Consultant/Senior Consultant in our T&T Team you’ll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations: - Design, develop and deploy solutions using different tools, design principles and conventions. Configure robotics processes and objects using core workflow principles in an efficient way; ensure they are easily maintainable and easy to understand. Understand existing processes and facilitate change requirements as part of a structured change control process. Solve day-to-day issues arising while running robotics processes and provide timely resolutions. Maintain proper documentation for the solutions, test procedures and scenarios during the UAT and Production phases. Coordinate with process owners and business to understand the as-is process and design the automation process flow. Desired Qualifications Good hands-on experience in GCP services including BigQuery, Cloud Storage, Dataflow, Cloud Datapost, Cloud Composer/Airflow, and IAM. Proficient experience with GCP databases: Bigtable, Spanner, Cloud SQL, and AlloyDB. Proficiency in either SQL, Python, Java, or Scala for data processing and scripting. Experience in development and test automation processes through the CI/CD pipeline (Git, Jenkins, SonarQube, Artifactory, Docker containers). Experience in orchestrating data processing tasks using tools like Cloud Composer or Apache Airflow. Strong understanding of data modeling, data warehousing and big data processing concepts. Solid understanding and experience of relational database concepts and technologies such as SQL, MySQL, PostgreSQL or Oracle. Design and implement data migration strategies for various database types (PostgreSQL, Oracle, AlloyDB, etc.). Deep understanding of at least one database type, with the ability to write complex SQL. Experience with NoSQL databases such as MongoDB, Scylla, Cassandra, or DynamoDB is a plus. Optimize data pipelines for performance and cost-efficiency, adhering to GCP best practices. Implement data quality checks, data validation, and monitoring mechanisms to ensure data accuracy and integrity.
Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into technical solutions. Ability to work independently and manage multiple priorities effectively. Expertise in end-to-end DW implementation is preferred. Location and way of working: Base location: Bangalore, Mumbai, Delhi, Pune, Hyderabad. This profile involves occasional travelling to client locations. Hybrid is our default way of working. Each domain has customized the hybrid approach to their unique needs. Your role as an Analyst/Consultant/Senior Consultant: We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society. In addition to living our purpose, Analysts/Consultants/Senior Consultants across our organization must strive to be: Inspiring - Leading with integrity to build inclusion and motivation. Committed to creating purpose - Creating a sense of vision and purpose. Agile - Achieving high-quality results through collaboration and team unity. Skilled at building diverse capability - Developing diverse capabilities for the future. Persuasive / Influencing - Persuading and influencing stakeholders. Collaborating - Partnering to build new solutions. Delivering value - Showing commercial acumen. Committed to expanding business - Leveraging new business opportunities. Analytical Acumen - Leveraging data to recommend impactful approaches and solutions through the power of analysis and visualization. Effective communication – Being able to hold well-structured and well-articulated conversations to achieve win-win possibilities. Engagement Management / Delivery Excellence - Effectively managing engagement(s) to ensure timely and proactive execution as well as course correction for the success of engagement(s). Managing change - Responding to a changing environment with resilience. Managing Quality & Risk - Delivering high-quality results and mitigating risks with utmost integrity and precision. Strategic Thinking & Problem Solving - Applying a strategic mindset to solve business issues and complex problems. Tech Savvy - Leveraging ethical technology practices to deliver high impact for clients and for Deloitte. Empathetic leadership and inclusivity - Creating a safe and thriving environment where everyone is valued for who they are, using empathy to understand others and adapting our behaviours and attitudes to become more inclusive. How you’ll grow Connect for impact Our exceptional team of professionals across the globe are solving some of the world’s most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report. Empower to lead You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership. Inclusion for all At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.
Drive your career At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one size fits all career path, and global, cross-business mobility and up / re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte. Everyone’s welcome… entrust your happiness to us Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you. Interview tips We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.
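Since the Deloitte role above centres on GCP services such as BigQuery, Cloud Storage, and Dataflow, here is a minimal, hypothetical Apache Beam pipeline of the kind that can be submitted to the Dataflow runner; the project, bucket, and table names are placeholders rather than details from the listing:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line):
    # Naive CSV split for illustration; a production pipeline would use a real parser
    order_id, region, amount = line.split(",")
    return {"order_id": order_id, "region": region, "amount": float(amount)}


def run():
    # Project, region, bucket, and table names below are placeholders
    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-gcp-project",
        region="asia-south1",
        temp_location="gs://my-example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://my-example-bucket/orders/*.csv",
                                             skip_header_lines=1)
            | "Parse" >> beam.Map(parse_line)
            | "KeyByRegion" >> beam.Map(lambda r: (r["region"], r["amount"]))
            | "SumPerRegion" >> beam.CombinePerKey(sum)
            | "ToRow" >> beam.Map(lambda kv: {"region": kv[0], "total_amount": kv[1]})
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.region_totals",
                schema="region:STRING,total_amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

Replacing runner="DataflowRunner" with "DirectRunner" runs the same pipeline locally, which is the usual way to test before deploying to Dataflow.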

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Summary The Therapeutic Data Strategy Director (TDSD) bridges science and operations by defining how the clinical data strategy is operationalized across the complete data flow within GCO. The TDSD is responsible for ensuring data regulatory compliance and the availability of End-to-End (E2E) standards, ensuring that instruments and devices are thoroughly discussed, defined, and finalized prior to the database build, and ensuring that the operational impact of any new changes is known, mitigated, and captured in the appropriate knowledge database. In collaboration with the GPT, GCT, and CTT, the TDSD aligns on the fit-for-purpose data package as part of a program/indication-level quality-by-design approach to support data strategy needs in the drug development lifecycle of a molecule or across a therapeutic area (TA) within an assigned unit in Novartis. This role creates and implements strategies for the end-to-end data product, ensuring application of Novartis clinical data standards and defining the clinical data acquisition and data review strategy to support the submission of our clinical programs. The TDSD is responsible for ensuring that delivery and timelines are met with quality, whilst ensuring cost efficiencies and stakeholders’ satisfaction. About The Role Major Accountabilities Creation And Execution Of Operational Data Strategy Collaborate with the Global Program Clinical Head (GPCH) to establish and maintain a data strategy for the clinical data standards of the assigned Therapeutic Area, as well as the design, collection, processing, and transformation of clinical data supporting the needs for reporting and submission. Perform impact assessment of proposed data collection and analysis. Drive capability inputs to the data team’s resource algorithm based on future incoming demands. Act as the matrix data operations leader who is the single focal point for sustained industry-leading cycle time for the data product and ensures compliance with relevant Novartis processes. Ensure the provision of resources with the skillset to develop robust and lean E2E specifications. Lead the full spectrum of standard development and compliance across the portfolio. Consult to drive quality into the study protocol and operational processes. Drive implementation of a lean global data strategy and define minimum data requirements. Ensure the minimum data requirements remain intact, understand the operational impact (e.g., resources and time) of any amendments, and work with clinical development, analytics, and regulatory line functions to understand the scientific, clinical, statistical, and regulatory impacts. Support assessment of opportunities to capitalize on non-traditional options (e.g., historical data, synthetic data, cross-sponsor shared control arms, adaptive designs, pragmatic trials, decentralization, etc.). Work with COPH and the Vendor Program Strategy Director (VPSD) to define the provision of ancillary data, including vendor capabilities. Author the Operational Data Strategy section of the Operational Execution Plan (OEP) (key customers, dataflow, and targets to generate Data-as-a-Product (DaaP), etc.). Establish a “performance-oriented culture” that is driven/supported by analysis of real-time activity and quality metrics. Contribute to the development of the Data Operations organization. Define/contribute to the development of long-term goals and operating policies through his/her leadership role on the management team.
As an extended member of the Data Operations Leadership Team support functional excellence for Data Operations by contributing to the definition of the strategic goals and operating policies, and leading/contributing to strategic initiatives in line with the overall Data Operations strategy. Support the BD&L activities from CDO perspective. End-to-End Ownership Of The Clinical Data Flow Ensures that data is collected and reviewed as efficiently as possible, and that extraneous data is not procured. Drives implementation of a lean global data strategy and defines fit for purpose data quality requirements sufficient to support good decision making and meet regulatory requirements. Collaborates cross-functionally to define quality by design review process to ensure fit for purpose data quality sufficient to support good decision making. Accountable for managing operational strategy around data cleaning and data review at portfolio level. Drives standards and processes to facilitate data right the first time. Act as point of escalation for data specific project management issues and for broader data demands (e.g. changing scope, addition of analysis/reporting events). End-to-End Standards Oversight & Lifecycle Management Responsible for compliance with data requirements and the availability of end-to-end clinical data standards (data collection through analysis) for a program/molecule/indication. Influence and support the design of new clinical data standards as required at the enterprise/ therapeutic area level. Drives identification of needs, adoption and maintenance of data standards. Operational Project Management Develop, communicate, and drive implementation of a global data operationalization strategy to deliver value-adding data; CDS supports and guides the Data Team (as part of the CTT) in ensuring the overall program /OEP strategy is aligned with execution. Establish key customers of Clinical Data and establish approach for future consumption. Works with the business to ensure adherence to timelines, adoption of the data strategy and delivery of the target data product quality. Accountable for managing the strategy of the data cleaning, review, and data related specifications at portfolio and study level. Ensure high quality, timely and efficient Data Operation deliverables for projects and trials partnering with other Data Operations functions within assigned Development Unit or program. Work alongside the Operational Program Lead and Trial Lead to ensure all data related risks and issues are identified and mitigated. Link between business needs and technical development/deployment and technology usage in data operations. Influencer and interlocutor for adoption and compliance with company efficiency process and objectives within data workflow. Assesses / approves changes that impact the data collection strategy. Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients’ lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture Join our Novartis Network: Not the right Novartis role for you? 
Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network Benefits and Rewards: Read our handbook to learn about all the ways we’ll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards

Posted 1 week ago

Apply

4.0 years

0 Lacs

India

Remote


About The Company Teikametrics' AI-powered Marketplace Optimization platform helps sellers and brand owners to maximize their potential on the world's most valuable marketplaces. Founded in 2015, Teikametrics uses proprietary AI technology to maximize profitability in a simple SaaS interface. Teikametrics optimizes more than $8 billion in GMV across thousands of sellers around the world, with brands including Munchkin, mDesign, Clarks, Nutribullet, Conair, Nutrafol, and Solo Stove trusting Teikametrics to unlock the full potential of their selling and advertising on Amazon, Walmart and other marketplaces. Teikametrics continues to grow exponentially, with teams spanning 3+ countries. We are financially strong, continuously meeting or exceeding revenue targets, and we invest heavily in strengthening the foundation of our organization. About The Role Teikametrics is looking for a Senior Software Engineer - Data Engineering with strong computer science fundamentals and a background in data engineering, API integration, or large-scale data processing. This role involves designing, developing, and scaling robust data pipelines to process massive amounts of structured and unstructured data. The candidate will work closely with data scientists, analysts, and product engineers to deliver high-performance, scalable solutions. The architecture and stack evolve continuously as we scale to cater to an ever-growing customer base. Our technology stack includes Databricks, Spark (Scala), Kafka, AWS S3, and other distributed computing tools. How You'll Spend Your Time Design and implement highly scalable, fault-tolerant data pipelines for real-time and batch processing. Develop and optimize end-to-end Databricks Spark pipelines for ingesting, processing, and transforming large volumes of structured and unstructured data. Build and manage ETL (Extract, Transform, Load) processes to integrate data from diverse sources into our data ecosystem. Implement data validation, governance, and quality assurance mechanisms to ensure accuracy, completeness, and reliability. Collaborate with data scientists, ML engineers, and analysts to integrate AI/ML models into production environments, ensuring efficient data pipelines for training, deployment, and monitoring. Work with real-time data streaming solutions such as Kafka, Kinesis, or Flink to process and analyze event-driven data. Improve and optimize performance, scalability, and efficiency of data workflows and storage solutions. Document technical designs, workflows, and best practices to facilitate knowledge sharing and maintain system documentation. Who You Are 4+ years of experience as a professional software/data engineer, with a strong background in building large-scale distributed data processing systems. Experience with AI, machine learning, or data science concepts, including working on ML feature engineering, model training pipelines, or AI-driven data analytics. Hands-on experience with Apache Spark (Scala or Python) and Databricks. Experience with real-time data streaming technologies such as Kafka, Flink, Kinesis, or Dataflow. Proficiency in Java, Scala, or Python for building scalable data engineering solutions. Deep understanding of cloud-based architectures (AWS, GCP, or Azure) and experience with S3, Lambda, EMR, Glue, or Redshift. Experience in writing well-designed, testable, and scalable AI/ML data pipelines that can be efficiently reused and maintained with effective unit and integration testing.
Strong understanding of data warehousing principles and best practices for optimizing large-scale ETL workflows. Experience with ML frameworks such as TensorFlow, PyTorch, or Scikit-learn. Optimize ML feature engineering and model training pipelines for scalability and efficiency. Knowledge of SQL and NoSQL databases for structured and unstructured data storage. Passion for collaborative development, continuous learning, and mentoring junior engineers. What Can Help You Stand Out Exposure to MLOps or Feature Stores for managing machine learning model data. Experience with data governance, compliance, and security best practices. Experience working in a fast-paced startup environment. WE'VE GOT YOU COVERED Every Teikametrics employee is eligible for company equity Remote Work – flexibility to work from home or from our offices + remote working allowance Broadband reimbursement Group Medical Insurance – Coverage of INR 7,50,000 per annum for a family Crèche benefit Training and development allowance Press Reference About Teika Teikametrics’ Marketplace Optimization Platform, Flywheel 2.0, Adds AI-Powered Automation to Maximize Advertising Performance Across Marketplaces The job description is representative of typical duties and responsibilities for the position and is not all-inclusive. Other duties and responsibilities may be assigned in accordance with business needs. We are proud to be an equal opportunity employer. A background check will be conducted after a conditional offer of employment is extended.
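To make the streaming side of the Teikametrics role concrete, below is a minimal sketch, assuming a hypothetical Kafka topic, broker address, and event schema, of a PySpark Structured Streaming job that aggregates spend per campaign; none of these names come from the listing, and the Kafka connector package (spark-sql-kafka) is assumed to be available on the cluster.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("ad_events_stream_example").getOrCreate()

# Hypothetical event schema, broker, and topic -- illustrative only
event_schema = StructType([
    StructField("campaign_id", StringType()),
    StructField("event_type", StringType()),
    StructField("spend", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "ad-events")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers raw bytes; decode the value and parse the JSON payload
events = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Tumbling 5-minute window of spend per campaign, with a watermark for late events
spend_per_campaign = (
    events
    .withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "5 minutes"), "campaign_id")
    .agg(F.sum("spend").alias("spend"))
)

query = (
    spend_per_campaign.writeStream
    .outputMode("update")
    .format("console")  # a real pipeline would write to Delta or a warehouse table
    .option("truncate", "false")
    .start()
)
query.awaitTermination()
```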

Posted 1 week ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. JD for L&A Business Consultant Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following: Proficient in Individual and Group Life Insurance concepts, different types of annuity products, etc. Proficient in different insurance plans - Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP. Solid knowledge of the policy life cycle: Illustrations/Quote/Rating, New Business & Underwriting, Policy Servicing and Administration, Billing & Payment, Claims Processing, Disbursement (Systematic withdrawals, RMD, Surrenders), Regulatory Changes & Taxation. Understanding of business rules for pay-outs. Demonstrated understanding of insurance company operations such as nonforfeiture options, face amount increases/decreases, CVAT or GPT calculations, and dollar cost averaging, and the ability to perform their respective transactions. Understanding of upstream and downstream interfaces for the policy lifecycle. Consulting Skills – Experience in creating business process maps for future state architecture, creating the WBS for the overall conversion strategy, and refining requirements in multi-vendor engagements. Worked on multiple business transformation and modernization programs. Conducted multiple due-diligence and assessment projects as part of transformation roadmaps to evaluate current state maturity, gaps in functionality, and COTS solution features. Requirements gathering and elicitation – writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner. Work with the client to define the optimal future state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests, while simultaneously ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design and development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle. Technology Skills - Proficient in technology solution architecture, with a focus on designing innovative and effective solutions. Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security. Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making. Strong understanding of data governance principles and best practices, ensuring data quality and compliance. Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions. Industry certifications (AAPA/LOMA) will be an added advantage. Experience with these COTS products is preferable: FAST, ALIP, OIPA, wmA. We expect you to work effectively as a team member and build good relationships with the client.
You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

0 years

0 Lacs

Kanayannur, Kerala, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. JD for L&A Business Consultant Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following: Proficient in Individual and Group Life Insurance concepts, different types of annuity products, etc. Proficient in different insurance plans - Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP. Solid knowledge of the policy life cycle: Illustrations/Quote/Rating, New Business & Underwriting, Policy Servicing and Administration, Billing & Payment, Claims Processing, Disbursement (Systematic withdrawals, RMD, Surrenders), Regulatory Changes & Taxation. Understanding of business rules for pay-outs. Demonstrated understanding of insurance company operations such as nonforfeiture options, face amount increases/decreases, CVAT or GPT calculations, and dollar cost averaging, and the ability to perform their respective transactions. Understanding of upstream and downstream interfaces for the policy lifecycle. Consulting Skills – Experience in creating business process maps for future state architecture, creating the WBS for the overall conversion strategy, and refining requirements in multi-vendor engagements. Worked on multiple business transformation and modernization programs. Conducted multiple due-diligence and assessment projects as part of transformation roadmaps to evaluate current state maturity, gaps in functionality, and COTS solution features. Requirements gathering and elicitation – writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner. Work with the client to define the optimal future state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests, while simultaneously ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design and development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle. Technology Skills - Proficient in technology solution architecture, with a focus on designing innovative and effective solutions. Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security. Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making. Strong understanding of data governance principles and best practices, ensuring data quality and compliance. Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions. Industry certifications (AAPA/LOMA) will be an added advantage. Experience with these COTS products is preferable: FAST, ALIP, OIPA, wmA. We expect you to work effectively as a team member and build good relationships with the client.
You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

Exploring Dataflow Jobs in India

The dataflow job market in India is currently experiencing a surge in demand for skilled professionals. With the increasing reliance on data-driven decision-making in various industries, the need for individuals proficient in managing and analyzing dataflow is on the rise. This article aims to provide job seekers with valuable insights into the dataflow job landscape in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Delhi

These cities are known for their thriving tech ecosystems and are home to numerous companies actively hiring for dataflow roles.

Average Salary Range

The average salary range for dataflow professionals in India varies based on experience levels. Entry-level positions can expect to earn between INR 4-6 lakhs per annum, while experienced professionals can command salaries upwards of INR 12-15 lakhs per annum.

Career Path

In the dataflow domain, a typical career path may involve starting as a Junior Data Analyst or Data Engineer, progressing to roles such as Senior Data Scientist or Data Architect, and eventually reaching positions like Tech Lead or Data Science Manager.

Related Skills

In addition to expertise in dataflow tools and technologies, dataflow professionals are often expected to have proficiency in programming languages such as Python or R, knowledge of databases like SQL, and familiarity with data visualization tools like Tableau or Power BI.

Interview Questions

  • What is dataflow and how is it different from data streaming? (basic)
  • Explain the difference between batch processing and real-time processing. (medium)
  • How do you handle missing or null values in a dataset? (basic)
  • Can you explain the concept of data lineage? (medium)
  • What is the importance of data quality in dataflow processes? (basic)
  • How do you optimize dataflow pipelines for performance? (medium)
  • Describe a time when you had to troubleshoot a dataflow issue. (medium)
  • What are some common challenges faced in dataflow projects? (medium)
  • How do you ensure data security and compliance in dataflow processes? (medium)
  • What are the key components of a dataflow architecture? (medium)
  • Explain the concept of data partitioning in dataflow. (advanced)
  • How would you handle a sudden increase in data volume in a dataflow pipeline? (advanced)
  • What role does data governance play in dataflow processes? (medium)
  • Can you discuss the advantages and disadvantages of using cloud-based dataflow solutions? (medium)
  • How do you stay updated with the latest trends and technologies in dataflow? (basic)
  • What is the significance of metadata in dataflow management? (medium)
  • Walk us through a dataflow project you have worked on from start to finish. (medium)
  • How do you ensure data quality and consistency across different data sources in a dataflow pipeline? (medium)
  • What are some best practices for monitoring and troubleshooting dataflow pipelines? (medium)
  • How do you handle data transformations and aggregations in a dataflow process? (basic)
  • What are the key performance indicators you would track in a dataflow project? (medium)
  • How do you collaborate with cross-functional teams in a dataflow project? (basic)
  • Can you explain the concept of data replication in dataflow management? (advanced)
  • How do you approach data modeling in a dataflow project? (medium)
  • Describe a challenging dataflow problem you encountered and how you resolved it. (advanced)
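For hands-on questions such as handling missing or null values and performing transformations and aggregations, interviewers often expect a short, concrete snippet. A minimal PySpark illustration (with made-up sample data) might look like this:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("interview_prep_example").getOrCreate()

# Made-up sample data, just to keep the snippet self-contained
df = spark.createDataFrame(
    [("A", 10.0), ("A", None), ("B", 5.0), (None, 7.0)],
    ["region", "amount"],
)

# Handling missing/null values: drop rows missing the key, impute the measure
clean = (
    df
    .dropna(subset=["region"])        # rows without a region are unusable here
    .fillna({"amount": 0.0})          # simple imputation; mean/median is also common
)

# A basic transformation + aggregation of the kind many dataflow questions probe
summary = clean.groupBy("region").agg(
    F.sum("amount").alias("total_amount"),
    F.count("*").alias("row_count"),
)
summary.show()
```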

Closing Remark

As you navigate the dataflow job market in India, remember to showcase your skills and experiences confidently during interviews. Stay updated with the latest trends in dataflow and continuously upskill to stand out in a competitive job market. Best of luck in your job search journey!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies