811 Teradata Jobs - Page 4

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

12.0 years

5 - 6 Lacs

Indore

On-site

Indore, Madhya Pradesh, India

Qualification:
- BTech degree in computer science, engineering, or a related field of study, or 12+ years of related work experience
- 7+ years of design and implementation experience with large-scale, data-centric distributed applications
- Professional experience architecting and operating cloud-based solutions, with a good understanding of core disciplines such as compute, networking, storage, security, and databases
- Good understanding of data engineering concepts such as storage, governance, cataloging, data quality, and data modeling
- Good understanding of architecture patterns such as data lake, data lakehouse, and data mesh
- Good understanding of data warehousing concepts and hands-on experience with tools like Hive, Redshift, Snowflake, and Teradata
- Experience migrating or transforming legacy customer solutions to the cloud
- Experience working with services such as AWS EMR, Glue, DMS, Kinesis, RDS, Redshift, DynamoDB, DocumentDB, SNS, SQS, Lambda, EKS, and DataZone
- Thorough understanding of Big Data ecosystem technologies such as Hadoop, Spark, Hive, and HBase, plus other competent tools and technologies
- Understanding of designing analytical solutions leveraging AWS cognitive services such as Textract, Comprehend, and Rekognition in combination with SageMaker is good to have
- Experience with modern development workflows, such as Git, continuous integration/continuous deployment pipelines, static code analysis tooling, and infrastructure as code
- Experience with a programming or scripting language: Python, Java, or Scala
- AWS Professional/Specialty certification or relevant cloud expertise

Skills Required: AWS, Big Data, Spark, Technical Architecture

Role:
- Drive innovation within the Data Engineering domain by designing reusable and reliable accelerators, blueprints, and libraries
- Lead a technology team, inculcate an innovative mindset, and enable fast-paced deliveries
- Adapt to new technologies, learn quickly, and manage high ambiguity
- Work with business stakeholders; attend and drive architectural, design, and status calls with multiple stakeholders
- Exhibit good presentation skills, with a high degree of comfort speaking with executives, IT management, and developers
- Drive technology/software sales or pre-sales consulting discussions
- Ensure end-to-end ownership of all assigned tasks
- Ensure high-quality software development with complete documentation and traceability
- Fulfil organizational responsibilities (sharing knowledge and experience with other teams/groups)
- Conduct technical trainings/sessions; write whitepapers, case studies, blogs, etc.

Experience: 10 to 18 years
Job Reference Number: 12895
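As a quick illustration of the data-lake work this listing describes, here is a minimal PySpark sketch that reads raw events from S3, applies a simple cleanup, and writes a partitioned Parquet table. The bucket, paths, and column names are hypothetical placeholders, not from any actual specification.

```python
# Minimal PySpark sketch: raw S3 events -> cleaned, partitioned Parquet table.
# Bucket, paths, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-to-lake").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/events/")  # hypothetical source

cleaned = (
    raw.dropDuplicates(["event_id"])                      # basic de-duplication
       .withColumn("event_date", F.to_date("event_ts"))   # derive partition column
       .filter(F.col("event_type").isNotNull())           # drop malformed rows
)

(cleaned.write
        .mode("append")
        .partitionBy("event_date")                        # partition for pruning
        .parquet("s3://example-bucket/lake/events/"))     # hypothetical target

spark.stop()
```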

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

The Data Engineer will work closely with clients and the eCS Biometrics team to optimize the elluminate® platform for end-to-end solutions to aggregate, transform, access, and report on clinical data throughout the life cycle of a clinical trial. This includes study design in elluminate®, collaboration on specifications, and configuration of the various modules, including Data Central, Clinical Data Analytics and Trial Operational Analytics, Risk-Based Quality Management (RBQM), Statistical Computing Environment (SCE), and Operational Insights. The Data Engineer will be involved in standard ETL activities as well as programming custom listings, visualizations, and analytics tools using Mapper and Qlik. The position involves a high level of quality control, adherence to standard operating procedures and work instructions, and a constant drive towards automation and process improvement.

Key Tasks & Responsibilities
- Design, develop, test, and deploy highly efficient code supporting SDTM, custom reports, and visualizations using tools such as MS SQL, elluminate® Mapper, and Qlik (see the mapping sketch after this listing)
- Configure ETL processes to support the aggregation and standardization of clinical data from various sources, including EDC systems, SAS, and central laboratory vendors
- Work with analytics developers, other team members, and clients to review business requirements and translate them into database objects and visualizations
- Manage multiple timelines and deliverables (for single or multiple clients) and manage client communications as assigned
- Provide diagnostic support and fix defects as needed
- Ensure compliance with eClinical Solutions/industry quality standards, regulations, guidelines, and procedures
- Other duties as assigned

Candidate's Profile

Education & Experience
- 3+ years of professional experience preferred
- Bachelor's degree or equivalent experience preferred
- Experience with database/warehouse architecture, design, and development preferred
- Knowledge of various data platforms and warehouses, including SQL Server, DB2, Teradata, AWS, Azure, Snowflake, etc.
- Understanding of cloud/hybrid data architecture concepts is a plus
- Knowledge of clinical trial data is a plus (CDISC ODM, SDTM, or ADaM standards)
- Experience in the pharmaceutical/biotechnology/life science industry is a plus

Professional Skills
- Critical thinking, problem solving, and strong initiative
- Communication and task management skills while working with technical and non-technical teams (both internal to eCS and clients)
- Team oriented, with strong collaboration, prioritization, and adaptability skills
- Excellent knowledge of English; verbal and written communication skills, with the ability to interact with users and clients and provide solutions
- Excited to learn new tools and product modules and to adapt to changing technology and requirements
- Experience in the life sciences industry or a CRO/clinical-trial regulated environment preferred

Technical Skills
- Proficient in SQL, T-SQL, and PL/SQL programming
- Experience with Microsoft Office applications, specifically MS Project and MS Excel
- Familiarity with multiple database platforms: Oracle, SQL Server, Teradata, DB2
- Familiarity with data reporting tools: Qlik Sense, QlikView, Spotfire, Tableau, JReview, Business Objects, Cognos, MicroStrategy, IBM DataStage, Informatica, Spark, or related
- Familiarity with other languages and concepts: .NET, C#, Python, R, Java, HTML, SSRS, AWS, Azure, Spark, REST APIs, Big Data, ETL, data pipelines, data modelling, data analytics, BI, data warehouse, data lake, or related
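To make the clinical-data standardization work concrete, below is a small, hypothetical pandas sketch of a typical mapping step: renaming raw EDC columns to SDTM-style variable names and adding required identifiers. All column names and the study ID are illustrative placeholders; real mappings come from a study-specific specification, not from elluminate® or any actual standard document.

```python
# Hypothetical sketch: map a raw EDC demographics export to SDTM-style DM variables.
# All column names and values are illustrative; real mappings come from a
# study-specific specification.
import pandas as pd

raw = pd.DataFrame({
    "subject": ["001", "002"],
    "birth_date": ["1980-05-01", "1975-11-20"],
    "sex_code": ["M", "F"],
})

COLUMN_MAP = {          # raw EDC name -> SDTM-style variable
    "subject": "USUBJID",
    "birth_date": "BRTHDTC",
    "sex_code": "SEX",
}

dm = raw.rename(columns=COLUMN_MAP)
dm.insert(0, "STUDYID", "STUDY-001")   # constant study identifier
dm.insert(1, "DOMAIN", "DM")           # SDTM domain code

print(dm)
```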

Posted 1 week ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

New Delhi, Chennai, Bengaluru

Hybrid

Your day at NTT DATA
We are seeking an experienced Data Engineer to join our team in delivering cutting-edge Generative AI (GenAI) solutions to clients. The successful candidate will be responsible for designing, developing, and deploying data pipelines and architectures that support the training, fine-tuning, and deployment of LLMs for various industries. This role requires strong technical expertise in data engineering, problem-solving skills, and the ability to work effectively with clients and internal teams.

What you'll be doing

Key Responsibilities:
- Design, develop, and manage data pipelines and architectures to support GenAI model training, fine-tuning, and deployment
- Data ingestion and integration: develop data ingestion frameworks to collect data from various sources, then transform and integrate it into a unified data platform for GenAI model training and deployment
- GenAI model integration: collaborate with data scientists to integrate GenAI models into production-ready applications, ensuring seamless model deployment, monitoring, and maintenance
- Cloud infrastructure management: design, implement, and manage cloud-based data infrastructure (e.g., AWS, GCP, Azure) to support large-scale GenAI workloads, ensuring cost-effectiveness, security, and compliance
- Write scalable, readable, and maintainable code using object-oriented programming concepts in languages like Python, and utilize libraries like Hugging Face Transformers, PyTorch, or TensorFlow
- Performance optimization: optimize data pipelines, GenAI model performance, and infrastructure for scalability, efficiency, and cost-effectiveness
- Data security and compliance: ensure data security, privacy, and compliance with regulatory requirements (e.g., GDPR, HIPAA) across data pipelines and GenAI applications
- Client collaboration: collaborate with clients to understand their GenAI needs, design solutions, and deliver high-quality data engineering services
- Innovation and R&D: stay up to date with the latest GenAI trends, technologies, and innovations, applying research and development skills to improve data engineering services
- Knowledge sharing: share knowledge, best practices, and expertise with team members, contributing to the growth and development of the team

Requirements:
- Bachelor's degree in computer science, engineering, or related fields (master's recommended)
- Experience with vector databases (e.g., Pinecone, Weaviate, Faiss, Annoy) for efficient similarity search and storage of dense vectors in GenAI applications (see the sketch after this listing)
- 5+ years of experience in data engineering, with a strong emphasis on cloud environments (AWS, GCP, Azure, or cloud-native platforms)
- Proficiency in programming languages like SQL, Python, and PySpark
- Strong data architecture, data modeling, and data governance skills
- Experience with big data platforms (Hadoop, Databricks, Hive, Kafka, Apache Iceberg), data warehouses (Teradata, Snowflake, BigQuery), and lakehouses (Delta Lake, Apache Hudi)
- Knowledge of DevOps practices, including Git workflows and CI/CD pipelines (Azure DevOps, Jenkins, GitHub Actions)
- Experience with GenAI frameworks and tools (e.g., TensorFlow, PyTorch, Keras)

Nice to have:
- Experience with containerization and orchestration tools like Docker and Kubernetes
- Experience integrating vector databases and implementing similarity search techniques, with a focus on GraphRAG
- Familiarity with API gateway and service mesh architectures
- Experience with low-latency/streaming, batch, and micro-batch processing
- Familiarity with Linux-based operating systems and REST APIs
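As a minimal illustration of the vector-database requirement above, here is a self-contained Faiss sketch that indexes a batch of embedding vectors and runs a k-nearest-neighbour similarity search. The vectors here are random placeholders; in a GenAI pipeline they would come from an embedding model.

```python
# Minimal Faiss sketch: index dense vectors and run a similarity search.
# Embeddings are random placeholders standing in for model-generated vectors.
import faiss
import numpy as np

d = 128                                            # embedding dimensionality (arbitrary)
rng = np.random.default_rng(0)
corpus = rng.random((1000, d), dtype=np.float32)   # 1,000 "document" vectors

index = faiss.IndexFlatL2(d)                       # exact L2 (brute-force) index
index.add(corpus)                                  # store the corpus vectors

query = rng.random((1, d), dtype=np.float32)
distances, ids = index.search(query, 5)            # 5 nearest neighbours

print("nearest ids:", ids[0])
print("distances:", distances[0])
```

An `IndexFlatL2` scans every vector exactly; production systems typically swap in an approximate index (e.g., IVF or HNSW variants) once the corpus grows, trading a little recall for large speedups.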

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Our Company
Teradata empowers companies to achieve high-impact business outcomes through analytics. With a powerful combination of industry expertise and leading hybrid cloud technologies for data warehousing and big data analytics, Teradata unleashes the potential of great companies. Partnering with top companies around the world, Teradata helps improve customer experience, mitigate risk, drive product innovation, achieve operational excellence, transform finance, and optimize assets. Teradata is recognized by media and industry analysts as a future-focused company for its technological excellence, sustainability, ethics, and business value. The Teradata culture isn't just about one kind of person. So many individuals make up who we are, making us that much more unique. It's what sets apart the dynamic, diverse, and collaborative environment that is Teradata. But even as individuals, there's one thing that we all share: our united goal of making Teradata and our people the best we can be.

Who You'll Work With
Teradata Labs is where cutting-edge innovations in data management turn into business value. Our outstanding team of database architects and software engineers works together to understand and advance emerging technologies to produce the next wave of big data analytic solutions. Teradata Database is the core of Teradata Massively Parallel Processing (MPP) systems that run on-premises and in hybrid clouds to manage and optimize sophisticated workloads. The heart of Teradata Database is its cloud-based, best-in-class query optimization engine. We work on query optimization techniques in database and analytics engines, machine learning algorithms, scalability and elasticity issues in the cloud, and many other exciting challenges related to performance, usability, accessibility, and integration.

What You'll Do
The Database Query Optimization group at Teradata Labs has an opening for a Staff Software Engineer. In this role, you are expected to contribute to the design, development, and testing of new enhancements and advanced features for the Teradata Vantage Core Platform.
- Be responsible for all phases of the agile software development life cycle, from software design through customer support
- Research and establish technical direction for complex feature development, and perform functional and performance problem analysis
- As needed, perform competitive analysis of competing database management systems and data integration solutions, and provide recommendations on Teradata offering changes to close competitive gaps and enhance competitive advantages
- Design, implement, validate, and test new database and novel query optimization features in an agile fashion, and perform functional and performance analysis of code defects and correction of the defects
- Contribute to the delivery and continuous support of robust, resilient, and quality database products
- Lead and establish technical direction for a group of software engineers during feature development
- Help the feature manager with technical aspects of features and projects, including planning, tracking, and providing status on large projects

What Makes You a Qualified Candidate
- Bachelor's degree in Computer Science (B.Tech) or a related discipline with at least ten years of related research or industry experience, or a Master's degree in Computer Science (M.Tech/MCA) or a related discipline with at least eight years of related research or industry experience, or a Ph.D. in Computer Science or a related discipline with at least five years of related research or industry experience
- Technical leadership in composing very complex and visionary ideas in cloud-based data management, specifically query processing and optimization

What You'll Bring
- Familiarity with various database technologies
- Deep understanding of Amazon Web Services (AWS) / public cloud technologies and operations
- Demonstrated design skills for large-scale, elastic, and highly available cloud database services or distributed systems
- Top-notch programming skills in C++, Java, Python, R, and SQL
- Computer science fundamentals in object-oriented design, design patterns, and test-driven development
- System development experience
- Experience debugging complex software in a parallel processing environment
- Passionate, self-motivated, risk taker, proactive, initiative taker, good communicator (written and verbal), creative, and team-oriented
- Experience using agile software development methods and tools

Why We Think You'll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
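As a loose, textbook-style illustration of the query-optimization work this group describes, the sketch below implements a tiny Selinger-style dynamic program that picks the cheapest left-deep join order from made-up table statistics. This is the classic academic technique in miniature, not Teradata's optimizer; all cardinalities, selectivities, and the cost model are illustrative assumptions.

```python
# Selinger-style dynamic programming over join orders (miniature, illustrative).
# Cardinalities, selectivities, and the cost model are made-up teaching values.
from itertools import combinations

card = {"orders": 1_000_000, "customers": 50_000, "nation": 25}  # row counts
sel = {                      # join selectivity per joinable table pair
    frozenset({"orders", "customers"}): 1 / 50_000,
    frozenset({"customers", "nation"}): 1 / 25,
}

tables = list(card)
# best[subset] = (cumulative_cost, estimated_output_rows, plan_description)
best = {frozenset({t}): (0.0, float(card[t]), t) for t in tables}

for size in range(2, len(tables) + 1):
    for subset in map(frozenset, combinations(tables, size)):
        for right in subset:                 # table joined in last
            left = subset - {right}
            lcost, lrows, lplan = best[left]
            s = 1.0                          # independence assumption
            for t in left:
                s *= sel.get(frozenset({t, right}), 1.0)
            out = lrows * card[right] * s
            cost = lcost + lrows + card[right] + out  # toy cost model
            if subset not in best or cost < best[subset][0]:
                best[subset] = (cost, out, f"({lplan} JOIN {right})")

cost, rows, plan = best[frozenset(tables)]
print(f"best plan: {plan}  est. cost={cost:,.0f}  est. rows={rows:,.0f}")
```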

Posted 1 week ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description
An ETL Tester (4+ years, must) is responsible for testing and validating the accuracy and completeness of data being extracted, transformed, and loaded (ETL) from various sources into target systems, which can be in the cloud or on premises. They work closely with ETL developers, data engineers, data analysts, and other stakeholders to ensure the quality of data and the reliability of ETL processes, and they must understand cloud architecture and design test strategies for data moving in and out of cloud systems.

Roles and Responsibilities
- Strong in data warehouse testing: ETL and BI
- Strong database knowledge: Oracle, SQL Server, Teradata, or Snowflake
- Strong SQL skills, with experience writing complex data validation SQL queries
- Experience working in an Agile environment
- Experience creating test strategies, release-level test plans, and test cases
- Develop and maintain test data for ETL testing
- Design and execute test cases for ETL processes and data integration
- Good knowledge of Rally, Jira, and HP ALM
- Experience in automation testing and data validation using Python (see the sketch after this listing)
- Document test results and communicate with stakeholders on the status of ETL testing

Skills: Rally, Agile environment, automation testing, data validation, Jira, ETL and BI, HP ALM, ETL testing, test strategy, data warehouse, data integration testing, test case creation, Python, Oracle/SQL Server/Teradata/Snowflake, SQL, data warehouse testing, database knowledge, test data maintenance
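To ground the "data validation using Python" requirement, here is a minimal, self-contained sketch that reconciles row counts and a column aggregate between a source and a target table. SQLite stands in for warehouses such as Oracle, Teradata, or Snowflake, and the table and column names are hypothetical.

```python
# Minimal ETL validation sketch: reconcile row counts and column aggregates
# between source and target tables. SQLite stands in for a real warehouse;
# table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5), (3, 5.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.5), (3, 5.0);
""")

def check(name, query_src, query_tgt):
    """Run the same aggregate against source and target and compare."""
    src = conn.execute(query_src).fetchone()[0]
    tgt = conn.execute(query_tgt).fetchone()[0]
    status = "PASS" if src == tgt else "FAIL"
    print(f"{status}: {name} (source={src}, target={tgt})")

check("row count",
      "SELECT COUNT(*) FROM src_orders",
      "SELECT COUNT(*) FROM tgt_orders")
check("amount total",
      "SELECT ROUND(SUM(amount), 2) FROM src_orders",
      "SELECT ROUND(SUM(amount), 2) FROM tgt_orders")
```

In practice the same pattern runs against the real source and target connections, and checks like these are collected into an automated suite rather than printed.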

Posted 1 week ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description
An ETL Tester (4+ years, must) is responsible for testing and validating the accuracy and completeness of data being extracted, transformed, and loaded (ETL) from various sources into target systems, which can be in the cloud or on premises. They work closely with ETL developers, data engineers, data analysts, and other stakeholders to ensure the quality of data and the reliability of ETL processes, and they must understand cloud architecture and design test strategies for data moving in and out of cloud systems.

Roles and Responsibilities
- Strong in data warehouse testing: ETL and BI
- Strong database knowledge: Oracle, SQL Server, Teradata, or Snowflake
- Strong SQL skills, with experience writing complex data validation SQL queries
- Experience working in an Agile environment
- Experience creating test strategies, release-level test plans, and test cases
- Develop and maintain test data for ETL testing
- Design and execute test cases for ETL processes and data integration
- Good knowledge of Rally, Jira, and HP ALM
- Experience in automation testing and data validation using Python
- Document test results and communicate with stakeholders on the status of ETL testing

Skills: Rally, Agile environment, automation testing, data validation, Jira, ETL and BI, HP ALM, ETL testing, test strategy, data warehouse, data integration testing, test case creation, Python, Oracle/SQL Server/Teradata/Snowflake, SQL, data warehouse testing, database knowledge, test data maintenance

Posted 1 week ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

On-site

Job Description
An ETL Tester (4+ years, must) is responsible for testing and validating the accuracy and completeness of data being extracted, transformed, and loaded (ETL) from various sources into target systems, which can be in the cloud or on premises. They work closely with ETL developers, data engineers, data analysts, and other stakeholders to ensure the quality of data and the reliability of ETL processes, and they must understand cloud architecture and design test strategies for data moving in and out of cloud systems.

Roles and Responsibilities
- Strong in data warehouse testing: ETL and BI
- Strong database knowledge: Oracle, SQL Server, Teradata, or Snowflake
- Strong SQL skills, with experience writing complex data validation SQL queries
- Experience working in an Agile environment
- Experience creating test strategies, release-level test plans, and test cases
- Develop and maintain test data for ETL testing
- Design and execute test cases for ETL processes and data integration
- Good knowledge of Rally, Jira, and HP ALM
- Experience in automation testing and data validation using Python
- Document test results and communicate with stakeholders on the status of ETL testing

Skills: Rally, Agile environment, automation testing, data validation, Jira, ETL and BI, HP ALM, ETL testing, test strategy, data warehouse, data integration testing, test case creation, Python, Oracle/SQL Server/Teradata/Snowflake, SQL, data warehouse testing, database knowledge, test data maintenance

Posted 1 week ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description
An ETL Tester (4+ years, must) is responsible for testing and validating the accuracy and completeness of data being extracted, transformed, and loaded (ETL) from various sources into target systems, which can be in the cloud or on premises. They work closely with ETL developers, data engineers, data analysts, and other stakeholders to ensure the quality of data and the reliability of ETL processes, and they must understand cloud architecture and design test strategies for data moving in and out of cloud systems.

Roles and Responsibilities
- Strong in data warehouse testing: ETL and BI
- Strong database knowledge: Oracle, SQL Server, Teradata, or Snowflake
- Strong SQL skills, with experience writing complex data validation SQL queries
- Experience working in an Agile environment
- Experience creating test strategies, release-level test plans, and test cases
- Develop and maintain test data for ETL testing
- Design and execute test cases for ETL processes and data integration
- Good knowledge of Rally, Jira, and HP ALM
- Experience in automation testing and data validation using Python
- Document test results and communicate with stakeholders on the status of ETL testing

Skills: Rally, Agile environment, automation testing, data validation, Jira, ETL and BI, HP ALM, ETL testing, test strategy, data warehouse, data integration testing, test case creation, Python, Oracle/SQL Server/Teradata/Snowflake, SQL, data warehouse testing, database knowledge, test data maintenance

Posted 1 week ago

Apply

4.0 - 8.0 years

5 - 17 Lacs

Noida, Uttar Pradesh, India

On-site

Position Title: Data Architect / Solution Architect
Location: Pan India

This position description should represent your role and responsibilities at the time of appointment; however, due to the dynamic nature of our business, your job title, key tasks, and responsibilities are likely to evolve over time. The flexibility to adapt to any changes should be considered a key requirement of working at TPG Telecom.

Role Purpose & Environment
In this role you will work hand in hand with various technology and business stakeholders to design and build TPG's modern data platform in the cloud and manage the legacy applications. You will provide strategic direction and leadership, guiding architecture and implementation initiatives by leveraging your knowledge and experience in the area. The role also extends into the consumption side of data and will allow you to deliver business intelligence capabilities (including advanced analytics) and strategies for information delivery and data exploration to support business objectives and requirements. We are seeking someone with a passion for understanding and leveraging data, and with the attitude and behaviour to deliver on commitments and take ownership of data products when required.

Key Responsibilities
- Define and design the overall data architecture, strategy, and data capabilities roadmap, consistent with our technology direction
- Define and design the data platforms, tools, and governing processes
- Create, maintain, and communicate go-forward strategies for business intelligence capabilities and tools
- Be responsible and accountable for producing the data solution and data product architecture designs, ensuring they are submitted and progress via the prescribed governance process through to approval (ARB) in a timely manner, aligned with prescribed project timelines
- Define and review data solutions for reusability, scalability, synergy opportunities, and alignment with defined best practices and guidelines
- Create and evolve the data technology roadmap to align with continuously evolving business needs
- Help define and improve best practices, guidelines, and integration with other enterprise solutions
- Participate in planning, dependency identification, and management, as well as estimation with project managers
- Lead work-breakdown identification and workshops, utilising architecture designs as input
- Demonstrate a strong grasp of architecture techniques and the ability to work effectively with senior business stakeholders and initiative owners
- Act as a technology advisor on data to business leaders and strategic leaders on technology direction

Key Experience, Skills, and Qualifications

Domain Expertise
- 7+ years of professional experience in a data architecture or data engineering role, demonstrating a high degree of proficiency in designing and developing complex, high-quality data solutions according to our architecture governance policies and guidelines
- Strong experience in developing and maintaining data warehouses (e.g., Redshift, Teradata)
- Able to work independently and develop the solution architecture according to the business and compliance requirements
- Strong data warehouse development experience using different ETL tools (e.g., SAS DI, Glue, dbt)
- Experience with data streaming platforms (e.g., Kafka, Kinesis)
- Familiarity with different operational orchestration platforms (e.g., Airflow, LSF scheduler); a minimal Airflow sketch follows this listing
- Experience with data catalogue and data governance tools
- Understanding of CLDM, star schema, data mesh, and data product concepts
- Exposure to machine learning, reporting, data sharing, and data-intensive, application-oriented use cases
- Extensive experience consulting with business stakeholders and other user groups to deliver both strategic and tactical information management solutions
- Experience working within matrix structures, with a demonstrated ability to broker outcomes effectively and collaboratively with colleagues and peers
- Experience with different delivery methodologies (e.g., Waterfall, Agile)
- Telecommunications industry experience
- Bachelor's degree in computer science, computer programming, or a related field preferred

Individual Skills, Mindset & Behaviours
- Strong communication skills, with the ability to communicate complex technical concepts in a digestible way
- Ability to effortlessly switch gears from a summary view for leadership to hands-on discussion with practitioners
- Assertive, with the confidence to be the voice of authority on what is best for the team
- High-energy, passionate outlook on the role, with the ability to influence those around her/him
- Ability to build a sense of trust and rapport that creates a comfortable, respectful, and effective workplace
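Since the listing calls out orchestration platforms such as Airflow, here is a minimal Airflow 2.x-style DAG sketch with a daily extract-then-load dependency. The DAG id, task names, and callables are hypothetical placeholders, and the `schedule` keyword assumes a recent 2.x release (older versions use `schedule_interval`).

```python
# Minimal Airflow 2.x DAG sketch: a daily extract -> load dependency.
# DAG id, task names, and callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling data from a source system")   # placeholder work

def load():
    print("loading data into the warehouse")     # placeholder work

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",           # run once per day
    catchup=False,               # don't backfill missed runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task    # load runs only after extract succeeds
```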

Posted 1 week ago

Apply

4.0 - 8.0 years

5 - 17 Lacs

Thane, Maharashtra, India

On-site

Position Title: Data Architect / Solution Architect
Location: Pan India

This position description should represent your role and responsibilities at the time of appointment; however, due to the dynamic nature of our business, your job title, key tasks, and responsibilities are likely to evolve over time. The flexibility to adapt to any changes should be considered a key requirement of working at TPG Telecom.

Role Purpose & Environment
In this role you will work hand in hand with various technology and business stakeholders to design and build TPG's modern data platform in the cloud and manage the legacy applications. You will provide strategic direction and leadership, guiding architecture and implementation initiatives by leveraging your knowledge and experience in the area. The role also extends into the consumption side of data and will allow you to deliver business intelligence capabilities (including advanced analytics) and strategies for information delivery and data exploration to support business objectives and requirements. We are seeking someone with a passion for understanding and leveraging data, and with the attitude and behaviour to deliver on commitments and take ownership of data products when required.

Key Responsibilities
- Define and design the overall data architecture, strategy, and data capabilities roadmap, consistent with our technology direction
- Define and design the data platforms, tools, and governing processes
- Create, maintain, and communicate go-forward strategies for business intelligence capabilities and tools
- Be responsible and accountable for producing the data solution and data product architecture designs, ensuring they are submitted and progress via the prescribed governance process through to approval (ARB) in a timely manner, aligned with prescribed project timelines
- Define and review data solutions for reusability, scalability, synergy opportunities, and alignment with defined best practices and guidelines
- Create and evolve the data technology roadmap to align with continuously evolving business needs
- Help define and improve best practices, guidelines, and integration with other enterprise solutions
- Participate in planning, dependency identification, and management, as well as estimation with project managers
- Lead work-breakdown identification and workshops, utilising architecture designs as input
- Demonstrate a strong grasp of architecture techniques and the ability to work effectively with senior business stakeholders and initiative owners
- Act as a technology advisor on data to business leaders and strategic leaders on technology direction

Key Experience, Skills, and Qualifications

Domain Expertise
- 7+ years of professional experience in a data architecture or data engineering role, demonstrating a high degree of proficiency in designing and developing complex, high-quality data solutions according to our architecture governance policies and guidelines
- Strong experience in developing and maintaining data warehouses (e.g., Redshift, Teradata)
- Able to work independently and develop the solution architecture according to the business and compliance requirements
- Strong data warehouse development experience using different ETL tools (e.g., SAS DI, Glue, dbt)
- Experience with data streaming platforms (e.g., Kafka, Kinesis)
- Familiarity with different operational orchestration platforms (e.g., Airflow, LSF scheduler)
- Experience with data catalogue and data governance tools
- Understanding of CLDM, star schema, data mesh, and data product concepts
- Exposure to machine learning, reporting, data sharing, and data-intensive, application-oriented use cases
- Extensive experience consulting with business stakeholders and other user groups to deliver both strategic and tactical information management solutions
- Experience working within matrix structures, with a demonstrated ability to broker outcomes effectively and collaboratively with colleagues and peers
- Experience with different delivery methodologies (e.g., Waterfall, Agile)
- Telecommunications industry experience
- Bachelor's degree in computer science, computer programming, or a related field preferred

Individual Skills, Mindset & Behaviours
- Strong communication skills, with the ability to communicate complex technical concepts in a digestible way
- Ability to effortlessly switch gears from a summary view for leadership to hands-on discussion with practitioners
- Assertive, with the confidence to be the voice of authority on what is best for the team
- High-energy, passionate outlook on the role, with the ability to influence those around her/him
- Ability to build a sense of trust and rapport that creates a comfortable, respectful, and effective workplace

Posted 1 week ago

Apply

0.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries, and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.

Job Summary
Responsible for designing, building, and overseeing the deployment and operation of technology architecture, solutions, and software to capture, manage, store, and utilize structured and unstructured data from internal and external sources. Establishes and builds processes and structures based on business and technical requirements to channel data from multiple inputs, route it appropriately, and store it using any combination of distributed (cloud) structures, local databases, and other applicable storage forms as required. Develops technical tools and programming that leverage artificial intelligence, machine learning, and big-data techniques to cleanse, organize, and transform data and to maintain, defend, and update data structures and integrity on an automated basis. Creates and establishes design standards and assurance processes for software, systems, and applications development to ensure compatibility and operability of data connections, flows, and storage requirements. Reviews internal and external business and product requirements for data operations and activity and suggests changes and upgrades to systems and storage to accommodate ongoing needs. Works with data modelers/analysts to understand the business problems they are trying to solve, then creates or augments data assets to feed their analysis. Integrates knowledge of business and functional priorities. Acts as a key contributor in a complex and crucial environment. May lead teams or projects and share expertise.

Job Description
Position: Data Engineer 4
Experience: 8 to 11.5 years
Job Location: Chennai, Tamil Nadu

Requirements
- Databases: deep knowledge of relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra, Couchbase) (MUST)
- Big data technologies: experience with Spark, Kafka, and other big data ecosystem tools (NICE TO HAVE)
- Cloud platforms: experience with cloud services such as AWS, Azure, or Google Cloud Platform, with a particular focus on data engineering services (NICE TO HAVE)
- Version control: experience with version control systems like Git (MUST)
- CI/CD: knowledge of CI/CD pipelines for automating development and deployment processes (MUST)
- Proficiency in Elasticsearch and experience managing large-scale clusters (MUST)
- Hands-on experience with containerization technologies like Docker and Kubernetes (MUST: Docker)
- Strong programming skills in scripting languages such as Python, Bash, or similar (NICE TO HAVE)

Key Responsibilities
- Design, develop, and maintain scalable data pipelines and infrastructure
- Ensure compliance with security regulations and implement advanced security measures to protect company data
- Implement and manage CI/CD pipelines for data applications
- Work with containerization technologies (Docker, Kubernetes) to deploy and manage data services
- Optimize and manage Elasticsearch clusters for log ingestion, along with tools such as Logstash, Fluentd, and Promtail used to forward logs to an Elasticsearch instance or another log ingestion stack (Loki + Grafana); a minimal indexing sketch follows this listing
- Collaborate with other departments (e.g., Data Science, IT, DevOps) to integrate data solutions with existing business systems
- Optimize the performance of data pipelines and resolve data integrity and quality issues
- Document data processes and architectures to ensure transparency and facilitate maintenance
- Monitor industry trends and adopt best practices to continuously improve our data engineering solutions

Core Responsibilities
- Develops data structures and pipelines aligned to established standards and guidelines to organize, collect, standardize, and transform data that helps generate insights and address reporting needs
- Focuses on ensuring data quality during ingest and processing as well as the final load to the target tables
- Creates standard ingestion frameworks for structured and unstructured data, as well as checking and reporting on the quality of the data being processed
- Creates standard methods for end users and downstream applications to consume data, including but not limited to database views, extracts, and application programming interfaces
- Develops and maintains information systems (e.g., data warehouses, data lakes), including data access application programming interfaces
- Participates in the implementation of solutions via data architecture, data engineering, or data manipulation on both on-prem platforms like Kubernetes and Teradata and cloud platforms like Databricks
- Determines the appropriate storage platform across different on-prem (MinIO and Teradata) and cloud (AWS S3, Redshift) options, depending on the privacy, access, and sensitivity requirements
- Understands the data lineage from source to the final semantic layer, along with the transformation rules applied, to enable faster troubleshooting and impact analysis during changes
- Collaborates with technology and platform management partners to optimize data sourcing and processing rules, ensuring appropriate data quality as well as process optimization
- Creates and establishes design standards and assurance processes for software, systems, and applications development to ensure compatibility and operability of data connections, flows, and storage requirements
- Reviews internal and external business and product requirements for data operations and activity, and suggests changes and upgrades to systems and storage to accommodate ongoing needs
- Develops strategies for data acquisition, archive recovery, and database implementation
- Manages data migrations/conversions and troubleshoots data processing issues
- Understands data sensitivity and customer data privacy rules and regulations, and applies them consistently in all Information Lifecycle Management activities
- Identifies and reacts to system notifications and logs to ensure quality standards for databases and applications
- Solves abstract problems beyond a single development language or situation by reusing data files and flags already set
- Solves critical issues and shares knowledge, such as trends, aggregates, and quantity/volume, regarding specific data sources
- Exercises consistent independent judgment and discretion in matters of significance
- Maintains regular, consistent, and punctual attendance; must be able to work nights and weekends and variable schedule(s) as necessary
- Other duties and responsibilities as assigned

Employees at all levels are expected to:
- Understand our Operating Principles; make them the guidelines for how you do your job
- Own the customer experience: think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services
- Know your stuff: be enthusiastic learners, users, and advocates of our game-changing technology, products, and services, especially our digital tools and experiences
- Win as a team: make big things happen by working together and being open to new ideas
- Be an active part of the Net Promoter System, a way of working that brings more employee and customer feedback into the company, by joining huddles, making call-backs, and helping us elevate opportunities to do better for our customers
- Drive results and growth
- Respect and promote inclusion and diversity
- Do what's right for each other, our customers, investors, and our communities

Disclaimer: This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities, and qualifications.

Comcast is proud to be an equal opportunity workplace. We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.

Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance, and always-on tools that are personalized to meet the needs of your reality, to help support you physically, financially, and emotionally through the big milestones and in your everyday life. Please visit the compensation and benefits summary on our careers site for more details.

Education: Bachelor's degree. While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.

Relevant Work Experience: 7-10 years
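As a small illustration of the Elasticsearch log-ingestion work named above, here is a sketch using the official Python client's 8.x-style API to index one structured log event and query recent errors. The endpoint, index name, and document fields are hypothetical; in production, shippers such as Logstash, Fluentd, or Promtail would do the forwarding instead of application code.

```python
# Minimal Elasticsearch sketch (8.x-style Python client): index a log event,
# then search for recent errors. Endpoint, index, and fields are hypothetical.
from datetime import datetime, timezone

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")   # hypothetical cluster endpoint

# Index one structured log document (the index is created on first write).
es.index(
    index="app-logs",
    document={
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "level": "ERROR",
        "service": "payments",
        "message": "upstream timeout",
    },
)
es.indices.refresh(index="app-logs")          # make the document searchable now

# Query: ERROR-level events from the last 15 minutes.
resp = es.search(
    index="app-logs",
    query={
        "bool": {
            "filter": [
                {"term": {"level.keyword": "ERROR"}},      # exact, unanalyzed match
                {"range": {"@timestamp": {"gte": "now-15m"}}},
            ]
        }
    },
)
for hit in resp["hits"]["hits"]:
    print(hit["_source"]["service"], hit["_source"]["message"])
```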

Posted 1 week ago

Apply

15.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Job Description: About us* At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us! Global Business Services Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services. Process Overview* GF (Global Finance) Global Financial Control India (GFCI) is part of the CFO Global Delivery strategy to provide offshore delivery to Line of Business (LOBs) and Enterprise Finance functions. The capabilities hosted include General Accounting & Reconciliations, Legal Entity Controllership, Corporate Sustainability Controllership, Corporate Controllership, Management Reporting & Analysis, Finance Systems Support, Operational Risk and Controls, Regulatory Reporting and Strategic initiatives. The Financed Emissions Accounting & Reporting team, a part of the Global Financial Control-Corporate Sustainability Controller organization within the CFO Group, plays a critical role in supporting the calculation of asset level balance sheet Financed Emissions, which are integral to the Bank ’s goal of achieving Net-zero greenhouse gas emissions by 2050. Job Description* The role is responsible for building data sourcing process, data research and analytics using available tools, support model input data monitoring and develop necessary data or reporting frameworks to support our approaches to net zero progress alignment, target setting, client engagement and reputational risk review, empowering banking teams to assist clients on net zero financing strategies and specific commercial opportunities. The role will support and partner with business stakeholders in the Enterprise Climate Program Office, Technology, Climate and Credit Risk, the Global Environment Group, Lines of Business, Legal Entity Controllers and Model Risk Management. Additionally, the role will support data governance, lineage, controls by building, improving and executing data processes. 
Candidate must be able to communicate across technology partners, the climate office and the business lines to execute on viable analytical solutions, with a focus on end-user experience and usability. Candidate must be strong in identifying and explaining data quality issues to help achieve successful and validated data for model execution. This individual should feel at ease creating complex SQL queries, extracting large, raw datasets from various sources, merging and transforming raw data into usable data and analytic structures, and benchmarking results against known baselines. They must feel comfortable automating repeatable processes, generating data insights that are easy for end users to interpret, conducting quantitative analysis, and effectively communicating and disseminating findings and data points to stakeholders. They should also understand greenhouse gas accounting frameworks and financed emissions calculations as applied to different sectors and asset classes. The candidate will have experience representing ERA with critical Climate stakeholders across the firm, and should demonstrate capacity for strategic leadership, exercising significant independent judgment and discretion and working towards strategic goals with limited oversight. Responsibilities* Net zero transition planning and execution: Partners with GEG, Program Office and Lines of Business in developing and executing the enterprise-wide net zero transition plan and operational roadmap, with a focus on analysis and reporting capabilities, data procurement, and liaising with consultants, external data providers, Climate Risk and Technology functions. Data development & Operations: Research data requirements, produce executive-level and detailed data summaries, validate the accuracy, completeness, reasonableness and timeliness of datasets, and develop desktop procedures for BAU operations. Perform data reviews and test technology implementations for financed emissions deliverables. Execute BAU processes such as new data cycle creation, data controls and data quality processes. Produce data summary materials and walk through them with the leadership team. Data Analytics & Strategy: Analyze the data and explain how granular data movements across history affect new results. Identify trends of data improvement and areas needing improvement. Develop automated data analysis results and answer common questions to justify changes in the data. Support ad hoc analytics of bank-wide and client net zero commitment implementation, with an initial focus on automation of financed emissions analysis, reporting against PCAF standards and net zero transition preparedness analytics and engagement to enhance strategy for meeting emissions goals for target sectors.
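The financed emissions work described above generally follows the PCAF convention: attribute each counterparty's emissions to the bank in proportion to the financing it provides (for listed companies, outstanding amount divided by enterprise value including cash). As a hedged illustration only (the column names and figures below are invented, not the Bank's schema), a minimal Python sketch:

```python
# Minimal PCAF-style financed emissions sketch for listed corporates.
# All names and numbers are hypothetical placeholders.
import pandas as pd

positions = pd.DataFrame({
    "counterparty": ["A Corp", "B Corp"],
    "outstanding_amount": [50_000_000, 20_000_000],  # bank's exposure
    "evic": [1_000_000_000, 400_000_000],            # enterprise value incl. cash
    "emissions_tco2e": [120_000, 35_000],            # counterparty scope 1+2
})

# Attribution factor: the share of the company's value the bank finances.
positions["attribution"] = positions["outstanding_amount"] / positions["evic"]
positions["financed_emissions_tco2e"] = (
    positions["attribution"] * positions["emissions_tco2e"]
)

print(positions[["counterparty", "financed_emissions_tco2e"]])
# A Corp: 0.05 * 120,000 = 6,000 tCO2e; B Corp: 0.05 * 35,000 = 1,750 tCO2e
```

Summing the last column across the portfolio, by sector and asset class, gives the aggregate figures that PCAF-aligned reporting asks for.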
Requirements* Education* Bachelor’s degree in data management or analytics, engineering, sustainability, finance or other related field OR master’s degree in data science, earth/climate sciences, engineering, sustainability, natural resource management, environmental economics, finance or other related field Certifications If Any NA Experience Range* Minimum 15+ years in Climate, Financed Emissions, finance, financial reporting Three (3) or more years of experience in statistical and/or data management and analytics and visualization (intersection with financial services strongly preferred) Foundational skills* Deep expertise in SQL, Excel, Python, automation & optimization, and project management Knowledge of data architecture concepts, data models and ETL processes Deep understanding of how data processes work and ability to solve dynamically evolving and complex data challenges as part of day-to-day activities Experience extracting and combining data from multiple sources and aggregating data to support model development Experience in multiple database environments such as Oracle, Hadoop, and Teradata Strong technical and visualization skills, with the ability to understand business goals and needs, and a commitment to delivering recommendations that will guide strategic decisions Knowledge of Alteryx, Tableau and R (knowledge of NLP, data scraping and generative AI welcome) Strong leadership skills and proven ability in motivating employees and promoting teamwork. Excellent interpersonal, management, and teamwork skills. High level of independent decision-making ability. Highly motivated self-starter with excellent time management skills and the ability to effectively manage multiple priorities and timelines. Demonstrated ability to motivate others in a high-stress environment to achieve goals. Ability to effectively communicate and resolve conflicts through both oral and written communication with both internal and external clients. Ability to adapt to a dynamic and evolving work environment. Well-developed analytical and problem-solving skills. Experience and knowledge of the principles and practices of management and employee development. Ability to think critically to solve problems with rational solutions. Ability to react and make decisions quickly under pressure with good judgment. Strong documentation & presentation skills to explain the data analysis in a visual and procedural way suited to the audience. Ability to quickly identify risks and determine reasonable solutions. Desired Skills Advanced knowledge of Finance Advanced knowledge of Climate Risk Work Timings* Window 12:30 PM to 9:30 PM (9-hour shift, may require stretch during peak period) Job Location* Mumbai

Posted 1 week ago

Apply

2.0 - 5.0 years

2 - 7 Lacs

Bengaluru

Work from Office

Naukri logo

What you will do: Plan and release developments in different environments Carry out prototypes, unit tests and performance tests and make sure that they respect the technical specs Take charge of production monitoring and the related incidents Maintain scripts (Shell and Perl) Prepare technical documentation Automate and industrialize: developments, release of technical and functional components in different environments, infrastructure monitoring Take charge of technical analysis and propose technical solutions Follow the quality standards imposed by the project Use standard or innovative functionalities of the product in collaboration with the business lines as well as with the IT teams Use MicroStrategy on Teradata to ensure optimum performance Use MicroStrategy with Big Data technologies Offer MicroStrategy expertise throughout developments Review code and ensure that MicroStrategy is used correctly Analyze and correct incidents in production Profile Required: Minimum 2 years' experience Knowledge of a MicroStrategy BI environment (development) Knowledge of the Linux working environment Knowledge of Teradata/PostgreSQL databases Experience working in an Agile environment Team spirit, integrity and autonomy
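Where the duties mention using MicroStrategy on Teradata to ensure optimum performance, a routine habit is to inspect the optimizer's plan for a report's SQL before wiring it into a dataset. A minimal sketch using the teradatasql Python driver; the host, credentials and table name are placeholders, not anything from this posting:

```python
# Hedged sketch: check the Teradata plan of a candidate report query.
import teradatasql  # official Teradata SQL driver for Python

with teradatasql.connect(host="td-host", user="dbc", password="***") as con:
    with con.cursor() as cur:
        # EXPLAIN reveals product joins, full-table scans and missing
        # statistics before the query is embedded in a MicroStrategy report.
        cur.execute(
            "EXPLAIN SELECT region, SUM(sales) FROM fact_sales GROUP BY region"
        )
        for (step,) in cur.fetchall():  # each row is one line of plan text
            print(step)
```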

Posted 1 week ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Mumbai

Work from Office

Naukri logo

Job Description: Business Title Lead Technical Architect Years of Experience > 7 Years Must have skills 1. Database (SQL Server / Snowflake / Teradata / Redshift / Vertica / Oracle / BigQuery / Azure DW etc.) 2. ETL tool (Talend, Informatica, IICS (Informatica Cloud)) 3. Experience in cloud computing (one or more of AWS, Azure, GCP) 4. Python, UNIX shell scripting, project & resource management 5. SVN, JIRA, automation workflow (Apache Airflow, Tidal, Tivoli or similar) Good to have skills 1. PySpark, BigQuery, familiarity with NoSQL such as MongoDB etc. 2. Client-facing skills Job Description The Technical Lead / Technical Consultant is a core role and focal point of the project team, responsible for the whole technical solution and managing the day-to-day delivery. The role will focus on the technical solution architecture, detailed technical design, coaching of the development/implementation team and governance of the technical delivery. Technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best-practice advice. Interactions with internal stakeholders and clients to explain technology solutions, and a clear understanding of the client's business requirements through which to guide optimal design to meet their needs. Key responsibilities Ability to design simple to medium data solutions for clients using cloud architecture on AWS/GCP Strong understanding of DW, data mart, data modelling, data structures, databases, and data ingestion and transformation Working knowledge of ETL as well as database skills Working knowledge of data modelling, data structures, databases, and ETL processes Strong understanding of relational and non-relational databases and when to use them Leadership and communication skills to collaborate with local leadership as well as our global teams Translating technical requirements into ETL/SQL application code Document project architecture, explain detailed design to the team and create low-level to high-level design Create technical documents for ETL and SQL developments using Visio, PowerPoint and other MS Office packages Will need to engage with Project Managers, Business Analysts and Application DBAs to implement ETL solutions Perform mid- to complex-level tasks independently Support Clients, Data Scientists and Analytical Consultants working on marketing solutions Work with cross-functional internal teams and external clients Strong project management and organization skills; ability to lead 1-2 projects with a team size of 2-3 members Code management, including code review and deployment Work closely with the QA/Testing team to help identify/implement defect reduction initiatives Work closely with the Architecture team to make sure architecture standards and principles are followed during development Perform proofs of concept on new platforms and validate proposed solutions Work with the team to establish and reinforce disciplined software development processes, standards, and error recovery procedures Must understand software development methodologies including waterfall and agile Distribute and manage SQL development work across the team Education Qualification 1. Bachelor's or Master's Degree in Computer Science Shift timing: GMT (UK Shift) - 2 PM to 11 PM Location: Mumbai Brand: Merkle Time Type: Full time Contract Type: Permanent
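The automation-workflow skills listed above (Apache Airflow, Tidal, Tivoli or similar) usually come down to orchestrating extract, transform and load steps with explicit dependencies. A minimal Airflow sketch; the DAG id, task names and empty task bodies are hypothetical placeholders:

```python
# Hedged sketch of an extract -> transform -> load DAG.
# Assumes Airflow 2.4+ (which accepts the `schedule` argument).
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():    # e.g. pull raw files from object storage
    ...

def transform():  # e.g. apply SQL/PySpark transformations
    ...

def load():       # e.g. publish curated tables to the warehouse
    ...

with DAG(dag_id="client_etl", start_date=datetime(2025, 1, 1),
         schedule="@daily", catchup=False) as dag:
    extract_t = PythonOperator(task_id="extract", python_callable=extract)
    transform_t = PythonOperator(task_id="transform", python_callable=transform)
    load_t = PythonOperator(task_id="load", python_callable=load)
    extract_t >> transform_t >> load_t  # run order and dependencies
```

The same shape carries over to Tidal or Tivoli; only the scheduler's dependency syntax changes.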

Posted 1 week ago

Apply

9.0 - 14.0 years

20 - 25 Lacs

Mumbai

Work from Office

Naukri logo

Job Description: Business Title Lead Technical Architect Years of Experience > 7 Years Must have skills 1. Database (SQL Server / Snowflake / Teradata / Redshift / Vertica / Oracle / BigQuery / Azure DW etc.) 2. ETL tool (Talend, Informatica, IICS (Informatica Cloud)) 3. Experience in cloud computing (one or more of AWS, Azure, GCP) 4. Python, UNIX shell scripting, project & resource management 5. SVN, JIRA, automation workflow (Apache Airflow, Tidal, Tivoli or similar) Good to have skills 1. PySpark, BigQuery, familiarity with NoSQL such as MongoDB etc. 2. Client-facing skills Job Description The Technical Lead / Technical Consultant is a core role and focal point of the project team, responsible for the whole technical solution and managing the day-to-day delivery. The role will focus on the technical solution architecture, detailed technical design, coaching of the development/implementation team and governance of the technical delivery. Technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best-practice advice. Interactions with internal stakeholders and clients to explain technology solutions, and a clear understanding of the client's business requirements through which to guide optimal design to meet their needs. Key responsibilities Ability to design simple to medium data solutions for clients using cloud architecture on AWS/GCP Strong understanding of DW, data mart, data modelling, data structures, databases, and data ingestion and transformation Working knowledge of ETL as well as database skills Working knowledge of data modelling, data structures, databases, and ETL processes Strong understanding of relational and non-relational databases and when to use them Leadership and communication skills to collaborate with local leadership as well as our global teams Translating technical requirements into ETL/SQL application code Document project architecture, explain detailed design to the team and create low-level to high-level design Create technical documents for ETL and SQL developments using Visio, PowerPoint and other MS Office packages Will need to engage with Project Managers, Business Analysts and Application DBAs to implement ETL solutions Perform mid- to complex-level tasks independently Support Clients, Data Scientists and Analytical Consultants working on marketing solutions Work with cross-functional internal teams and external clients Strong project management and organization skills; ability to lead 1-2 projects with a team size of 2-3 members Code management, including code review and deployment Work closely with the QA/Testing team to help identify/implement defect reduction initiatives Work closely with the Architecture team to make sure architecture standards and principles are followed during development Perform proofs of concept on new platforms and validate proposed solutions Work with the team to establish and reinforce disciplined software development processes, standards, and error recovery procedures Must understand software development methodologies including waterfall and agile Distribute and manage SQL development work across the team Education Qualification 1. Bachelor's or Master's Degree in Computer Science Shift timing: GMT (UK Shift) - 2 PM to 11 PM Location: Mumbai Brand: Merkle Time Type: Full time Contract Type: Permanent

Posted 1 week ago

Apply

5.0 - 9.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Naukri logo

Responsibilities Design, develop, and maintain scalable data pipelines and ETL processes Optimize data flow and collection for cross-functional teams Build infrastructure required for optimal extraction, transformation, and loading of data Ensure data quality, reliability, and integrity across all data systems Collaborate with data scientists and analysts to help implement models and algorithms Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, etc. Create and maintain comprehensive technical documentation Evaluate and integrate new data management technologies and tools Requirements 3-5 years of professional experience in data engineering roles Bachelor's degree in Computer Science, Engineering, or related field; Master's degree preferred Expert knowledge of SQL and experience with relational databases (e.g., PostgreSQL, Redshift, TiDB, MySQL, Oracle, Teradata) Extensive experience with big data technologies (e.g., Hadoop, Spark, Hive, Flink) Proficiency in at least one programming language such as Python, Java, or Scala Experience with data modeling, data warehousing, and building ETL pipelines Strong knowledge of data pipeline and workflow management tools (e.g., Airflow, Luigi, NiFi) Experience with cloud platforms (AWS, Azure, or GCP) and their data services; AWS preferred Hands-on experience building streaming pipelines with Flink, Kafka, or Kinesis; Flink preferred Understanding of data governance and data security principles Experience with version control systems (e.g., Git) and CI/CD practices Preferred Skills Experience with containerization and orchestration tools (Docker, Kubernetes) Basic knowledge of machine learning workflows and MLOps Experience with NoSQL databases (MongoDB, Cassandra, etc.) Familiarity with data visualization tools (Tableau, Power BI, etc.) Experience with real-time data processing Knowledge of data governance frameworks and compliance requirements (GDPR, CCPA, etc.) Experience with infrastructure-as-code tools (Terraform, CloudFormation) Personal Qualities Strong problem-solving skills and attention to detail Excellent communication skills, both written and verbal Ability to work independently and as part of a team Proactive approach to identifying and solving problems
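For the streaming-pipeline requirement (Flink, Kafka, Kinesis), an equivalent pattern in PySpark Structured Streaming reads a Kafka topic and lands micro-batches in a lake path with checkpointing. The broker, topic and paths below are invented for the sketch, and the spark-sql-kafka connector package is assumed to be on the classpath:

```python
# Hedged sketch: Kafka -> object storage via Spark Structured Streaming.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events-stream").getOrCreate()

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
          .option("subscribe", "events")                     # placeholder topic
          .load()
          .selectExpr("CAST(value AS STRING) AS payload"))

# Checkpointing lets the sink recover cleanly across restarts.
query = (events.writeStream
         .format("parquet")
         .option("path", "s3://lake/events/")                     # placeholder
         .option("checkpointLocation", "s3://lake/_chk/events/")  # placeholder
         .start())
query.awaitTermination()
```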

Posted 1 week ago

Apply

5.0 - 8.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Naukri logo

Software Engineering Lead Analyst - HIH - Evernorth About Evernorth: Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people. Job Description & Responsibilities: Work with business and technical leadership to understand requirements Design to the requirements and document the designs Write product-grade, performant code for data extraction, transformation and loading using Spark and PySpark Do data modeling as needed for the requirements Write performant queries against Teradata using Teradata SQL and Spark SQL Implement DevOps pipelines to deploy code artifacts onto the designated platforms/servers such as AWS (preferred) Implement Hadoop job orchestration using shell scripting, CA7 Enterprise Scheduler and Airflow Troubleshoot issues, provide effective solutions and monitor jobs in the production environment Participate in sprint planning sessions, refinement/story-grooming sessions, daily scrums, demos and retrospectives Experience Required: Overall 5 - 8 years Experience Desired: Experience in Jira and Confluence Health care domain knowledge is a plus Excellent work experience on Databricks or Teradata as a data warehouse Experience in Agile and working knowledge of DevOps tools Education and Training Required: Primary Skills: Spark, PySpark, shell scripting, Teradata SQL (Teradata SQL and Spark SQL) and stored procedures Git, Jenkins, Artifactory CA7 Enterprise Scheduler, Airflow AWS data services (S3, EC2, SQS) AWS services - SNS, Lambda, ECS, Glue, IAM CloudWatch monitoring tool Databricks (Delta Lake, Notebooks, Pipelines, cluster management, Azure/AWS integration) Good to have: Unix/Linux shell scripting (KSH) and basic administration of Unix servers Additional Skills: Exercises considerable creativity, foresight, and judgment in conceiving, planning, and delivering initiatives.
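As a hedged illustration of the extraction, transformation and loading duties above (table names, paths and columns are hypothetical, not the employer's schema), a PySpark step that expresses the transformation in Spark SQL:

```python
# Hedged ETL sketch: read raw data, transform via Spark SQL, write curated.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("claims-etl").getOrCreate()

claims = spark.read.parquet("s3://raw/claims/")  # extraction (placeholder path)
claims.createOrReplaceTempView("claims")

# Transformation written in Spark SQL, one of the dialects the role calls for.
monthly = spark.sql("""
    SELECT member_id,
           date_trunc('month', service_date) AS svc_month,
           SUM(paid_amount)                  AS paid
    FROM claims
    WHERE claim_status = 'PAID'
    GROUP BY member_id, date_trunc('month', service_date)
""")

monthly.write.mode("overwrite").parquet("s3://curated/monthly_claims/")  # load
```

A near-identical query in Teradata SQL would typically be scheduled through CA7 or Airflow rather than run interactively.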

Posted 1 week ago

Apply

10.0 - 13.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Naukri logo

Position Summary: Full Stack Engineer: Data Engineering - Job Description Cigna, a leading Health Services company, is looking for an exceptional engineer in our Data & Analytics Engineering organization. The Full Stack Engineer is responsible for the delivery of a business need, starting from understanding the requirements to deploying the software into production. This role requires you to be fluent in some of the critical technologies, be proficient in others, and have a hunger to learn on the job and add value to the business. Critical attributes of being a Full Stack Engineer, among others, are ownership, eagerness to learn and an open mindset. In addition to delivery, the Full Stack Engineer should have an automation-first and continuous-improvement mindset, drive the adoption of CI/CD tools and support the improvement of the related tool sets and processes. Job Description & Responsibilities: Behaviors of a Full Stack Engineer: Full Stack Engineers are able to articulate clear business objectives aligned to technical specifications and work in an iterative, agile pattern daily. They have ownership over their work tasks, embrace interacting with all levels of the team and raise challenges when necessary. We aim to be cutting-edge engineers - not institutionalized developers. Experience Required: 11 - 13 years of experience in Python. 11 - 13 years of experience in data management & SQL expertise. 5+ years in Spark and AWS. 5+ years in Databricks. Experience with working in agile CI/CD environments. Experience Desired: Git, Teradata & Snowflake experience. Experience working on analytical models and their deployment/production enablement via data & analytical pipelines. Expertise with big data technologies - Hadoop, HiveQL (Scala/Python). Expertise in cloud technologies - (S3, Glue, Terraform, Lambda, Aurora, Redshift, EMR). Experience with BDD and TDD development methodologies. Health care information domains preferred. Education and Training Required: Bachelor's degree (or equivalent) required. Primary Skills: Python, AWS and Spark. CI/CD, Databricks. Data management and SQL. Additional Skills: Strong communication skills. Take ownership and accountability. Write referenceable & modular code. Be fluent in some areas and proficient in many. Have a passion to learn. Have a quality mindset: not just code quality, but also ensuring ongoing data quality by monitoring data to identify problems before they have business impact. Take risks and champion new ideas. About Evernorth Health Services
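On that quality-mindset point, monitoring data to catch problems before they have business impact, one lightweight pattern is a quality gate that runs inside the pipeline before data is published. A sketch with illustrative thresholds and made-up column names:

```python
# Hedged sketch of a pre-publication data quality gate.
import pandas as pd

def quality_gate(df: pd.DataFrame) -> list:
    """Return human-readable failures; an empty list means the data passes."""
    failures = []
    if df.empty:
        failures.append("dataset is empty")
    if df["member_id"].isna().mean() > 0.01:   # more than 1% null keys
        failures.append("member_id null rate exceeds 1%")
    if (df["paid_amount"] < 0).any():          # impossible values
        failures.append("negative paid_amount found")
    return failures

df = pd.read_parquet("curated/monthly_claims.parquet")  # placeholder path
issues = quality_gate(df)
if issues:
    raise ValueError("quality gate failed: " + "; ".join(issues))
```

Failing loudly here stops a bad load from propagating into downstream reports.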

Posted 1 week ago

Apply

5.0 - 8.0 years

20 - 25 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Naukri logo

Develop solutions to complex problems that require the regular use of ingenuity and innovation. Design, build and deploy ETL using BODS. Performance optimization and understanding of CDS views as sources for data extraction. Extract data from legacy systems into SAP or databases such as Oracle, SQL Server, MySQL, Sybase, Teradata, and others. Lead the team in data migration and conversion activities. Identify SAP data migration as-is processes and to-be processes. Write technical specification/configuration/testing documents. Coordinate with business or functional teams on ERP business processes and the data used for loading. EXPERIENCE: Minimum of 5 years of experience in SAP Data Conversion and Migration projects. Experience working in ETL/Reporting/Business Intelligence applications. SAP BODS skill is a must-have. Experience in data profiling, data cleansing and data remediation. Functional knowledge in any of the FICO, SD, MM, PP, PM, QM, HCM, VC, EWM modules. Strong experience working with SQL query tools; understanding of entity-relationship models, the SAP data model, database relationships, and primary/foreign key relationships. Worked on a minimum of 3 end-to-end SAP data migration projects. Experience in data validation, verification, and cleansing using any tools. Experience with SQL performance tuning and monitoring. Ability to read and write complex SQL statements, including multi-table joins, nested queries, and correlated subqueries. Knowledge of any hyperscaler - Azure, AWS, Google, Alibaba, SAP. Knowledge of cloud-based products - SAP, Salesforce, Ariba, and others. Knowledge of data science, Python, and AI/ML will be an added advantage. Ability to collaborate effectively with project teams. Should have good communication skills and the ability to independently interact with client teams. Strong interpersonal, team-building, organizational, and motivational skills. Certification in any of the Data Migration/ETL tools is preferred. EDUCATION: Bachelor's degree required.
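For the complex-SQL skills called out above (multi-table joins, nested queries, correlated subqueries), a typical migration-validation query finds legacy records that have not yet landed in the SAP target. A hedged sketch: the legacy schema and mapping column are invented, while KNA1 is the standard SAP customer master table:

```python
# Hedged sketch of a post-load reconciliation check; run with any DB-API
# driver (cursor.execute(validation_sql)). Schema names are placeholders.
validation_sql = """
SELECT c.customer_id, c.name
FROM   legacy.customers AS c
JOIN   legacy.orders    AS o ON o.customer_id = c.customer_id
WHERE  NOT EXISTS (              -- correlated subquery: migrated yet?
        SELECT 1
        FROM   sap_target.kna1 AS k
        WHERE  k.legacy_id = c.customer_id)
GROUP  BY c.customer_id, c.name
"""
```

Rows returned are active legacy customers (they have orders) missing from the target: exactly the gap a conversion cycle has to close.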

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Date: Jun 7, 2025 Location: Pune, MH, IN Company: HMH HMH is a learning technology company committed to delivering connected solutions that engage learners, empower educators and improve student outcomes. As a leading provider of K–12 core curriculum, supplemental and intervention solutions, and professional learning services, HMH partners with educators and school districts to uncover solutions that unlock students’ potential and extend teachers’ capabilities. HMH serves more than 50 million students and 4 million educators in 150 countries. HMH Technology India Pvt. Ltd. is our technology and innovation arm in India focused on developing novel products and solutions using cutting-edge technology to better serve our clients globally. HMH aims to help employees grow as people, and not just as professionals. For more information, visit www.hmhco.com The data architect is responsible for designing, creating, and managing an organization’s data architecture. This role is critical in establishing a solid foundation for data management within an organization, ensuring that data is organized, accessible, secure, and aligned with business objectives. The data architect designs data models, warehouses, file systems and databases, and defines how data will be collected and organized. Responsibilities Interprets and delivers impactful strategic plans improving data integration, data quality, and data delivery in support of business initiatives and roadmaps Designs the structure and layout of data systems, including databases, warehouses, and lakes Selects and designs database management systems that meet the organization’s needs by defining data schemas, optimizing data storage, and establishing data access controls and security measures Defines and implements the long-term technology strategy and innovations roadmaps across analytics, data engineering, and data platforms Designs the ETL processes that move data from various sources into the organization’s data systems Translates high-level business requirements into data models and appropriate metadata, test data, and data quality standards Manages senior business stakeholders to secure strong engagement and ensures that the delivery of the project aligns with longer-term strategic roadmaps Simplifies the existing data architecture, delivering reusable services and cost-saving opportunities in line with the policies and standards of the company Leads and participates in the peer review and quality assurance of project architectural artifacts across the EA group through governance forums Defines and manages standards, guidelines, and processes to ensure data quality Works with IT teams, business analysts, and data analytics teams to understand data consumers’ needs and develop solutions Evaluates and recommends emerging technologies for data management, storage, and analytics Design, create, and implement logical and physical data models for both IT and business solutions to capture the structure, relationships, and constraints of relevant datasets Build and operationalize complex data solutions, correct problems, apply transformations, and recommend data cleansing/quality solutions Effectively collaborate and communicate with various stakeholders to understand data and business requirements and translate them into data models Create entity-relationship diagrams (ERDs), data flow diagrams, and other visualization tools to represent data models Collaborate with database administrators and
software engineers to implement and maintain data models in databases, data warehouses, and data lakes Develop data modeling best practices, and use these standards to identify and resolve data modeling issues and conflicts Conduct performance tuning and optimization of data models for efficient data access and retrieval Incorporate core data management competencies, including data governance, data security and data quality Education Job Requirements A bachelor’s degree in computer science, data science, engineering, or related field Experience At least five years of relevant experience in design and implementation of data models for enterprise data warehouse initiatives Experience leading projects involving data warehousing, data modeling, and data analysis Design experience in Azure Databricks, PySpark, and Power BI/Tableau Skills Ability in programming languages such as Java, Python, and C/C++ Ability in data science languages/tools such as SQL, R, SAS, or Excel Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS, Azure, GCP), real-time data distribution (Kafka, Dataflow), and modern data warehouse tools (Snowflake, Databricks) Experience with database technologies such as SQL, NoSQL, Oracle, Hadoop, or Teradata Understanding of entity-relationship modeling, metadata systems, and data quality tools and techniques Ability to think strategically and relate architectural decisions and recommendations to business needs and client culture Ability to assess traditional and modern data architecture components based on business needs Experience with business intelligence tools and technologies such as ETL, Power BI, and Tableau Ability to regularly learn and adopt new technology, especially in the ML/AI realm Strong analytical and problem-solving skills Ability to synthesize and clearly communicate large volumes of complex information to senior management of various technical understandings Ability to collaborate and excel in complex, cross-functional teams involving data scientists, business analysts, and stakeholders Ability to guide solution design and architecture to meet business needs Expert knowledge of data modeling concepts, methodologies, and best practices Proficiency in data modeling tools such as Erwin or ER/Studio Knowledge of relational databases and database design principles Familiarity with dimensional modeling and data warehousing concepts Strong SQL skills for data querying, manipulation, and optimization, and knowledge of other data science languages, including JavaScript, Python, and R Ability to collaborate with cross-functional teams and stakeholders to gather requirements and align on data models Excellent analytical and problem-solving skills to identify and resolve data modeling issues Strong communication and documentation skills to effectively convey complex data modeling concepts to technical and business stakeholders HMH Technology Private Limited is an Equal Opportunity Employer and considers applicants for all positions without regard to race, colour, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. We are committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. For more information, visit https://careers.hmhco.com/. 
Follow us on Twitter, Facebook, LinkedIn, and YouTube. Job Segment: Curriculum, Social Media, Education, Marketing
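Dimensional modeling, one of the skills the role lists, typically lands as a star schema: a central fact table keyed to conformed dimension tables. A minimal sketch with invented, education-flavored table and column names (not HMH's actual model):

```python
# Hedged star-schema sketch; the DDL is generic ANSI SQL kept in a Python
# string so it can be executed through any DB-API connection.
star_schema_ddl = """
CREATE TABLE dim_student (
    student_key  INT PRIMARY KEY,   -- surrogate key
    student_id   VARCHAR(20),       -- natural/business key
    grade_level  VARCHAR(5)
);

CREATE TABLE fact_assessment (
    student_key  INT REFERENCES dim_student (student_key),
    date_key     INT,               -- would reference a dim_date table
    score        DECIMAL(5, 2),
    max_score    DECIMAL(5, 2)
);
"""
# Execute each statement with cursor.execute(...) using your driver of choice.
```

Facts stay narrow and additive; descriptive attributes live on the dimensions, which keeps ad-hoc BI queries simple and fast.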

Posted 1 week ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Location: Bengaluru Shift Timings: 01:30 PM - 10:30 PM Job Description Responsibilities: • Managing and supporting multiple SAS platforms like standalone, Grid, VA, Viya hosted on UNIX/LINUX servers. • Providing SAS platform security management, SAS application and underlying infrastructure support (OS, storage, SAS 9.x EBI applications, web and databases) and ensuring processes abide by organization policies. • Monitor the overall availability and performance of the current SAS server environments and take corrective actions as necessary to ensure peak operational performance. • Good knowledge of SAS ACLs and UNIX/LINUX security. • Monitor usage logs to assist in performance tuning of systems and servers. • Knowledge of applying hotfixes and renewing SAS licenses across the platforms. • Interact directly with various IT teams as it pertains to SAS administration. • Schedule and coordinate all planned outages on the SAS server and be the point of contact for unplanned outages or other urgent issues related to the SAS environment. • Should perform installation, upgrade and configuration of SAS technology products on all supported server environments when required. Required Qualifications: • 4+ years of hands-on experience as a SAS Administrator of the 9.4 suite in Linux environments, performing system monitoring, log analysis, performance analysis, performance tuning and capacity planning (version SAS 9.4) • 3+ years of hands-on experience as a SAS Grid Administrator of the 9.4 suite in a Linux environment (version SAS 9.4 and LSF 9x and 10x), good knowledge of LSF and Grid Manager. • 2+ years of hands-on experience administering a SAS Visual Analytics environment (version SAS 9.4) • Understand Linux commands that enable monitoring of SAS jobs, ability to move around in Linux environments as required for basic functioning of SAS, and a good understanding of how Linux and SAS are integrated. • Knowledgeable on schedulers like Cron, Autosys & SAS Schedule Manager. • Thorough understanding of SAS products and how they work together in an environment. • SAS data source connectivity to Hadoop, Teradata, Oracle & SQL Server. • Experience in troubleshooting SAS platform and client issues to root cause. • Working knowledge of the SAS Management Console (SMC), including creating users, groups and roles, creating and securing folders, monitoring logs from within SMC, defining servers and changing server definitions, and creating SAS and RDBMS libraries. • Working knowledge of SAS configuration files such as autoexec, configuration and environment files. • Strong analytical, communication, teamwork, problem-solving and interpersonal skills. Required Skills: • Minimum 7+ years of hands-on experience with GCP services including GKE, Filestore, IAM, and networking, or other similar cloud technologies. • At least 5+ years of experience building Kubernetes clusters and container orchestration, with deep expertise in GKE. • At least 5+ years of experience using Terraform for infrastructure as code, including complex cluster provisioning in production settings. • At least 3+ years of experience configuring and managing scalable storage solutions including GCP Filestore. • Experience in cloud migration projects, including operating in a hybrid cloud mode.

Posted 1 week ago

Apply

10.0 years

0 Lacs

Gurgaon

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Data n’ Analytics – Data Strategy - Manager, Strategy and Transactions EY’s Data n’ Analytics team is a multi-disciplinary technology team delivering client projects and solutions across Data Management, Visualization, Business Analytics and Automation. The assignments cover a wide range of countries and industry sectors. The opportunity We’re looking for a Manager - Data Strategy. The main objective of the role is to develop and articulate a clear and concise data strategy aligned with the overall business strategy; communicate the data strategy effectively to stakeholders across the organization, ensuring buy-in and alignment; establish and maintain data governance policies and procedures to ensure data quality, security, and compliance; oversee data management activities, including data acquisition, integration, transformation, and storage; and develop and implement data quality frameworks and processes. The role will primarily involve conceptualizing, designing, developing, deploying and maintaining complex technology solutions which help EY solve business problems for clients. This role will work closely with technical architects, product and business subject matter experts (SMEs), back-end developers and other solution architects, and is also onshore-facing. Discipline Data Strategy Key Skills Strong understanding of data models (relational, dimensional), data warehousing concepts, and cloud-based data architectures (AWS, Azure, GCP). Proficiency in data analysis techniques (e.g., SQL, Python, R), statistical modeling, and data visualization tools. Familiarity with big data technologies such as Hadoop, Spark, and NoSQL databases. Client handling and communication, problem solving, systems thinking, passion for technology, adaptability, agility, analytical thinking, collaboration Skills and attributes for success 10-12 years of total experience with 8+ years in the Data Strategy and Architecture field Solid hands-on 6+ years of professional experience with designing and architecting data warehouses/data lakes on client engagements and helping create enhancements to a data warehouse Architecture design and implementation experience with medium to complex on-prem to cloud migrations with any of the major cloud platforms (preferably AWS/Azure/GCP) 5+ years' experience in Azure database offerings [relational, NoSQL, data warehouse] 5+ years' experience in various Azure services preferred – Azure Data Factory, Kafka, Azure Data Explorer, Storage, Azure Data Lake, Azure Synapse Analytics, Azure Analysis Services & Databricks Minimum of 8 years of hands-on database design, modelling and integration experience with relational data sources, such as SQL Server databases, Oracle/MySQL, Azure SQL and Azure Synapse Knowledge and direct experience using business intelligence reporting tools (Power BI, Alteryx, OBIEE, Business Objects, Cognos, Tableau, MicroStrategy, SSAS Cubes etc.) Strong creative instincts related to data analysis and visualization. Aggressive curiosity to learn the business methodology, data model and user personas. Strong understanding of BI and DWH best practices, analysis, visualization, and latest trends.
Experience with the software development lifecycle (SDLC) and principles of product development such as installation, upgrade and namespace management Willingness to mentor team members Solid analytical, technical and problem-solving skills Excellent written and verbal communication skills Strong project and people management skills with experience in serving global clients To qualify for the role, you must have Master’s Degree in Computer Science, Business Administration or equivalent work experience. Fact-driven and analytically minded with excellent attention to detail Hands-on experience with data engineering tasks such as building analytical data records, and experience manipulating and analysing large volumes of data Relevant work experience of minimum 12 to 14 years in a Big 4 or technology/consulting set-up Help incubate new finance analytics products by executing pilot and proof-of-concept projects to establish capabilities and credibility with users and clients. This may entail working either as an independent SME or as part of a larger team Ideally, you’ll also have Ability to think strategically/end-to-end with a result-oriented mindset Ability to build rapport within the firm and win the trust of clients Willingness to travel extensively and to work on client sites/practice office locations Strong experience in SQL Server and MS Excel plus at least one other SQL dialect, e.g. MS Access/PostgreSQL/Oracle PL-SQL/MySQL Strong in data structures & algorithms Experience interfacing with databases such as Azure databases, SQL Server, Oracle, Teradata etc. Preferred exposure to JSON, Cloud Foundry, Pivotal, MatLab, Spark, Greenplum, Cassandra, Amazon Web Services, Microsoft Azure, Google Cloud, Informatica, Angular JS, Python, etc. What we look for A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment An opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY SaT practices globally with leading businesses across a range of industries What we offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply