Jobs
Interviews

1346 Teradata Jobs - Page 4

Set up a job alert
JobPe aggregates results for easy access, but you apply directly on the employer's job portal.

2.0 - 7.0 years

2 - 7 Lacs

Bengaluru, Karnataka, India

On-site

In this role, you will:
- Consult with business line and enterprise functions on less complex research
- Use functional knowledge to assist in non-model quantitative tools that support strategic decision making
- Perform analysis of findings and trends using statistical analysis and document the process
- Present recommendations to increase revenue, reduce expense, and maximize operational efficiency, quality, and compliance
- Identify and define business requirements and translate data and business needs into research and recommendations to improve efficiency
- Participate in all group technology efforts, including design and implementation of database structures, analytics software, storage, and processing
- Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff
- Understand compliance and risk management requirements for the supported area
- Ensure adherence to data management and data governance regulations and policies
- Participate in company initiatives or processes to assist in meeting risk and capital objectives and other strategic goals
- Collaborate and consult with more experienced consultants and with partners in technology and other business groups

Required Qualifications:
- 2+ years of analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Responsibilities:
- Provide business and technical leadership to a reporting and analytics team delivering across multiple MMB projects
- Maintain partner relationships and ensure high-quality team deliverables and SLAs
- Work closely with US partners, interacting with multiple business partners and program managers
- Work independently and foster a culture of healthy and efficient working for the team
- Design and solve complex business problems using analytical techniques and tools
- Be directly involved in the technical build-out and/or support of databases, query tools, reporting tools, BI tools, dashboards, etc. that enable analysis, modeling, and/or advanced data visualization
- Recommend potential data sources; compile and mine data from multiple, cross-business sources; work with typically very large data sets, both structured and unstructured, from multiple sources
- Develop specific, customized reports, ad hoc analyses, and/or data visualizations formatted with business-user-friendly techniques to drive adoption, such as (1) Excel macros/pivoting/filtering, (2) PowerPoint slides and presentations, and (3) clear verbal and e-mail communications
- Work with senior consultants or directly with partners to identify and define business requirements and translate business needs into moderately complex analyses and recommendations
- Work with local and international colleagues and with internal customers to identify and define business requirements and cater to the business needs of the team
- Ensure adherence to data management/data governance regulations and policies
- Apply knowledge of business, customers, and/or products/services/portfolios to synthesize data into a story and align information to compare/contrast with industry perspective
- Ability to work overlapping hours with the US team; shift: 1:30-10:30 pm IST

Job Expectations:
- 2+ years of experience in data, reporting, or analytics, or a combination of the two; or an MS/MA degree or higher in a quantitative field such as applied math, statistics, engineering, physics, accounting, finance, economics, econometrics, computer science, or business/social and behavioral sciences with a quantitative emphasis
- 2+ years of hands-on experience in SQL, especially in Oracle, SQL Server, and Teradata environments
- 2+ years of experience in Tableau, SSRS, or any BI reporting tool, creating solutions with the aid of data visualization; this includes, but is not limited to, developing BI dashboards, working on end-to-end reports, and deriving insights from data
- Hands-on experience in ETL development using Alteryx, SSIS, or any ETL tool
- Excellent verbal, written, and interpersonal communication skills
- Good data interpretation and presentation skills
- Willingness/ability to take ownership of a project
- Exceptionally fast learner, able to reach a non-trivial SME level under tight deadlines
- High energy, can-do attitude; self-directed, proactive self-starter who excels in an environment with limited supervision
- Assist in managing multiple complex exercises in various stages simultaneously
- Assist in managing internal customer expectations while adhering to internal SLA timeframes

Desired Qualifications:
- Extensive knowledge and understanding of data research and analysis
- Strong analytical skills with high attention to detail and accuracy
- Collaborative, team-focused attitude
- 2+ years of experience in customer/marketing/sales analytics
- Experience with reporting tools like Business Objects, QlikView, etc.

Role: Analytics Consultant
Industry Type: IT Services & Consulting
Department: Data Science & Analytics
Employment Type: Full Time, Permanent
Role Category: Business Intelligence & Analytics
Education: UG: B.Tech/B.E. in Any Specialization; PG: M.A in Any Specialization
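As a flavor of the ad-hoc SQL reporting this role describes, here is a minimal sketch of a rollup query. It uses SQLite standing in for Teradata/Oracle/SQL Server (syntax differs slightly across engines), and the table, columns, and data are invented for illustration:

```python
import sqlite3

# Hypothetical sample data standing in for a marketing analytics table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, channel TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("East", "digital", 120.0), ("East", "retail", 80.0),
     ("West", "digital", 200.0), ("West", "retail", 50.0)],
)

# A typical ad-hoc rollup: total revenue by region plus a digital-channel mix %.
rows = conn.execute(
    """
    SELECT region,
           SUM(revenue) AS total,
           ROUND(100.0 * SUM(CASE WHEN channel = 'digital' THEN revenue END)
                 / SUM(revenue), 1) AS digital_pct
    FROM sales
    GROUP BY region
    ORDER BY region
    """
).fetchall()

for region, total, digital_pct in rows:
    print(region, total, digital_pct)  # e.g. East 200.0 60.0
```

The same shape of query, exported to Excel or fed to a Tableau/SSRS data source, is the bread and butter of the reporting work listed above.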

Posted 4 days ago

Apply

12.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Qualification:
- BTech degree in computer science, engineering, or a related field of study, or 12+ years of related work experience
- 7+ years of design and implementation experience with large-scale, data-centric distributed applications
- Professional experience architecting and operating cloud-based solutions, with a good understanding of core disciplines like compute, networking, storage, security, and databases
- Good understanding of data engineering concepts like storage, governance, cataloging, data quality, and data modeling
- Good understanding of architecture patterns like data lake, data lakehouse, and data mesh
- Good understanding of data warehousing concepts; hands-on experience working with tools like Hive, Redshift, Snowflake, and Teradata
- Experience migrating or transforming legacy customer solutions to the cloud
- Experience working with services like AWS EMR, Glue, DMS, Kinesis, RDS, Redshift, DynamoDB, DocumentDB, SNS, SQS, Lambda, EKS, and DataZone
- Thorough understanding of Big Data ecosystem technologies like Hadoop, Spark, Hive, and HBase, and other competent tools and technologies
- Understanding of designing analytical solutions leveraging AWS cognitive services like Textract, Comprehend, and Rekognition, in combination with SageMaker, is good to have
- Experience working with modern development workflows, such as git, continuous integration/continuous deployment pipelines, static code analysis tooling, and infrastructure-as-code
- Experience with a programming or scripting language: Python, Java, or Scala
- AWS Professional/Specialty certification or relevant cloud expertise

Role:
- Drive innovation within the Data Engineering domain by designing reusable and reliable accelerators, blueprints, and libraries
- Lead a technology team, inculcating an innovative mindset and enabling fast-paced deliveries
- Adapt to new technologies, learn quickly, and manage high ambiguity
- Work with business stakeholders; attend and drive architectural, design, and status calls with multiple stakeholders
- Exhibit good presentation skills, with a high degree of comfort speaking with executives, IT management, and developers
- Drive technology/software sales or pre-sales consulting discussions
- Ensure end-to-end ownership of all tasks assigned
- Ensure high-quality software development with complete documentation and traceability
- Fulfil organizational responsibilities (sharing knowledge and experience with other teams/groups)
- Conduct technical trainings/sessions and write whitepapers/case studies/blogs

Experience: 10 to 18 years
Job Reference Number: 12895

Posted 4 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Skillset Coverage (Mandatory):
- Strong SQL development background
- Hands-on experience in Teradata or BigQuery (at least one is mandatory)
- Ability to debug code and drive fixes end-to-end into production
- Familiarity with enterprise schedulers; Autosys preferred
- Exposure to Composer DAGs (preferred)
- Good problem-solving and troubleshooting skills
- Experience with code deployment, validation, and testing workflows

Thanks & Regards,
Prashant Awasthi
Vastika Technologies PVT LTD
9711189829
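The scheduler experience this posting asks for (Autosys boxes, Composer/Airflow DAGs) boils down to declaring job dependencies and running them in a valid order. A minimal framework-free sketch of that idea, with invented job names, using the standard-library topological sorter:

```python
from graphlib import TopologicalSorter

# Hypothetical job dependencies, in the spirit of an Autosys box or a
# Composer (Airflow) DAG: each job maps to the jobs it depends on.
deps = {
    "extract_teradata": [],
    "load_bigquery": ["extract_teradata"],
    "validate_rowcounts": ["load_bigquery"],
    "publish_report": ["validate_rowcounts"],
}

# static_order() yields a run order that respects every dependency;
# a real scheduler adds retries, calendars, and alerting on top.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

In Airflow the same chain would be written with operators and `>>` dependencies; the ordering semantics are identical.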

Posted 5 days ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

🚀 We're Hiring: Senior Software Engineer
📍 Location: Chennai | 💼 Experience: 5–7 Years | 🕒 Full-Time

Are you passionate about cutting-edge cloud technologies and modern development practices? We're looking for a Senior Software Engineer with strong experience in Java and GCP to join our innovative team and help build next-generation enterprise solutions.

🌟 What You'll Do
- Develop scalable, secure REST-based microservices using Spring Boot, Spring Cloud, and Spring MVC
- Work in Java/J2EE environments (IntelliJ/Eclipse) with exposure to front-end technologies like JavaScript, Angular, React, HTML, and XML
- Lead the development and deployment of cloud-native applications using GCP, Docker/Kubernetes, and OpenShift
- Design and manage CI/CD pipelines using tools like Jenkins, Tekton, GitHub, and Gradle
- Work with relational and NoSQL databases: PostgreSQL, SQL Server, Teradata, BigQuery
- Handle streaming data using Kafka, MQTT, and modern cloud tools (e.g., Terraform)
- Champion clean code practices: TDD, pair programming, evolutionary design, and Agile/XP methodologies
- Collaborate across teams on product delivery, technical reviews, and performance optimization

🎯 Must-Have Skills
- Strong Java (J2EE) development with 2+ years in GCP
- Experience in CI/CD, Docker, Kubernetes, and container orchestration
- Solid background in PostgreSQL and other RDBMS
- Hands-on with cloud tools (e.g., BigQuery, Terraform)
- Skilled in Git, version control, and modern build tools

✅ Preferred Skills
- Experience in Extreme Programming (XP) practices like pair programming and TDD
- Exposure to microservices design using Spring Boot
- Familiarity with data streaming tools (Kafka, MQTT)
- Knowledge of software craftsmanship and clean coding standards

🎓 Qualifications
- Required: Bachelor's Degree in Computer Science or a related field
- Preferred: Master's Degree

Why Join Us?
- Work with a modern tech stack in a fast-paced Agile environment
- Be part of a team that values innovation, collaboration, and growth
- Opportunities to lead, learn, and grow with cutting-edge cloud technologies

📩 Ready to code the future? Apply now and let's build something incredible together!

Posted 5 days ago

Apply

3.0 years

15 - 20 Lacs

Madurai, Tamil Nadu

On-site

Dear Candidate,

Greetings of the day! I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn (https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/) or email: kanthasanmugam.m@techmango.net

Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies, whose primary objective is delivering strategic solutions toward the technology goals of its business partners. We are a leading full-scale software and mobile app development company. Techmango is driven by the mantra "Client's Vision is our Mission", and we stay true to it: to be the technologically advanced and most loved organization, providing high-quality and cost-efficient services with a long-term client relationship strategy. We operate in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy). Website: https://www.techmango.net/

Job Title: GCP Data Engineer
Location: Madurai
Experience: 5+ Years
Notice Period: Immediate

About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary
As a GCP Data Engineer, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities:
- Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
- Define data strategy, standards, and best practices for cloud data engineering and analytics
- Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery
- Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
- Architect data lakes, warehouses, and real-time data platforms
- Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP)
- Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
- Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
- Provide technical leadership in architectural decisions and future-proofing of the data ecosystem

Required Skills & Qualifications:
- 5+ years of experience in data architecture, data engineering, or enterprise data platforms
- Minimum 3 years of hands-on experience with GCP data services
- Proficient in BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, and Cloud SQL/Spanner
- Python / Java / SQL
- Data modeling (OLTP, OLAP, star/snowflake schema)
- Experience with real-time data processing, streaming architectures, and batch ETL pipelines
- Good understanding of IAM, networking, security models, and cost optimization on GCP
- Prior experience leading cloud data transformation projects
- Excellent communication and stakeholder management skills

Preferred Qualifications:
- GCP Professional Data Engineer / Architect certification
- Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics
- Exposure to AI/ML use cases and MLOps on GCP
- Experience working in agile environments and client-facing roles

What We Offer:
- Opportunity to work on large-scale data modernization projects with global clients
- A fast-growing company with a strong tech and people culture
- Competitive salary, benefits, and flexibility
- Collaborative environment that values innovation and leadership

Job Type: Full-time
Pay: ₹1,500,000.00 - ₹2,000,000.00 per year
Application Question(s): Current CTC? Expected CTC? Notice period? (If you are serving your notice period, please mention your last working day.)
Experience: GCP Data Architecture: 3 years (Required); BigQuery: 3 years (Required); Cloud Composer (Airflow): 3 years (Required)
Location: Madurai, Tamil Nadu (Required)
Work Location: In person
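The streaming work this role mentions (Dataflow/Beam pipelines) centers on windowed aggregation. Here is a framework-free sketch of a fixed (tumbling) window count, with invented event data; Beam expresses the same idea declaratively with `FixedWindows` and a combiner:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Group (timestamp, key) events into fixed-size windows and count
    per key -- the core idea behind a Dataflow/Beam fixed-window
    aggregation, without the framework, watermarks, or late data."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_secs)  # align to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical click events: (epoch seconds, page).
events = [(10, "home"), (25, "home"), (70, "home"), (75, "cart")]
print(tumbling_window_counts(events))
```

A production pipeline adds event-time watermarks, triggers, and a sink such as BigQuery, but the window-assignment arithmetic is exactly this.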

Posted 5 days ago

Apply

9.0 - 15.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title: Snowflake Data Architect
Experience: 9 to 15 Years
Location: Gurugram

Job Summary:
We are seeking a highly experienced and motivated Snowflake Data Architect & ETL Specialist to join our growing Data & Analytics team. The ideal candidate will be responsible for designing scalable Snowflake-based data architectures, developing robust ETL/ELT pipelines, and ensuring data quality, performance, and security across multiple data environments. You will work closely with business stakeholders, data engineers, and analysts to drive actionable insights and ensure data-driven decision-making.

Key Responsibilities:
- Design, develop, and implement scalable Snowflake-based data architectures
- Build and maintain ETL/ELT pipelines using tools such as Informatica, Talend, Apache NiFi, Matillion, or custom Python/SQL scripts
- Optimize Snowflake performance through clustering, partitioning, and caching strategies
- Collaborate with cross-functional teams to gather data requirements and deliver business-ready solutions
- Ensure data quality, governance, integrity, and security across all platforms
- Migrate legacy data warehouses (e.g., Teradata, Oracle, SQL Server) to Snowflake
- Automate data workflows and support CI/CD deployment practices
- Implement data modeling techniques including dimensional modeling, star/snowflake schema, and normalization/denormalization
- Support and promote metadata management and data governance best practices

Technical Skills (Hard Skills):
- Expertise in Snowflake: architecture design, performance tuning, cost optimization
- Strong proficiency in SQL, Python, and scripting for data engineering tasks
- Hands-on experience with ETL tools: Informatica, Talend, Apache NiFi, Matillion, or similar
- Proficient in data modeling (dimensional, relational, star/snowflake schema)
- Good knowledge of cloud platforms: AWS, Azure, or GCP
- Familiar with orchestration and workflow tools such as Apache Airflow, dbt, or DataOps frameworks
- Experience with CI/CD tools and version control systems (e.g., Git)
- Knowledge of BI tools such as Tableau, Power BI, or Looker

Certifications (Preferred/Required):
✅ Snowflake SnowPro Core Certification – required or highly preferred
✅ SnowPro Advanced Architect Certification – preferred
✅ Cloud certifications (e.g., AWS Certified Data Analytics – Specialty, Azure Data Engineer Associate) – preferred
✅ ETL tool certifications (e.g., Talend, Matillion) – optional but a plus

Soft Skills:
- Strong analytical and problem-solving capabilities
- Excellent communication and collaboration skills
- Ability to translate technical concepts into business-friendly language
- Proactive, detail-oriented, and highly organized
- Capable of multitasking in a fast-paced, dynamic environment
- Passionate about continuous learning and adopting new technologies

Why Join Us?
- Work on cutting-edge data platforms and cloud technologies
- Collaborate with industry leaders in analytics and digital transformation
- Be part of a data-first organization focused on innovation and impact
- Enjoy a flexible, inclusive, and collaborative work culture
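The dimensional-modeling skill this posting lists comes down to joining a fact table to its dimensions. A miniature star schema, sketched with SQLite in place of Snowflake (the SQL shape carries over; all names and data are invented):

```python
import sqlite3

# One fact table, two dimension tables: the smallest possible star schema.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (date_key INTEGER, product_key INTEGER, amount REAL);
INSERT INTO dim_date VALUES (1, 2023), (2, 2024);
INSERT INTO dim_product VALUES (10, 'laptop'), (11, 'phone');
INSERT INTO fact_sales VALUES (1, 10, 900.0), (2, 10, 1100.0), (2, 11, 600.0);
""")

# The canonical star-schema query: join facts to dimensions, then aggregate.
result = db.execute("""
    SELECT d.year, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.category
    ORDER BY d.year, p.category
""").fetchall()
print(result)
```

In Snowflake itself, clustering the fact table on its most-filtered key is the performance lever the responsibilities above refer to.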

Posted 5 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

When you join Verizon

You want more out of a career. A place to share your ideas freely, even if they're daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work, and play by connecting them to what brings them joy. We do what we love, driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together, lifting our communities and building trust in how we show up, everywhere and always. Want in? Join the #VTeamLife.

What You Will Be Doing...
Commercial Data & Analytics is part of the Verizon Global Services (VGS) organization. As part of this team, you will help create and deliver comprehensive reports and insights for Site Optimization and Real-Time Marketing efforts across digital and non-digital channels for Mobile and Home customers. We are looking for an analyst to empower our Business Analytics and Insights teams to help arrive at a best-in-class customer experience in digital and non-digital channels.
- Work with extended teams like development, Adobe tagging, marketing PODs, Agile teams, etc.
- Create comprehensive reporting for A/B or MVT tests
- Generate insights and recommend to business or product owners the best experience
- Collaborate with cross-functional teams to identify and prioritize automation opportunities
- Develop and maintain automation frameworks, tools, and processes
- Monitor and measure the effectiveness of experience experiments and make relevant recommendations to stakeholders
- Create and deliver presentations to multiple audiences
- Provide technical guidance and collaborate with Development and Tagging teams
- Work hand in hand with the pods and test leads
- Locate and define new process improvement opportunities

What we're looking for...
You will need to have:
- Bachelor's degree in computer science or another technical field
- Four or more years of relevant work experience
- Deep understanding of digital analytics and campaign measurement
- In-depth knowledge and experience of experimentation and optimization, and experience in statistical analysis
- Experience with tools like Adobe Analytics, Google Analytics, etc.
- Ability to write SQL queries and scripts
- Experience working in Contentsquare, session replay, or tools with similar capabilities
- Working knowledge of One Jira or any ticketing tool
- Strong problem-solving, analytical, and research capabilities

Even better if you have one or more of the following:
- Master's degree or PhD in Math, Statistics, Engineering, Computer Science, Data Science, or a related field
- Strong verbal and written communication skills
- Knowledge of relational databases like Teradata and EDW data warehouses
- Experience with dashboard creation using visualization tools (Tableau/QLIK)

Where you'll be working: In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity: Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability, or any other legally protected characteristics.

Locations: Chennai, India | Hyderabad, India
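The A/B-test reporting and statistical analysis this role calls for usually starts with a two-proportion z-test on conversion rates. A self-contained sketch with made-up traffic numbers (a real analysis would also check sample-size assumptions and multiple-testing corrections):

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test: the standard first cut at an
    A/B conversion comparison. Returns (z statistic, p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF, via the error function.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical test: variant B converts 150/2400 vs. control A at 120/2400.
z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(round(z, 2), round(p, 4))  # z near 1.88: suggestive, not significant at 0.05
```

The recommendation to product owners then rests on this p-value together with the practical size of the lift.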

Posted 5 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Our vision is to transform how the world uses information to enrich life for all. Micron Technology is a world leader in innovating memory and storage solutions that accelerate the transformation of information into intelligence, inspiring the world to learn, communicate, and advance faster than ever.

Responsibilities include, but are not limited to:
- Strong desire to grow a career as a Data Scientist in highly automated industrial manufacturing, doing analysis and machine learning on terabytes and petabytes of diverse datasets
- Experience in the areas of statistical modeling, feature extraction and analysis, and supervised/unsupervised/semi-supervised learning
- Exposure to the semiconductor industry is a plus but not a requirement
- Ability to extract data from different databases via SQL and other query languages, applying data cleansing, outlier identification, and missing-data techniques
- Strong software development skills
- Strong verbal and written communication skills

Experience with, or desire to learn:
- Machine learning and other advanced analytical methods
- Fluency in Python and/or R
- pySpark and/or SparkR and/or sparklyr
- Hadoop (Hive, Spark, HBase)
- Teradata and/or other SQL databases
- TensorFlow and/or other statistical software, including scripting capability for automating analyses
- SSIS, ETL
- JavaScript, AngularJS 2.0, Tableau

Experience working with time-series data, images, semi-supervised learning, and data with frequently changing distributions is a plus. Experience working with Manufacturing Execution Systems (MES) is a plus. Existing papers from CVPR, NIPS, ICML, KDD, and other key conferences are a plus, but this is not a research position.

About Micron Technology, Inc.
We are an industry leader in innovative memory and storage solutions transforming how the world uses information to enrich life for all. With a relentless focus on our customers, technology leadership, and manufacturing and operational excellence, Micron delivers a rich portfolio of high-performance DRAM, NAND, and NOR memory and storage products through our Micron® and Crucial® brands. Every day, the innovations that our people create fuel the data economy, enabling advances in artificial intelligence and 5G applications that unleash opportunities from the data center to the intelligent edge and across the client and mobile user experience. To learn more, please visit micron.com/careers

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. To request assistance with the application process and/or for reasonable accommodations, please contact hrsupport_india@micron.com

Micron prohibits the use of child labor and complies with all applicable laws, rules, regulations, and other international and industry labor standards. Micron does not charge candidates any recruitment fees or unlawfully collect any other payment from candidates as consideration for their employment with Micron.

AI alert: Candidates are encouraged to use AI tools to enhance their resume and/or application materials. However, all information provided must be accurate and reflect the candidate's true skills and experiences. Misuse of AI to fabricate or misrepresent qualifications will result in immediate disqualification.

Fraud alert: Micron advises job seekers to be cautious of unsolicited job offers and to verify the authenticity of any communication claiming to be from Micron by checking the official Micron careers website.
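The data cleansing, outlier identification, and missing-data techniques mentioned above can be sketched in a few lines. This toy version uses median imputation and a z-score flag on an invented sensor series; note that with small samples an extreme point inflates the standard deviation, so the threshold needs care (robust methods like MAD are common in practice):

```python
from statistics import mean, median, stdev

def clean_series(values, z_thresh=2.0):
    """Replace missing values (None) with the median of the observed
    values, then flag points more than z_thresh standard deviations
    from the mean. A minimal sketch of imputation + outlier flagging;
    z_thresh=3.0 is typical for large samples, but small samples
    suffer masking, so 2.0 is used here."""
    observed = [v for v in values if v is not None]
    med = median(observed)
    filled = [med if v is None else v for v in values]
    mu, sigma = mean(filled), stdev(filled)
    outliers = [i for i, v in enumerate(filled)
                if abs(v - mu) > z_thresh * sigma]
    return filled, outliers

# Hypothetical sensor readings with one gap and one spike.
filled, outliers = clean_series([10.0, 11.0, None, 9.5, 10.5, 250.0])
print(filled, outliers)  # gap filled with 10.5; the 250.0 spike flagged
```

On manufacturing-scale data the same logic would run per tool or per lot in Spark rather than over a Python list.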

Posted 5 days ago

Apply

12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About The Role
As part of the AI & Data organization, the Enterprise Business Intelligence (EBI) team is central to NXP's data analytics success. We provide and maintain scalable data solutions, platforms, and methodologies that empower business users to create self-service analytics and drive data-informed decisions. We are seeking a Data Engineering Manager to lead a team of skilled Data Engineers. In this role, you will be responsible for overseeing the design, development, and maintenance of robust data pipelines and data models across multiple data platforms, including Databricks, Teradata, Postgres, and others. You will collaborate closely with Product Owners, Architects, Data Scientists, and cross-functional stakeholders to ensure high-quality, secure, and scalable data solutions.

Key Responsibilities
- Lead, mentor, and grow a team of Data Engineers, fostering a culture of innovation, collaboration, and continuous improvement
- Oversee the design, development, and optimization of ETL/ELT pipelines and data workflows across multiple cloud and on-premise environments
- Ensure data solutions align with enterprise architecture standards, including performance, scalability, security, privacy, and compliance
- Collaborate with stakeholders to translate business requirements into technical specifications and data models
- Drive adoption of best practices in data engineering, including code quality, testing, version control, and CI/CD
- Partner with the Operational Support team to troubleshoot and resolve data issues and incidents
- Stay current with emerging technologies and trends in data engineering and analytics

Required Skills & Qualifications
- Proven experience as a Data Engineer, with at least 12+ years in ETL/ELT design and development
- 5+ years of experience in a technical leadership or management role, with a track record of building and leading high-performing teams
- Strong hands-on experience with cloud platforms (AWS, Azure) and their data services (e.g., S3, Redshift, Glue, Azure Data Factory, Synapse)
- Proficiency in SQL, Python, and PySpark for data transformation and processing
- Experience with data orchestration tools and CI/CD pipelines (GitHub Actions, GitLab CI)
- Familiarity with data modeling, data warehousing, and data lake architectures
- Understanding of data governance, security, and compliance frameworks (e.g., GDPR, HIPAA)
- Excellent communication and stakeholder management skills

Preferred Skills & Qualifications
- Experience with Agile methodologies and DevOps practices
- Proficiency with Databricks, Teradata, Postgres, Fivetran HVR, and dbt
- Knowledge of AI/ML workflows and integration with data pipelines
- Experience with monitoring and observability tools
- Familiarity with data cataloging and metadata management tools (e.g., Alation, Collibra)

More information about NXP in India...

Posted 5 days ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role Overview
We are looking for an experienced Solution Architect (AI/ML & Data Engineering) to lead the design and delivery of advanced data and AI/ML solutions for our clients. The ideal candidate will have a strong background in end-to-end data architecture, AI lifecycle management, cloud technologies, and emerging Generative AI.

Responsibilities:
- Collaborate with clients to understand business requirements and design robust data solutions
- Lead the development of end-to-end data pipelines, including ingestion, storage, processing, and visualization
- Architect scalable, secure, and compliant data systems following industry best practices
- Guide data engineers, analysts, and cross-functional teams to ensure timely delivery of solutions
- Participate in pre-sales efforts: solution design, proposal creation, and client presentations
- Act as a technical liaison between clients and internal teams throughout the project lifecycle
- Stay current with emerging technologies in AI/ML, data platforms, and cloud services
- Foster long-term client relationships and identify opportunities for business expansion
- Understand and architect across the full AI lifecycle, from ingestion to inference and operations
- Provide hands-on guidance for containerization and deployment using Kubernetes
- Ensure proper implementation of data governance, modeling, and warehousing

Qualifications:
- Bachelor's or master's degree in computer science, Data Science, or a related field
- 10+ years of experience as a Data Solution Architect or in a similar role
- Deep technical expertise in data architecture, engineering, and AI/ML systems
- Strong experience with Hadoop-based platforms, ideally Cloudera Data Platform or Data Fabric
- Proven pre-sales experience: technical presentations, solutioning, and RFP support
- Proficiency in cloud platforms (Azure preferred; also AWS or GCP) and cloud-native data tools
- Exposure to Generative AI frameworks and LLMs like OpenAI and Hugging Face
- Experience in deploying and managing applications on Kubernetes (AKS, EKS, GKE)
- Familiarity with data governance, data modeling, and large-scale data warehousing
- Excellent problem-solving, communication, and client-facing skills

Technology Architecture & Engineering:
- Hadoop ecosystem: Cloudera Data Platform, Data Fabric, HDFS, Hive, Spark, HBase, Oozie
- ETL & integration: Apache NiFi, Talend, Informatica, Azure Data Factory, AWS Glue
- Warehousing: Azure Synapse, Redshift, BigQuery, Snowflake, Teradata, Vertica
- Streaming: Apache Kafka, Azure Event Hubs, AWS
- Cloud platforms: Azure (preferred), AWS, GCP
- Data lakes: ADLS, AWS S3, Google Cloud
- Platforms: Data Fabric, AI Essentials, Unified Analytics, MLDM, MLDE
- AI/ML & GenAI lifecycle tools: MLflow, Kubeflow, Azure ML, SageMaker, Ray
- Inference: TensorFlow Serving, KServe, Seldon
- Generative AI: Hugging Face, LangChain, OpenAI API (GPT-4, etc.)

DevOps & Deployment:
- Kubernetes: AKS, EKS, GKE, open-source K8s, Helm
- CI/CD: Jenkins, GitHub Actions, GitLab CI, Azure DevOps

(ref:hirist.tech)

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Engineer, you will be responsible for designing, building, and maintaining data pipelines to ensure data quality and reliability. Your main tasks will include data cleansing, imputation, mapping to standard data models, transforming data to meet business rules and statistical computations, and validating data content. You will develop, modify, and maintain Python and Unix scripts, as well as complex SQL queries. Performance tuning of existing code to enhance efficiency and avoid bottlenecks will be a key part of your role.

Building end-to-end data flows from various sources to curated and enhanced datasets will be crucial. Additionally, you will develop automated Python jobs for data ingestion and provide technical expertise in architecture, design, and implementation. Collaborating with team members, you will create insightful reports and dashboards to improve processes and add value. SQL query writing for data validation and designing ETL processes for data extraction, transformation, and loading will also be part of your responsibilities.

You will work closely with data architects, analysts, and stakeholders to understand data requirements and ensure data quality. Optimizing and tuning ETL processes for performance and scalability will be essential. Maintaining documentation for ETL processes, data flows, and mappings, as well as monitoring and troubleshooting ETL processes to ensure data accuracy and availability, will be key responsibilities. Implementing data validation and error-handling mechanisms to maintain data integrity and consistency will also be part of your role.

Required Skills:
- Python
- ETL tools like Informatica, Talend, SSIS, or similar
- SQL, MySQL
- Expertise in Oracle, SQL Server, and Teradata
- DevOps, GitLab
- Experience in AWS Glue or Azure Data Factory

If you are passionate about data engineering and have the skills mentioned above, we would love to have you on our team!
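The validation and error-handling mechanisms described above typically route bad records to a reject queue rather than failing the whole load. A minimal sketch of that pattern, with invented field names and rules:

```python
def transform_rows(rows):
    """Validate and transform raw rows, sending bad records to a reject
    list with a reason, so one malformed row never aborts the batch.
    Field names and the negative-amount rule are illustrative."""
    loaded, rejects = [], []
    for row in rows:
        try:
            amount = float(row["amount"])      # type check / coercion
            if amount < 0:
                raise ValueError("negative amount")
            loaded.append({"id": int(row["id"]), "amount": round(amount, 2)})
        except (KeyError, ValueError, TypeError) as exc:
            rejects.append((row, str(exc)))    # keep the row + reason for triage
    return loaded, rejects

# Hypothetical raw extract: one good row, one rule violation, one missing field.
raw = [{"id": "1", "amount": "19.99"},
       {"id": "2", "amount": "-4"},
       {"id": "3"}]
loaded, rejects = transform_rows(raw)
print(loaded, len(rejects))
```

In a production ETL job the rejects list would land in an error table or dead-letter location, with counts reconciled against the source as a validation step.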

Posted 6 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Senior Manager of Software Engineering at JPMorgan Chase within the Consumer and Community Banking – Data Technology team, you lead a technical area and drive impact within teams, technologies, and projects across departments. Utilize your in-depth knowledge of software, applications, technical processes, and product management to drive multiple complex projects and initiatives, while serving as a primary decision maker for your teams and a driver of innovation and solution delivery. Job Responsibilities Leads the data publishing and processing platform engineering team to achieve business & technology objectives Accountable for technical tool evaluation, platform builds, and design & delivery outcomes Carries governance accountability for coding decisions, control obligations, and measures of success such as cost of ownership, maintainability, and portfolio operations Delivers technical solutions that can be leveraged across multiple businesses and domains Influences peer leaders and senior stakeholders across the business, product, and technology teams Champions the firm’s culture of diversity, equity, inclusion, and respect Required Qualifications, Capabilities, And Skills Formal training or certification on software engineering concepts and 5+ years of applied experience. In addition, 2+ years of experience leading technologists to manage and solve complex technical items within your domain of expertise Expertise in programming languages such as Python and Java, with a strong understanding of cloud services including AWS, EKS, SNS, SQS, CloudFormation, Terraform, and Lambda. Proficient in messaging services like Kafka and big data technologies such as Hadoop, Spark SQL, and PySpark. Experienced with Teradata, Snowflake, or other RDBMS databases.
Advanced experience in leading technologists to manage, anticipate, and solve complex technical challenges, along with experience in developing and recognizing talent within cross-functional teams. Experience in leading a product as a Product Owner or Product Manager, with practical cloud-native experience. Preferred Qualifications, Capabilities, And Skills Previous experience leading/building Platforms & Frameworks teams Skilled in orchestration tools like Airflow (preferable) or Control-M, and experienced in continuous integration and continuous deployment (CI/CD) using Jenkins. Experience with observability tools, frameworks, and platforms. Experience with large-scale, secure, distributed, complex architecture and design Experience with nonfunctional topics like security, performance, and code and design best practices AWS Certified Solutions Architect, AWS Certified Developer, or similar certification is a big plus.

Posted 6 days ago

Apply

5.0 - 10.0 years

50 - 55 Lacs

Vijayawada, Visakhapatnam, Guntur

Work from Office

We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Senior Manager of Software Engineering at JPMorgan Chase within the Consumer and Community Banking Data Technology team, you lead a technical area and drive impact within teams, technologies, and projects across departments. Utilize your in-depth knowledge of software, applications, technical processes, and product management to drive multiple complex projects and initiatives, while serving as a primary decision maker for your teams and a driver of innovation and solution delivery. Job Responsibilities Leads the data publishing and processing platform engineering team to achieve business & technology objectives Accountable for technical tool evaluation, platform builds, and design & delivery outcomes Carries governance accountability for coding decisions, control obligations, and measures of success such as cost of ownership, maintainability, and portfolio operations Delivers technical solutions that can be leveraged across multiple businesses and domains Influences peer leaders and senior stakeholders across the business, product, and technology teams Champions the firm's culture of diversity, equity, inclusion, and respect Required qualifications, capabilities, and skills Formal training or certification on software engineering concepts and 5+ years of applied experience. In addition, 2+ years of experience leading technologists to manage and solve complex technical items within your domain of expertise Expertise in programming languages such as Python and Java, with a strong understanding of cloud services including AWS, EKS, SNS, SQS, CloudFormation, Terraform, and Lambda. Proficient in messaging services like Kafka and big data technologies such as Hadoop, Spark SQL, and PySpark. Experienced with Teradata, Snowflake, or other RDBMS databases.
Advanced experience in leading technologists to manage, anticipate, and solve complex technical challenges, along with experience in developing and recognizing talent within cross-functional teams. Experience in leading a product as a Product Owner or Product Manager, with practical cloud-native experience. Preferred qualifications, capabilities, and skills Previous experience leading/building Platforms & Frameworks teams Skilled in orchestration tools like Airflow (preferable) or Control-M, and experienced in continuous integration and continuous deployment (CI/CD) using Jenkins. Experience with observability tools, frameworks, and platforms. Experience with large-scale, secure, distributed, complex architecture and design Experience with nonfunctional topics like security, performance, and code and design best practices AWS Certified Solutions Architect, AWS Certified Developer, or similar certification is a big plus.

Posted 6 days ago

Apply

2.0 - 7.0 years

12 - 16 Lacs

Mumbai

Work from Office

Context KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment. We are creating a strategic solution architecture horizontal team to own, translate and drive this vision into various verticals, business or technology capability block owners and strategic projects. Job Description Role Objective: The Senior ETL Developer will design, develop, and optimize Talend data pipelines, ensuring the seamless integration of data from multiple sources to provide actionable insights for informed decision-making across the organization. A sound understanding of databases to store structured and unstructured data with optimized modelling techniques is required, as is good exposure to the data catalog and data quality modules of any leading product (preferably Talend). Location - Mumbai Years of Experience - 3-5 yrs Roles & Responsibilities: Business Understanding: Collaborate with business analysts and stakeholders to understand business needs and translate them into ETL solutions. Arch/Design Documentation: Develop comprehensive architecture and design documentation for the data landscape. Dev Testing & Solution: Implement and oversee development testing to ensure the reliability and performance of the solution. Provide solutions to identified issues and continuously improve application performance. Understanding Coding Standards, Compliance & Infosecurity: Adhere to coding standards and ensure compliance with information security protocols and best practices. Non-functional Requirements: Address non-functional requirements such as performance, scalability, security, and maintainability in the design and development of the Talend-based ETL solution.
Technical Skills:
Core tool exposure: Talend Data Integrator, Talend Data Catalog, Talend Data Quality, relational databases (PostgreSQL, SQL Server, etc.)
Core concepts: ETL, data load strategy, data modelling, data governance and management, query optimization and performance enhancement
Cloud exposure: experience working with one of the cloud service providers (AWS, Azure, GCP, OCI, etc.)
SQL skills: extensive knowledge and hands-on experience with SQL, query tuning, optimization, and best practices
Soft skills: very good communication and presentation skills; able to articulate ideas and convince key stakeholders; able to guide and upskill team members
Good to Have:
Programming languages: knowledge and hands-on experience with languages like Python and R
Relevant certifications related to the role
Total Experience:
1. Total experience in data engineering and data lake/data warehouse between 2-7 years
2. Primary skill set in any ETL tool (SSIS / Talend / Informatica / DataStage)
3. Good experience working on any database (Teradata, Snowflake, Oracle, SQL Server, Sybase IQ, PostgreSQL, Redshift, Synapse) and able to write SQL queries/scripts
4. Good team player; able to own assigned work and take it to closure
5. Clear understanding of data lake/data warehouse concepts
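To make the "data load strategy" concept above concrete, here is a minimal watermark-based incremental extract, sketched in Python against an in-memory SQLite table. The table and column names are invented for illustration; a real Talend or warehouse job would apply the same idea against the actual source system.

```python
# Hypothetical sketch of a watermark-based incremental load: only rows
# changed since the last successful load are extracted, and the watermark
# advances to the newest change seen. Names are invented for the example.
import sqlite3

src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, updated_at TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, "2024-01-01"), (2, "2024-02-01"), (3, "2024-03-01")])

def incremental_extract(conn, watermark):
    """Pull only rows changed since the last successful load."""
    rows = conn.execute(
        "SELECT id, updated_at FROM orders WHERE updated_at > ? ORDER BY id",
        (watermark,),
    ).fetchall()
    # Advance the watermark to the newest change seen (or keep it if none).
    new_watermark = max((r[1] for r in rows), default=watermark)
    return rows, new_watermark

rows, wm = incremental_extract(src, "2024-01-15")
assert [r[0] for r in rows] == [2, 3]
assert wm == "2024-03-01"
```

The persisted watermark is what makes the load restartable: rerunning with the stored value extracts nothing new unless the source has changed.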

Posted 6 days ago

Apply

3.0 - 5.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Lowe's is a FORTUNE 100 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe's operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe's supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts and providing disaster relief to communities in need. For more information, visit Lowes.com. Lowe's India, the Global Capability Center of Lowe's Companies Inc., is a hub for driving our technology, business, analytics, and shared services strategy. Based in Bengaluru with over 4,500 associates, it powers innovations across omnichannel retail, AI/ML, enterprise architecture, supply chain, and customer experience. From supporting and launching homegrown solutions to fostering innovation through its Catalyze platform, Lowe's India plays a pivotal role in transforming home improvement retail while upholding a strong commitment to social impact and sustainability. For more information, visit Lowes India. About Team The Customer Insights team uses data and analytics capabilities to provide strategic insights around the customer and competitive landscape to help drive effective market strategy for Lowe's. The team works closely with various key functions across the business to provide insights and recommendations across different business areas. The team is responsible for tracking, reporting and analyzing various customer and market related metrics and generating actionable insights on key customer segments, customer experience vis-à-vis Lowe's and our market position, along with opportunities to help inform an effective Go To Market Strategy and win market share across key markets, customer segments and product categories.
Job Summary: The primary responsibility is to report, analyze and provide insights on Customer Experience at Lowe's across selling channels, customer segments and product categories. The individual will apply analytical methods to combine internal and external data and analyze trends in Customer Experience Metrics and the factors that play a key role in driving improvement in those metrics. This position is responsible for following best practices in turning business questions into data analysis, analyzing results and identifying insights for decision making; determining additional research/analytics that may be needed to enhance the solution; and coordinating with cross-functional teams to ensure project continuity and strategic alignment. The individual will also proactively take up initiatives to apply modern tools and techniques to improve the efficiency, accuracy and overall quality of insights offered to stakeholders. Roles & Responsibilities: Core Responsibilities: Analyze Customer feedback, LTR and NPS data to understand in-market trends and where to focus to improve Customer Experience. Work with our US (Mooresville) team to assist them in defining various reporting/analysis needs and building appropriate methodologies to provide actionable insights on Experience Metrics. Identify the appropriate univariate and multivariate analysis to surface key customer trends and insights, such as Segmentation, Bayesian Networks, Factor analysis, etc. Synthesize disparate sources of data, primary and secondary, to develop cohesive stories, trends and insights that will drive strategic decision making across the enterprise. Leverage available information across workstreams to help connect the dots and provide holistic insights on Customer Experience trends and strategy.
Work with the data operations teams to enhance data capabilities and develop tools to improve ease of access to data and BI for the broader organization Years of Experience: 3-5 years of hands-on experience with Customer Experience Analytics / Customer Analytics / Customer Insights Education Qualification & Certifications (optional) Required Minimum Qualifications: Master's degree in Economics / Statistics / Analytics or MBA in Marketing Skill Set Required Primary Skills (must have) Hands-on experience in SQL, Teradata, Hadoop, Python Hands-on analytics experience building statistical/mathematical models and multivariate analysis like Segmentation, Logistic Regression, Bayesian Networks, Factor analysis, Conjoint analysis, etc. Ability to apply analytical tools and techniques to extract insights from structured and unstructured data. Consulting skills: ability to impact business decisions through analytics and research. Hands-on experience creating executive-level, audience-ready presentations that tell impactful stories. Excellent communication skills to connect with people from diverse backgrounds and experience Secondary Skills (desired) Experience working with text data would be an advantage. Experience working with Customer Experience or Voice of Customer metrics is good to have. Familiarity with the retail industry and key concepts
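For reference, the NPS metric mentioned in this role is conventionally computed from 0-10 likelihood-to-recommend (LTR) responses as the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6). A minimal sketch with made-up survey responses:

```python
# Conventional Net Promoter Score from 0-10 LTR survey responses.
# The sample responses below are invented for illustration.

def nps(scores):
    """NPS = % promoters (9-10) minus % detractors (0-6), as a percentage."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores), 1)

responses = [10, 9, 8, 7, 6, 10, 3, 9]
# promoters: 10, 9, 10, 9 -> 4; detractors: 6, 3 -> 2; (4 - 2) / 8 = 25.0
assert nps(responses) == 25.0
```

Scores of 7-8 (passives) count toward the denominator but neither group, which is why NPS can move even when the passive share is all that changes.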

Posted 6 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Position Title: Senior Software Engineer 34258 Job Type: Full-Time Work Mode: Hybrid Location: Chennai Budget: ₹21 LPA Notice Period: Immediate Joiners Preferred Role Overview We are seeking a seasoned Senior Software Engineer with strong expertise in Java/J2EE and Google Cloud Platform (GCP). The role involves the development and optimization of microservices, cloud-native applications, and robust backend systems using modern cloud technologies. The ideal candidate will have a solid track record in building scalable applications and cloud solutions using CI/CD, Docker, and PostgreSQL. Key Responsibilities Design and develop REST-based microservices using Spring Boot, Spring Cloud, and Java. Work on full-stack applications, including frontend components built with JavaScript, Angular, or React. Utilize containerization and orchestration tools such as Docker, Kubernetes, and OpenShift. Develop in cloud environments, primarily Google Cloud Platform (GCP), and optionally Azure. Build CI/CD pipelines using tools like Tekton, Jenkins, GitHub, and Gradle. Work with relational databases like PostgreSQL, SQL Server, Teradata, and big data tools like BigQuery. Implement and optimize cloud infrastructure using tools such as Terraform. Embrace and promote Test-Driven Development (TDD), clean coding practices, and Agile methodologies. Collaborate in paired or mob programming sessions and contribute to a lean, MVP-focused development cycle.
Must-Have Skills Java / J2EE (5+ years) GCP (2+ years hands-on experience) Spring Boot, Spring MVC, Spring Cloud PostgreSQL and SQL data manipulation Docker, Kubernetes, and container orchestration CI/CD tools: GitHub, Tekton, Jenkins Experience with REST APIs, microservices architecture Nice-to-Have Skills Angular / React / JavaScript Experience with Kafka and MQTT Exposure to big data tools and cloud optimization Familiarity with Clean Code, XP practices, and software craftsmanship Experience Required 5 to 7 years of experience in software engineering, preferably with strong backend and cloud exposure Education Bachelor’s degree in Computer Science, Engineering, or a related field Master’s degree (Preferred)

Posted 6 days ago

Apply

0 years

5 - 8 Lacs

Thiruvananthapuram

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. About EY GDS Global Delivery Services (GDS) is a dynamic and truly global delivery network. Across our six locations, we work with teams from all EY service lines, geographies and sectors, and play a vital role in the delivery of the EY growth strategy. We operate from six countries and sixteen cities: Argentina (Buenos Aires) China (Dalian) India (Bangalore, Chennai, Gurgaon, Hyderabad, Kochi, Kolkata, Mumbai, Noida, Trivandrum) Philippines (Manila) Poland (Warsaw and Wroclaw) UK (Manchester, Liverpool) Careers in EY Global Delivery Services Join a team of over 50,000 people, working across borders, to provide innovative and strategic business solutions to EY member firms around the world. Join one of our dynamic teams From accountants to coders, we offer a wide variety of fulfilling career opportunities that span all business disciplines Our Consulting practice provides differentiated focus on the key business themes to help our clients solve better questions around technology. Our vision is to be recognized as a leading provider of differentiated technology consulting services, harnessing new disruptive technology, alliances and attracting talented people to solve our clients' issues. It's an exciting time to join us and grow your career as a technology professional. A technology career is about far more than leading-edge innovations. It’s about the application of these technologies in the real world to make a real, meaningful impact. 
We are looking for highly motivated, articulate individuals who have skills across the technology lifecycle and are passionate about designing innovative solutions to solve complex business problems. Your career in Consulting can span these technology areas/service lines: Digital Technologies: We are a globally integrated digital architecture and engineering team. Our mission is to deliver tailored, custom-built end-to-end solutions to our customers that are Digital, Cloud Native and Open Source. Our skills include Experience design, UI development, Design Thinking, Architecture & Design, Full stack development (.Net/ Java/ SharePoint/ Power Platform), and emerging technologies like Blockchain, IoT, AR/VR, Drones, Cloud and DevSecOps. We use industrialized techniques, built on top of agile methods, utilizing our global teams to deliver end-to-end solutions at the best unit cost proposition. Testing Services: We are the yardstick of quality for software products. We break things to make the product stronger and more successful. We provide the entire gamut of testing services, including Business / User acceptance testing. Hence this is a team with all-round skills, such as functional, technical and process. Data & Analytics: Data and Analytics is amongst the largest and most versatile practices within EY. Our sector and domain expertise combined with technical skills in data, cloud, advanced analytics and artificial intelligence differentiates us in the industry. Our talented team possesses cross-sector and cross-domain expertise and a wide array of skills in Information Management (IM), Business Intelligence (BI), Advanced Analytics (AA) and Artificial Intelligence (AI). Oracle: We provide a one-stop solution for end-to-end project implementation enabled by Oracle and IBM Products. We use proven methodologies, tools and accelerators to jumpstart and support large Risk and Finance Transformations.
We develop solutions using various languages such as SQL or PL/ SQL, Java, Java Script, Python, IBM Maximo and other Oracle Utilities. We also provide consulting services for streamlining the current reporting process using various Enterprise Performance Management tools. SAP: By building on SAP’s S/4HANA digital core and cloud services, EY and SAP are working to help organizations leverage industry-leading technologies to improve operational performance. This collaboration helps drive digital transformation for our clients across areas including finance, human resources, supply chain and procurement. Our goal is to support clients as they initiate or undergo major transformation. Our capabilities span end-to-end solution implementation services from strategy and architecture to production deployment. EY supports clients in three main areas, Technology implementation support, Enterprise and Industry application implementation, Governance Risk Compliance (GRC) Technology. Banking and Capital Market Services: Banking and Capital Market Services companies are transforming their complex tax and finance functions with technologies such as AI and ML. With the right blend of core competencies, tax and finance personnel will shift to data, process and technology skills to service global clients on their Core Banking Platforms and support their business / digital transformation like Deposit system replacements, lending / leasing modernization, Cloud–native architecture (Containerization) etc. Wealth and Asset Management: We help our clients thrive in a transformative age by providing innovative services to global and domestic asset management clients to increase efficiency, effectiveness and manage the overall impact on bottom line profitability by leveraging the technology, data and digital teams. 
We do many operational efficiency programs and Technology Enabled Transformation to re-platform their front and Back offices with emerging technologies like AI, ML, Blockchain etc. Insurance Transformation: The current changing Macroeconomic trends continue to challenge Insurers globally. However, with disruptive technologies – including IoT, autonomous vehicles, Blockchain etc, we help companies through these challenges and create innovative strategies to transform their business through technology enabled transformation programs. We provide end to end services to Global P&C (General), Life and Health Insurers, Reinsurers and Insurance brokers. Cyber Security: The ever-increasing risk and complexity surrounding cybersecurity and privacy has put cybersecurity at the top of the agenda for senior management, the Board of Directors, and regulators. We help our clients to understand and quantify their cyber risk, prioritize investments, and embed security, privacy and resilience into every digitally-enabled initiative – from day one. Technology Risk: A practice that is a unique, industry-focused business unit that provides a broad range of integrated services where you’ll contribute technically to IT Risk and Assurance client engagements and internal projects. An important part of your role will be to actively establish, maintain and strengthen internal and external relationships. You’ll also identify potential business opportunities for EY within existing engagements and escalate these as appropriate. Similarly, you’ll anticipate and identify risks within engagements and share any issues with senior members of the team. 
Behavioral Competencies: Adaptive to team and fosters collaborative approach Innovative approach to the project, when required Shows passion and curiosity, desire to learn and can think digital Agile mindset and ability to multi-task Must have an eye for detail Skills needed: Should have understanding and/or experience of software development best practices and software development life cycle Understanding of one/more programming languages such as Java/ .Net/ Python, data analytics or databases such as SQL/ Oracle/ Teradata etc. Internship in a relevant technology domain will be an added advantage Qualification: BE - B. Tech / (IT/ Computer Science/ Circuit branches) Should have secured 60% and above No active Backlogs EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 6 days ago

Apply

0 years

5 - 8 Lacs

Thiruvananthapuram

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. About EY GDS Global Delivery Services (GDS) is a dynamic and truly global delivery network. Across our six locations, we work with teams from all EY service lines, geographies and sectors, and play a vital role in the delivery of the EY growth strategy. We operate from six countries and sixteen cities: Argentina (Buenos Aires) China (Dalian) India (Bangalore, Chennai, Gurgaon, Hyderabad, Kochi, Kolkata, Mumbai, Noida, Trivandrum) Philippines (Manila) Poland (Warsaw and Wroclaw) UK (Manchester, Liverpool) Careers in EY Global Delivery Services Join a team of over 50,000 people, working across borders, to provide innovative and strategic business solutions to EY member firms around the world. Join one of our dynamic teams From accountants to coders, we offer a wide variety of fulfilling career opportunities that span all business disciplines Our Consulting practice provides differentiated focus on the key business themes to help our clients solve better questions around technology. Our vision is to be recognized as a leading provider of differentiated technology consulting services, harnessing new disruptive technology, alliances and attracting talented people to solve our clients' issues. It's an exciting time to join us and grow your career as a technology professional. A technology career is about far more than leading-edge innovations. It’s about the application of these technologies in the real world to make a real, meaningful impact. 
We are looking for highly motivated, articulate individuals who have the skills to the technology lifecycle and are passionate about designing innovative solutions to solve complex business problems. Your career in Consulting can span across these technology areas/ services lines: Digital Technologies: We are a globally integrated digital architecture and engineering team. Our mission is to deliver tailored, custom-built end to end solutions to our customers that are Digital, Cloud Native and Open Source. Our skills include Experience design, UI development, Design Thinking, Architecture & Design, Full stack development (.Net/ Java/ SharePoint/ Power Platform), Emerging Technologies like Block Chain, IoT, AR\VR, Drones, Cloud and DevSecOps. We use industrialized techniques, built on top of agile methods utilizing our global teams to deliver end to end solutions at best unit cost proposition. Testing Services: We are the yardstick of quality software product. We break something to make the product stronger and successful. We provide entire gamut of testing services including Busines / User acceptance testing. Hence this is a team with all round skills such as functional, technical and process. Data & Analytics: Data and Analytics is amongst the largest and most versatile practices within EY. Our sector and domain expertise combined with technical skills in data, cloud, advanced analytics and artificial intelligence differentiates us in the industry. Our talented team possesses cross-sector and cross-domain expertise and a wide array of skills in Information Management (IM), Business Intelligence (BI), Advance Analytics (AA) and Artificial Intelligence (AI) Oracle: We provide one-stop solution for end-to-end project implementation enabled by Oracle and IBM Products. We use proven methodologies, tools and accelerators to jumpstart and support large Risk and Finance Transformation. 
We develop solutions using various languages such as SQL or PL/SQL, Java, JavaScript, Python, IBM Maximo and other Oracle utilities. We also provide consulting services for streamlining the current reporting process using various Enterprise Performance Management tools. SAP: By building on SAP’s S/4HANA digital core and cloud services, EY and SAP are working to help organizations leverage industry-leading technologies to improve operational performance. This collaboration helps drive digital transformation for our clients across areas including finance, human resources, supply chain and procurement. Our goal is to support clients as they initiate or undergo major transformation. Our capabilities span end-to-end solution implementation services from strategy and architecture to production deployment. EY supports clients in three main areas: Technology implementation support, Enterprise and Industry application implementation, and Governance Risk Compliance (GRC) Technology. Banking and Capital Market Services: Banking and Capital Market Services companies are transforming their complex tax and finance functions with technologies such as AI and ML. With the right blend of core competencies, tax and finance personnel will shift to data, process and technology skills to service global clients on their Core Banking Platforms and support their business/digital transformation, such as deposit system replacements, lending/leasing modernization, Cloud-native architecture (Containerization), etc. Wealth and Asset Management: We help our clients thrive in a transformative age by providing innovative services to global and domestic asset management clients to increase efficiency and effectiveness and manage the overall impact on bottom-line profitability by leveraging technology, data and digital teams.
We deliver many operational efficiency programs and Technology-Enabled Transformations to re-platform clients' front and back offices with emerging technologies like AI, ML, Blockchain, etc. Insurance Transformation: Current, changing macroeconomic trends continue to challenge insurers globally. However, with disruptive technologies, including IoT, autonomous vehicles, Blockchain, etc., we help companies through these challenges and create innovative strategies to transform their business through technology-enabled transformation programs. We provide end-to-end services to Global P&C (General), Life and Health Insurers, Reinsurers and Insurance brokers. Cyber Security: The ever-increasing risk and complexity surrounding cybersecurity and privacy has put cybersecurity at the top of the agenda for senior management, the Board of Directors, and regulators. We help our clients to understand and quantify their cyber risk, prioritize investments, and embed security, privacy and resilience into every digitally-enabled initiative, from day one. Technology Risk: A unique, industry-focused business unit that provides a broad range of integrated services where you’ll contribute technically to IT Risk and Assurance client engagements and internal projects. An important part of your role will be to actively establish, maintain and strengthen internal and external relationships. You’ll also identify potential business opportunities for EY within existing engagements and escalate these as appropriate. Similarly, you’ll anticipate and identify risks within engagements and share any issues with senior members of the team.
Behavioral Competencies: Adaptive to the team and fosters a collaborative approach Innovative approach to the project, when required Shows passion and curiosity, desire to learn and can think digital Agile mindset and ability to multi-task Must have an eye for detail Skills needed: Should have an understanding and/or experience of software development best practices and the software development life cycle Understanding of one or more programming languages such as Java/.Net/Python, and data analytics or databases such as SQL/Oracle/Teradata, etc. Internship in a relevant technology domain will be an added advantage Qualification: BE - B.Tech (IT/Computer Science/Circuit branches) Should have secured 60% and above No active backlogs EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 6 days ago

Apply

5.0 years

0 Lacs

Hyderābād

On-site

JOB DESCRIPTION We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Senior Manager of Software Engineering at JPMorgan Chase within the Consumer and Community Banking – Data Technology team, you lead a technical area and drive impact within teams, technologies, and projects across departments. Utilize your in-depth knowledge of software, applications, technical processes, and product management to drive multiple complex projects and initiatives, while serving as a primary decision maker for your teams and a driver of innovation and solution delivery. Job Responsibilities Leads the Data publishing and processing platform engineering team to achieve business & technology objectives Accountable for technical tool evaluation, platform builds, and design & delivery outcomes Carries governance accountability for coding decisions, control obligations, and measures of success such as cost of ownership, maintainability, and portfolio operations Delivers technical solutions that can be leveraged across multiple businesses and domains Influences peer leaders and senior stakeholders across the business, product, and technology teams Champions the firm’s culture of diversity, equity, inclusion, and respect Required qualifications, capabilities, and skills Formal training or certification on software engineering concepts and 5+ years applied experience. In addition, 2+ years of experience leading technologists to manage and solve complex technical items within your domain of expertise Expertise in programming languages such as Python and Java, with a strong understanding of cloud services including AWS, EKS, SNS, SQS, CloudFormation, Terraform, and Lambda. Proficient in messaging services like Kafka and big data technologies such as Hadoop, Spark-SQL, and PySpark. Experienced with Teradata, Snowflake, or other RDBMS databases, with a solid understanding of at least one.
Advanced experience in leading technologists to manage, anticipate, and solve complex technical challenges, along with experience in developing and recognizing talent within cross-functional teams. Experience in leading a product as a Product Owner or Product Manager, with practical cloud-native experience. Preferred qualifications, capabilities, and skills Previous experience leading/building Platforms & Frameworks teams Skilled in orchestration tools like Airflow (preferable) or Control-M, and experienced in continuous integration and continuous deployment (CI/CD) using Jenkins. Experience with Observability tools, frameworks and platforms. Experience with large-scale, secure, distributed architecture and design Experience with nonfunctional topics like security, performance, and code and design best practices AWS Certified Solutions Architect, AWS Certified Developer, or similar certification is a big plus.
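The posting pairs a message bus (Kafka) with Spark-style aggregation. As a minimal, library-free sketch of the underlying produce/consume-then-aggregate pattern (the queue, field names, and sentinel convention here are illustrative, not from the posting):

```python
import queue
import threading

def produce(q, events):
    """Publish events to the queue (a stand-in for a Kafka topic)."""
    for e in events:
        q.put(e)
    q.put(None)  # sentinel: end of stream

def consume(q, totals):
    """Drain the queue and sum amounts per account (a stand-in for a Spark-SQL group-by)."""
    while True:
        e = q.get()
        if e is None:
            break
        totals[e["account"]] = totals.get(e["account"], 0) + e["amount"]

q = queue.Queue()
totals = {}
events = [
    {"account": "A", "amount": 10},
    {"account": "B", "amount": 5},
    {"account": "A", "amount": 7},
]
t = threading.Thread(target=consume, args=(q, totals))
t.start()
produce(q, events)
t.join()
print(totals)  # {'A': 17, 'B': 5}
```

A real platform would replace the in-process queue with Kafka partitions and the dictionary with a distributed aggregation, but the decoupling of producers from consumers is the same idea.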

Posted 6 days ago

Apply

8.0 - 12.0 years

0 Lacs

Andhra Pradesh

On-site

ABOUT EVERNORTH: Evernorth℠ exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable and simple health care, we solve the problems others don’t, won’t or can’t. Our innovation hub in India will allow us to work with the right talent, expand our global footprint, improve our competitive stance, and better deliver on our promises to stakeholders. We are passionate about making healthcare better by delivering world-class solutions that make a real difference. We are always looking upward. And that starts with finding the right talent to help us get there. Position Overview Excited to grow your career? This position’s primary responsibility will be to translate software requirements into functions using Mainframe, ETL, and Data Engineering, with expertise in Databricks and database technologies. This position offers the opportunity to work on modernizing legacy systems, contribute to cloud infrastructure automation, and support production systems in a fast-paced, agile environment. You will work across multiple teams and technologies to ensure reliable, high-performance data solutions that align with business goals. As a Mainframe & ETL Engineer, you will be responsible for the end-to-end development and support of data processing solutions using tools such as Talend, Ab Initio, AWS Glue, and PySpark, with significant work on Databricks and modern cloud data platforms. You will support infrastructure provisioning using Terraform, assist in modernizing legacy systems including mainframe migration, and contribute to performance tuning of complex SQL queries across multiple database platforms including Teradata, Oracle, Postgres, and DB2. You will also be involved in CI/CD practices. Responsibilities Support, maintain and participate in the development of software utilizing technologies such as COBOL, DB2, CICS and JCL.
Support, maintain and participate in the ETL development of software utilizing technologies such as Talend, Ab Initio, Python, and PySpark on Databricks. Work with Databricks to design and manage scalable data processing solutions. Implement and support data integration workflows across cloud (AWS) and on-premises environments. Support cloud infrastructure deployment and management using Terraform. Participate in the modernization of legacy systems, including mainframe migration. Perform complex SQL queries and performance tuning on large datasets. Contribute to CI/CD pipelines, version control, and infrastructure automation. Provide expertise, tools, and assistance to operations, development, and support teams for critical production issues and maintenance. Troubleshoot production issues, diagnose the problem, and implement a solution; serve as the first line of defense in finding the root cause. Work cross-functionally with the support team, development team and business team to efficiently address customer issues. Active member of a high-performance software development and support team in an agile environment. Engaged in fostering and improving organizational culture. Qualifications Required Skills: Strong analytical and technical skills. Proficiency in Databricks, including notebook development, Delta Lake, and Spark-based processing. Experience with mainframe modernization or migrating legacy systems to modern data platforms. Strong programming skills, particularly in PySpark for data processing. Familiarity with data warehousing concepts and cloud-native architecture. Solid understanding of Terraform for managing infrastructure as code on AWS. Familiarity with CI/CD practices and tools (e.g., Git, Jenkins). Strong SQL knowledge on OLAP DB platforms (Teradata, Snowflake) and OLTP DB platforms (Oracle, DB2, Postgres, SingleStore).
Strong experience with Teradata SQL and utilities Strong experience with Oracle, Postgres and DB2 SQL and utilities Develop high-quality database solutions Ability to perform extensive analysis of complex SQL processes, with strong design skills Ability to analyze existing SQL queries for performance improvements Experience in software development phases including design, configuration, testing, debugging, implementation, and support of large-scale, business-centric and process-based applications Proven experience working with diverse teams of technical architects, business users and IT areas on all phases of the software development life cycle. Exceptional analytical and problem-solving skills Structured, methodical approach to systems development and troubleshooting Ability to ramp up quickly on a system architecture Experience in designing and developing process-based solutions or BPM (business process management) Strong written and verbal communication skills with the ability to interact with all levels of the organization. Strong interpersonal/relationship management skills. Strong time and project management skills. Familiarity with agile methodology including SCRUM team leadership. Familiarity with modern delivery practices such as continuous integration, behavior/test driven development, and specification by example. Desire to work in the application support space Passion for learning and desire to explore all areas of IT. Required Experience & Education: Minimum of 8-12 years of experience in an application development role. Bachelor’s degree equivalent in Information Technology, Business Information Systems, Technology Management, or related field of study.
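The skills above repeatedly emphasize analyzing existing SQL for performance improvements. A small, portable sketch of that workflow using SQLite's EXPLAIN QUERY PLAN (table and index names are invented for the example; Teradata and Oracle expose analogous EXPLAIN facilities):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE claims (id INTEGER PRIMARY KEY, member_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO claims (member_id, amount) VALUES (?, ?)",
                [(i % 100, i * 1.5) for i in range(1000)])

query = "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM claims WHERE member_id = 42"

# Without an index, the optimizer falls back to a full table scan.
plan_before = cur.execute(query).fetchall()[0][-1]

# A covering index on (member_id, amount) lets the same query seek instead of scan.
cur.execute("CREATE INDEX idx_claims_member ON claims (member_id, amount)")
plan_after = cur.execute(query).fetchall()[0][-1]

print(plan_before)  # e.g. a SCAN of the claims table
print(plan_after)   # e.g. a SEARCH using the covering index
```

The same discipline (read the plan, then restructure the query or add an index) carries over to Teradata EXPLAIN and Oracle's EXPLAIN PLAN, even though the plan formats differ.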
Location & Hours of Work: Hyderabad and Hybrid (1:00 PM IST to 10:00 PM IST) Equal Opportunity Statement Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations. About Evernorth Health Services Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.

Posted 6 days ago

Apply

6.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Summary Position Summary Strategy & Analytics Strategy Our Strategy practice brings together several key capabilities that will allow us to architect integrated programs that transform our clients’ businesses, including Corporate & Business Unit Strategy, Technology Strategy & Insights, Enterprise Model Design, Enterprise Cloud Strategy and Business Transformation. Strategy professionals will serve as trusted advisors to our clients, working with them to make clear data-driven choices about where to play and how to win, in order to drive growth and enterprise value. Strategy will help our clients: Identify strategies for growth and value creation Develop the appropriate business models, operating models, and capabilities to support their strategic vision Maximize the ROI on technology investments and leverage technology and Cloud trends to architect future business strategies AI&Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The Analytics & Cognitive team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. 
Analytics & Cognitive will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Abinitio Consultant The position is suited for individuals who have a demonstrated ability to work effectively in a fast-paced, high-volume, deadline-driven environment. Education And Experience Education: B.Tech/M.Tech/MCA/MS/MBA 6-9 years of experience in design and implementation of database migration and integration solutions for any Data warehousing project. Required Skills Good knowledge of DBMS concepts, SQL, and PL/SQL. Good knowledge of the Snowflake system hierarchy. Good knowledge of Snowflake schemas/tables/views/stages, etc. Should have strong hands-on experience with Ab Initio development. Should have strong problem-solving and analytical capabilities. Should have hands-on experience in the following: data validation, writing custom SQL code, managing the Snowflake account/users/roles and privileges. Should have experience integrating Ab Initio with Snowflake and AWS S3 buckets. Should have experience integrating any BI tool, like Tableau or Power BI, with Snowflake. Should have experience in fine-tuning and troubleshooting performance issues. Should be well versed in understanding design documents like HLD, LLD, etc. Should be well versed in data migration and integration concepts. Should be a self-starter in solution implementation with inputs from design documents Should have participated in different kinds of testing like Unit Testing, System Testing, User Acceptance Testing, etc.
Should have hands-on development experience with various Ab Initio components such as Rollup, Scan, Join, Partition by Key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc. Good knowledge of designing, coding, testing, debugging and documenting software, and enhancing existing components to ensure that software meets business needs. Should have participated in preparing design documents for any new development or enhancement of the data mart. Constant communication and follow-up with stakeholders. Good knowledge of developing UNIX scripts. Should have hands-on experience with databases like Teradata and SQL Server. Should have experience with Autosys. Experience in all aspects of Agile SDLC, and end-to-end participation in a project lifecycle Preferred Skills Exposure to Data Modelling concepts is desirable. Exposure to advanced Snowflake features like Data sharing/Cloning/export and import is desirable. Participation in client interactions/meetings is desirable. Participation in code-tuning is desirable. Exposure to the AWS platform is desirable. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
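Among the Ab Initio components listed, Rollup aggregates grouped records. As an illustrative, stdlib-only sketch of the same semantics (field names and the sum aggregation are invented for the example; real Rollup transforms are written in DML, not Python):

```python
from itertools import groupby
from operator import itemgetter

def rollup(records, key, agg_field):
    """Group records by key and sum agg_field -- the essence of an Ab Initio Rollup."""
    # Rollup expects key-sorted (or key-partitioned) input, so sort first.
    records = sorted(records, key=itemgetter(key))
    return {k: sum(r[agg_field] for r in group)
            for k, group in groupby(records, key=itemgetter(key))}

rows = [
    {"region": "east", "sales": 100},
    {"region": "west", "sales": 40},
    {"region": "east", "sales": 60},
]
print(rollup(rows, "region", "sales"))  # {'east': 160, 'west': 40}
```

Scan is the running-total cousin of the same idea: instead of emitting one record per group, it emits a cumulative value for every input record.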
Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300132

Posted 6 days ago

Apply

6.0 - 9.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI and Data offering leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI and Data professionals will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Abinitio Senior Consultant The position is suited for individuals who have a demonstrated ability to work effectively in a fast-paced, high-volume, deadline-driven environment. Education And Experience Education: B.Tech/M.Tech/MCA/MS/MBA 6-9 years of experience in design and implementation of database migration and integration solutions for any Data warehousing project. Required Skills Good knowledge of DBMS concepts, SQL, and PL/SQL. Good knowledge of the Snowflake system hierarchy. Good knowledge of Snowflake schemas/tables/views/stages, etc. Should have strong hands-on experience with Ab Initio development. Should have strong problem-solving and analytical capabilities. Should have hands-on experience in the following: data validation, writing custom SQL code, managing the Snowflake account/users/roles and privileges.
Should have experience integrating Ab Initio with Snowflake and AWS S3 buckets. Should have experience integrating any BI tool, like Tableau or Power BI, with Snowflake. Should have experience in fine-tuning and troubleshooting performance issues. Should be well versed in understanding design documents like HLD, LLD, etc. Should be well versed in data migration and integration concepts. Should be a self-starter in solution implementation with inputs from design documents Should have participated in different kinds of testing like Unit Testing, System Testing, User Acceptance Testing, etc. Should have hands-on development experience with various Ab Initio components such as Rollup, Scan, Join, Partition by Key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc. Good knowledge of designing, coding, testing, debugging and documenting software, and enhancing existing components to ensure that software meets business needs. Should have participated in preparing design documents for any new development or enhancement of the data mart. Constant communication and follow-up with stakeholders. Good knowledge of developing UNIX scripts. Should have hands-on experience with databases like Teradata and SQL Server. Should have experience with Autosys. Experience in all aspects of Agile SDLC, and end-to-end participation in a project lifecycle Preferred Skills Exposure to Data Modelling concepts is desirable. Exposure to advanced Snowflake features like Data sharing/Cloning/export and import is desirable. Participation in client interactions/meetings is desirable. Participation in code-tuning is desirable. Exposure to the AWS platform is desirable. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are.
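The partitioning components named above (Partition by Key, Partition by Round Robin) split a record flow across parallel partitions in two contrasting ways. A minimal sketch of the two strategies (the partition count and field names are illustrative):

```python
def partition_by_key(records, key, n):
    """Hash partitioning: the same key value always lands in the same
    partition, preserving group locality for a downstream Rollup or Join."""
    parts = [[] for _ in range(n)]
    for r in records:
        parts[hash(r[key]) % n].append(r)
    return parts

def partition_round_robin(records, n):
    """Round-robin partitioning: records are dealt out evenly regardless
    of content, balancing load across partitions."""
    parts = [[] for _ in range(n)]
    for i, r in enumerate(records):
        parts[i % n].append(r)
    return parts

rows = [{"acct": a} for a in ["x", "y", "x", "z"]]
by_key = partition_by_key(rows, "acct", 2)
rr = partition_round_robin(rows, 2)
# Under key partitioning, both "x" records share one partition;
# round robin splits the four records 2/2 regardless of account.
```

This is why a Partition by Key typically precedes a keyed Rollup or Join, while Round Robin suits stateless per-record transforms.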
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300129

Posted 6 days ago

Apply

6.0 - 9.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI and Data offering leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI and Data professionals will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Abinitio Senior Consultant The position is suited for individuals who have a demonstrated ability to work effectively in a fast-paced, high-volume, deadline-driven environment. Education And Experience Education: B.Tech/M.Tech/MCA/MS/MBA 6-9 years of experience in design and implementation of database migration and integration solutions for any Data warehousing project. Required Skills Good knowledge of DBMS concepts, SQL, and PL/SQL. Good knowledge of the Snowflake system hierarchy. Good knowledge of Snowflake schemas/tables/views/stages, etc. Should have strong hands-on experience with Ab Initio development. Should have strong problem-solving and analytical capabilities. Should have hands-on experience in the following: data validation, writing custom SQL code, managing the Snowflake account/users/roles and privileges.
Should have experience integrating Ab Initio with Snowflake and AWS S3 buckets. Should have experience integrating any BI tool, like Tableau or Power BI, with Snowflake. Should have experience in fine-tuning and troubleshooting performance issues. Should be well versed in understanding design documents like HLD, LLD, etc. Should be well versed in data migration and integration concepts. Should be a self-starter in solution implementation with inputs from design documents Should have participated in different kinds of testing like Unit Testing, System Testing, User Acceptance Testing, etc. Should have hands-on development experience with various Ab Initio components such as Rollup, Scan, Join, Partition by Key, Partition by Round Robin, Gather, Merge, Interleave, Lookup, etc. Good knowledge of designing, coding, testing, debugging and documenting software, and enhancing existing components to ensure that software meets business needs. Should have participated in preparing design documents for any new development or enhancement of the data mart. Constant communication and follow-up with stakeholders. Good knowledge of developing UNIX scripts. Should have hands-on experience with databases like Teradata and SQL Server. Should have experience with Autosys. Experience in all aspects of Agile SDLC, and end-to-end participation in a project lifecycle Preferred Skills Exposure to Data Modelling concepts is desirable. Exposure to advanced Snowflake features like Data sharing/Cloning/export and import is desirable. Participation in client interactions/meetings is desirable. Participation in code-tuning is desirable. Exposure to the AWS platform is desirable. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are.
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 300129

Posted 6 days ago

Apply

6.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Summary
Position Summary
Strategy & Analytics
Strategy
Our Strategy practice brings together several key capabilities that will allow us to architect integrated programs that transform our clients’ businesses, including Corporate & Business Unit Strategy, Technology Strategy & Insights, Enterprise Model Design, Enterprise Cloud Strategy, and Business Transformation. Strategy professionals will serve as trusted advisors to our clients, working with them to make clear data-driven choices about where to play and how to win, in order to drive growth and enterprise value. Strategy will help our clients:
Identify strategies for growth and value creation
Develop the appropriate business models, operating models, and capabilities to support their strategic vision
Maximize the ROI on technology investments and leverage technology and Cloud trends to architect future business strategies
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The Analytics & Cognitive team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
Analytics & Cognitive will work with our clients to:
Implement large-scale data ecosystems including data management, governance, and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms
Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing As-a-Service offerings for continuous insights and improvements
Ab Initio Consultant
The position is suited for individuals who have demonstrated the ability to work effectively in a fast-paced, high-volume, deadline-driven environment.
Education And Experience
Education: B.Tech/M.Tech/MCA/MS/MBA
6-9 years of experience in the design and implementation of database migration and integration solutions for any data warehousing project.
Required Skills
Good knowledge of DBMS concepts, SQL, and PL/SQL.
Good knowledge of the Snowflake system hierarchy.
Good knowledge of Snowflake schemas/tables/views/stages, etc.
Should have strong hands-on experience with Ab Initio development.
Should have strong problem-solving and analytical capabilities.
Should have hands-on experience in the following: data validation, writing custom SQL code, and managing the Snowflake account, users, roles, and privileges.
Should have experience in integrating Ab Initio with Snowflake and AWS S3 buckets.
Should have experience in integrating a BI tool such as Tableau or Power BI with Snowflake.
Should have experience in fine-tuning and troubleshooting performance issues.
Should be well versed in design documents such as HLD and LLD.
Should be well versed in data migration and integration concepts.
Should be a self-starter in solution implementation, working from design documents.
Should have participated in different kinds of testing, such as Unit Testing, System Testing, and User Acceptance Testing.
Should have hands-on development experience with various Ab Initio components such as Rollup, Scan, Join, Partition by Key, Partition by Round-robin, Gather, Merge, Interleave, and Lookup.
Able to design, code, test, debug, and document software, and enhance existing components to ensure that software meets business needs.
Should have participated in preparing design documents for new development or enhancement of the data mart.
Constant communication and follow-up with stakeholders.
Good knowledge of developing UNIX scripts.
Should have hands-on experience with databases such as Teradata and SQL Server.
Should have experience with Autosys.
Experience in all aspects of the Agile SDLC, with end-to-end participation in a project lifecycle.
Preferred Skills
Exposure to data modelling concepts is desirable.
Exposure to advanced Snowflake features such as data sharing, cloning, and export/import is desirable.
Participation in client interactions/meetings is desirable.
Participation in code tuning is desirable.
Exposure to the AWS platform is desirable.
Requisition code: 300132
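Most of the Ab Initio components named in this posting are key-based transforms. For readers unfamiliar with them, here is a minimal Python sketch of what a Rollup component does, namely aggregating records by key; the customer/amount records are purely illustrative:

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical input records, standing in for rows an Ab Initio graph
# would read from an upstream flow.
records = [
    {"customer": "A", "amount": 120.0},
    {"customer": "B", "amount": 75.5},
    {"customer": "A", "amount": 30.0},
]

# Rollup-style aggregation: sort by the key, then reduce each key group.
records.sort(key=itemgetter("customer"))
totals = {
    key: sum(r["amount"] for r in group)
    for key, group in groupby(records, key=itemgetter("customer"))
}

print(totals)  # {'A': 150.0, 'B': 75.5}
```

In an actual graph the data would first be partitioned by the same key (Partition by Key) so each partition can roll up independently; the sort-then-group step above mirrors that requirement.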

Posted 6 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Skillset Coverage (Mandatory):
Strong SQL development background
Hands-on experience in Teradata or BigQuery (at least one is mandatory)
Ability to debug code and drive fixes end-to-end into production
Familiarity with enterprise schedulers (Autosys preferred)
Exposure to Composer DAGs (preferred)
Good problem-solving and troubleshooting skills
Experience with code deployment, validation, and testing workflows
Thanks & Regards,
Prashant Awasthi
Vastika Technologies PVT LTD
9711189829
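The validation and testing workflows mentioned above often reduce to reconciling a source table against its target after a load. A minimal Python sketch of such a row-count check, using sqlite3 as a stand-in for Teradata or BigQuery (the table and column names are illustrative assumptions):

```python
import sqlite3

# In-memory database standing in for the warehouse; in practice you would
# connect with the Teradata or BigQuery client library instead.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_orders (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

def row_count(table: str) -> int:
    # Count check only; a fuller validation would also compare column
    # checksums or hashes per key.
    return cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

src, tgt = row_count("src_orders"), row_count("tgt_orders")
print("match" if src == tgt else f"mismatch: {src} vs {tgt}")  # prints "match"
```

The same count-and-compare pattern is what a scheduler such as Autosys would run as a post-load validation job before marking the load successful.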

Posted 1 week ago

Apply

Featured Companies