6.0 - 8.0 years
37 - 40 Lacs
Kochi, Hyderabad, Coimbatore
Work from Office
Key Responsibilities:
- Design and implement scalable Snowflake data warehouse solutions for structured and semi-structured data.
- Develop ETL/ELT pipelines using Informatica IICS, dbt, Matillion, Talend, Airflow, or equivalent tools.
- Optimize query performance and implement best practices for cost and efficiency.
- Work with cloud platforms (AWS, Azure, GCP) for data integration and storage.
- Implement role-based access control (RBAC), security policies, and encryption within Snowflake.
- Perform data modeling (Star Schema, Snowflake Schema, Data Vault) and warehouse design.
- Collaborate with data engineers, analysts, and business teams to ensure data consistency and availability.
- Automate Snowflake object creation, pipeline scheduling, and monitoring (see the sketch after this posting).
- Migrate existing on-premise databases (Oracle, SQL Server, Teradata, Redshift, etc.) to Snowflake.
- Implement data governance, quality checks, and observability frameworks.

Required Skills & Qualifications:
- 6+ years of experience in data engineering/warehousing, with at least 2+ years in Snowflake.
- Strong expertise in Snowflake features such as Virtual Warehouses, Streams, Tasks, Time Travel, and Cloning.
- Experience in SQL performance tuning, query optimization, and stored procedures (JavaScript UDFs/UDAFs).
- Hands-on experience with ETL/ELT tools like Informatica, dbt, Matillion, Talend, Airflow, or AWS Glue.
- Experience with Python, PySpark, or Scala for data processing.
- Knowledge of CI/CD pipelines, Git, Terraform, or Infrastructure as Code (IaC).
- Experience with semi-structured data (JSON, Parquet, Avro) and handling ingestion from APIs.
- Strong understanding of cloud platforms (AWS S3, Azure Data Lake, GCP BigQuery) and data lake architectures.
- Familiarity with BI/Analytics tools like Tableau, Power BI, Looker, or ThoughtSpot.
- Strong problem-solving skills and experience working in Agile/Scrum environments.
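A minimal sketch of the Streams-and-Tasks automation pattern named above; all object names (raw_orders, dim_orders, orders_stream, merge_orders_task, transform_wh) and credentials are hypothetical placeholders, not details from the posting:

```python
# Minimal sketch of a Snowflake Streams + Tasks CDC pipeline.
# All object names and connection credentials are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",   # placeholder
    user="your_user",         # placeholder
    password="your_password", # placeholder
)
cur = conn.cursor()

# A stream captures inserts/updates on the raw table since last consumption.
cur.execute("CREATE STREAM IF NOT EXISTS orders_stream ON TABLE raw_orders")

# A task periodically merges captured changes into the curated table.
cur.execute("""
CREATE TASK IF NOT EXISTS merge_orders_task
  WAREHOUSE = transform_wh          -- hypothetical virtual warehouse
  SCHEDULE  = '5 MINUTE'
AS
  MERGE INTO dim_orders d
  USING orders_stream s ON d.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET d.status = s.status
  WHEN NOT MATCHED THEN INSERT (order_id, status) VALUES (s.order_id, s.status)
""")
cur.execute("ALTER TASK merge_orders_task RESUME")
```

Consuming the stream inside a MERGE advances the stream offset automatically, so each scheduled run only processes new changes.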
Posted 1 month ago
8.0 - 13.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Role & responsibilities Designing and developing Teradata databases Configuring and testing Teradata systems Troubleshooting Teradata systems Liaising with Teradata support staff and other technical teams Providing training and support to end users Requirements & Skills B.Tech/BE in Computer Science or related field 5+ years of experience in Teradata development Strong experience of SQL Good understanding of data warehousing concepts Experience in using Teradata utilities Excellent problem-solving skills Preferred candidate profile Immediate Joiners Only/Open for Bangalore location
Posted 1 month ago
5.0 years
6 - 24 Lacs
Chennai, Tamil Nadu, India
On-site
Teradata SQL Developer

Operating at the intersection of Enterprise Data Warehousing and Financial Services analytics, we help blue-chip clients unlock actionable insights through high-volume Teradata platforms. Our on-premise engineering team builds high-performance SQL solutions that power regulatory reporting, customer 360 views, and real-time decision systems across APAC markets.

Role & Responsibilities:
- Design, develop, and optimize Teradata SQL scripts and BTEQ batches for large-scale data warehouses.
- Implement performance tuning strategies including statistics management, index design, and partition optimization (see the sketch after this posting).
- Build robust ETL workflows integrating Informatica/Python and Unix shell to ingest structured and semi-structured data.
- Collaborate with data modelers and analysts to translate business logic into efficient logical and physical models.
- Automate deployment, scheduling, and monitoring using Teradata Viewpoint or Control-M for 24x7 stability.
- Provide L3 production support, root cause analysis, and knowledge transfer to client stakeholders.

Skills & Qualifications
Must-Have:
- 5+ years of hands-on Teradata SQL development for enterprise data warehouses.
- Deep expertise in BTEQ, FastLoad, MultiLoad, and TPT utilities.
- Solid grasp of query plan analysis, Collect Stats, and primary/secondary index design.
- Proficiency in Unix shell scripting and at least one ETL tool such as Informatica or Talend.
- Ability to debug, profile, and optimize workloads exceeding 5 TB.
Preferred:
- Exposure to Data Vault or 3NF modeling methodologies.
- Knowledge of Python or Spark for data transformation.

Benefits & Culture:
- On-site client engagement offering direct business impact and rapid career growth.
- Learning budget for advanced Teradata certifications and cloud migration skills.
- Collaborative, merit-based environment with hackathons and internal guilds.

Work Location: India (on-site, Monday to Friday)

Skills: unix shell scripting,multiload,fastload,primary/secondary index design,tpt utilities,spark,teradata sql,python,teradata,etl,bteq,query optimization,performance tuning,query plan analysis,informatica,collect stats,data modeling,data warehousing
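A minimal sketch of the statistics-management and plan-analysis work described above, using the teradatasql Python driver; the table and column names (txn_fact, txn_date, cust_id, amount) and credentials are hypothetical placeholders:

```python
# Minimal sketch: collect optimizer statistics and inspect a query plan
# on Teradata. All names and credentials are hypothetical placeholders.
import teradatasql

with teradatasql.connect(host="tdhost", user="tduser", password="tdpass") as conn:
    cur = conn.cursor()

    # Refresh optimizer statistics on the filter/join columns.
    cur.execute(
        "COLLECT STATISTICS COLUMN (txn_date), COLUMN (cust_id) ON txn_fact"
    )

    # EXPLAIN returns the plan text, so skew, product joins, or
    # full-table scans can be spotted before the query runs.
    cur.execute(
        "EXPLAIN SELECT cust_id, SUM(amount) FROM txn_fact "
        "WHERE txn_date >= DATE '2024-01-01' GROUP BY cust_id"
    )
    for row in cur.fetchall():
        print(row[0])
```

Fresh statistics on the grouping and filter columns are what let the optimizer choose sensible join and redistribution strategies on large volumes.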
Posted 1 month ago
5.0 - 10.0 years
25 - 35 Lacs
Bengaluru
Work from Office
We are one of Australia's leading integrated media companies, with major operations in broadcast television, publishing, and digital content. We own Channel 7, The West Australian newspaper, and the streaming platform 7plus. Our portfolio includes partnerships with leading global media brands, reaching millions of Australians across various media channels.

Role - Senior Data Engineer

Responsibilities:
- Data Acquisition: Proactively design and implement processes for acquiring data from both internal systems and external data providers. Understand the various data types involved in the data lifecycle, including raw, curated, and lake data, to ensure effective data integration.
- SQL Development: Develop advanced SQL queries within database frameworks to produce semantic data layers that facilitate accurate reporting. This includes optimizing queries for performance and ensuring data quality.
- Linux Command Line: Utilize Linux command-line tools and functions, such as bash shell scripts, cron jobs, grep, and awk, to perform data processing tasks efficiently. This involves automating workflows and managing data pipelines.
- Data Protection: Ensure compliance with data protection and privacy requirements, including regulations like GDPR. This includes implementing best practices for data handling and maintaining the confidentiality of sensitive information.
- Documentation: Create and maintain clear documentation of designs and workflows using tools like Confluence and Visio, so that stakeholders can easily communicate and understand technical specifications.
- API Integration and Data Formats: Collaborate with RESTful APIs and AWS services (such as S3, Glue, and Lambda) to facilitate seamless data integration and automation. Demonstrate proficiency in parsing and working with various data formats, including CSV and Parquet, to support diverse data processing needs (see the sketch after this posting).

Key Requirements:
- 5+ years of experience as a Data Engineer, focusing on ETL development.
- 3+ years of experience in SQL and writing complex queries for data retrieval and manipulation.
- 3+ years of experience in Linux command-line and bash scripting.
- Familiarity with data modelling in analytical databases.
- Strong understanding of backend data structures, with experience collaborating with data engineers (Teradata, Databricks, AWS S3 parquet/CSV).
- Experience with RESTful APIs and AWS services like S3, Glue, and Lambda.
- Experience using Confluence for tracking documentation.
- Strong communication and collaboration skills, with the ability to interact effectively with stakeholders at all levels.
- Ability to work independently and manage multiple tasks and priorities in a dynamic environment.
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.

Good to Have:
- Experience with Spark and Databricks.
- Understanding of data visualization tools, particularly Tableau.
- Knowledge of data clean room techniques and integration methodologies.
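A minimal sketch of pulling a Parquet extract from S3 and validating it before a warehouse load, assuming boto3 and pandas (with a Parquet engine such as pyarrow) are installed; the bucket, key, and column names are hypothetical placeholders:

```python
# Minimal sketch: read a Parquet extract from S3 and run a light
# data-quality check. Bucket, key, and columns are hypothetical.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")  # assumes credentials via environment or IAM role
obj = s3.get_object(
    Bucket="media-data-lake",                     # placeholder bucket
    Key="curated/viewership/2024-06.parquet",     # placeholder key
)

df = pd.read_parquet(io.BytesIO(obj["Body"].read()))

# Basic validation before handing off to the warehouse load step.
assert not df["viewer_id"].isna().any(), "viewer_id must be populated"
print(df.dtypes)
```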
Posted 1 month ago
5.0 - 10.0 years
30 - 35 Lacs
Bengaluru
Work from Office
Role & responsibilities 5+ years of experience as a Data Engineer , focusing on ETL development. 3+ years of experience in SQL and writing complex queries for data retrieval and manipulation. 3+ years of experience in Linux command-line and bash scripting. Familiarity with data modelling in analytical databases. Strong understanding of backend data structures, with experience collaborating with data engineers ( Teradata, Databricks, AWS S3 parquet/CSV ). Experience with RESTful APIs and AWS services like S3, Glue, and Lambda Experience using Confluence for tracking documentation. Strong communication and collaboration skills, with the ability to interact effectively with stakeholders at all levels. Ability to work independently and manage multiple tasks and priorities in a dynamic environment. Bachelors degree in Computer Science, Engineering, Information Technology, or a related field.
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description

Key Responsibilities:
- Develop, optimize, and maintain complex SQL queries, stored procedures, functions, and views.
- Analyze slow-performing queries and optimize execution plans to improve database performance.
- Design and implement indexing strategies to enhance query efficiency.
- Work with developers to optimize database interactions in applications.
- Develop and implement Teradata best practices for large-scale data processing and ETL workflows.
- Monitor and troubleshoot Teradata performance issues using tools like DBQL (Database Query Log), Viewpoint, and Explain Plan analysis (see the sketch after this posting).
- Perform data modeling, normalization, and schema design improvements.
- Collaborate with teams to implement best practices for database tuning and performance enhancement.
- Automate repetitive database tasks using scripts and scheduled jobs.
- Document database architecture, queries, and optimization techniques.

Required Skills & Qualifications:
- Strong proficiency in Teradata SQL, including query optimization techniques.
- Strong proficiency in SQL (T-SQL, PL/SQL, or equivalent).
- Experience with indexing strategies, partitioning, and caching techniques.
- Knowledge of database normalization, denormalization, and best practices.
- Familiarity with ETL processes, data warehousing, and large datasets.
- Experience in writing and optimizing stored procedures, triggers, and functions.
- Hands-on experience in Teradata performance tuning, indexing, partitioning, and statistics collection.
- Experience with EXPLAIN plans, DBQL analysis, and Teradata Viewpoint monitoring.
- Power BI / Tableau integration experience (good to have).

About Us
Bristlecone is the leading provider of AI-powered application transformation services for the connected supply chain. We empower our customers with speed, visibility, automation, and resiliency – to thrive on change. Our transformative solutions in Digital Logistics, Cognitive Manufacturing, Autonomous Planning, Smart Procurement and Digitalization are positioned around key industry pillars and delivered through a comprehensive portfolio of services spanning digital strategy, design and build, and implementation across a range of technology platforms. Bristlecone is ranked among the top ten leaders in supply chain services by Gartner. We are headquartered in San Jose, California, with locations across North America, Europe and Asia, and over 2,500 consultants. Bristlecone is part of the $19.4 billion Mahindra Group.

Equal Opportunity Employer
Bristlecone is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status.

Information Security Responsibilities:
- Understand and adhere to Information Security policies, guidelines and procedures, and practice them for the protection of organizational data and information systems.
- Take part in information security training and act accordingly while handling information.
- Report all suspected security and policy breaches to the InfoSec team or appropriate authority (CISO).
- Understand and adhere to the additional information security responsibilities that are part of the assigned job role.
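A minimal sketch of the DBQL-based monitoring mentioned above, assuming DBQL logging is enabled and the user can read the DBC.DBQLogTbl view; connection details are placeholders:

```python
# Minimal sketch: surface recent high-CPU queries from Teradata's query
# log (DBQL). Assumes DBQL logging is enabled and read access to
# DBC.DBQLogTbl; credentials are hypothetical placeholders.
import teradatasql

SQL = """
SELECT TOP 10 QueryID, AMPCPUTime, TotalIOCount, QueryText
FROM DBC.DBQLogTbl
WHERE CollectTimeStamp > CURRENT_TIMESTAMP - INTERVAL '1' DAY
ORDER BY AMPCPUTime DESC
"""

with teradatasql.connect(host="tdhost", user="tduser", password="tdpass") as conn:
    cur = conn.cursor()
    cur.execute(SQL)
    for qid, cpu, io, text in cur.fetchall():
        print(qid, cpu, io, (text or "")[:80])
```

Queries surfaced this way are the natural candidates for the EXPLAIN-plan and statistics work the posting describes.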
Posted 1 month ago
3.0 years
6 - 27 Lacs
Delhi, India
On-site
About The Opportunity
A fast-growing player in the Data & Analytics consulting sector, we build cloud-native data platforms and real-time reporting solutions for enterprises in retail, BFSI, and healthcare. Leveraging Google Cloud’s advanced analytics stack, we turn high-volume data into actionable insights that accelerate digital transformation and revenue growth.

Role & Responsibilities:
- Design, develop, and optimize BigQuery data warehouses for petabyte-scale analytics.
- Build ingestion pipelines using Dataflow, Pub/Sub, and Cloud Storage to ensure reliable, low-latency data availability.
- Implement ELT/ETL workflows in Python and SQL, applying best practices for partitioning, clustering, and cost control (see the sketch after this posting).
- Create and orchestrate DAGs in Cloud Composer/Airflow to automate data processing and quality checks.
- Collaborate with analysts and business stakeholders to model datasets, define SLAs, and deliver high-impact dashboards.
- Harden production environments with CI/CD, Terraform, monitoring, and automated testing for zero-defect releases.

Skills & Qualifications
Must-Have:
- 3+ years building data pipelines on Google Cloud Platform.
- Expert hands-on experience with BigQuery optimisation and SQL performance tuning.
- Proficiency in Python scripting for data engineering tasks.
- Deep understanding of Dataflow or Apache Beam streaming and batch paradigms.
- Solid grasp of data warehousing principles, partitioning, and metadata management.
- Version control, containerisation, and CI/CD exposure (Git, Docker, Cloud Build).
Preferred:
- Terraform/IaC for infrastructure provisioning.
- Experience migrating on-prem warehouses (Teradata, Netezza) to BigQuery.
- Knowledge of data governance, DLP, and security best practices in GCP.

Benefits & Culture Highlights:
- Industry-leading GCP certifications paid and supported.
- Product-grade engineering culture with peer mentorship and hackathons.
- On-site, collaboration-rich workplace designed for learning and innovation.

Skills: apache beam,ci/cd,sql,python,git,cloud storage,dataflow,docker,bigquery,airflow,cloud build,terraform,data warehousing,gcp data engineer (bigquery)
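A minimal sketch of the partitioning/clustering cost-control practice named above, using the google-cloud-bigquery client; the project, dataset, and table names are hypothetical placeholders:

```python
# Minimal sketch: create a partitioned + clustered BigQuery table and
# run a cost-aware query. Project/dataset/table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials

client.query("""
CREATE TABLE IF NOT EXISTS `my-project.analytics.events`
(
  event_ts  TIMESTAMP,
  user_id   STRING,
  event     STRING
)
PARTITION BY DATE(event_ts)   -- prunes scanned bytes to the days queried
CLUSTER BY user_id            -- co-locates rows for selective filters
""").result()

# Filtering on the partition column keeps the scan (and the bill) small.
job = client.query("""
SELECT user_id, COUNT(*) AS events
FROM `my-project.analytics.events`
WHERE DATE(event_ts) = CURRENT_DATE()
GROUP BY user_id
""")
for row in job.result():
    print(row.user_id, row.events)
```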
Posted 1 month ago
3.0 years
6 - 27 Lacs
Chennai, Tamil Nadu, India
On-site
About The Opportunity
A fast-growing player in the Data & Analytics consulting sector, we build cloud-native data platforms and real-time reporting solutions for enterprises in retail, BFSI, and healthcare. Leveraging Google Cloud’s advanced analytics stack, we turn high-volume data into actionable insights that accelerate digital transformation and revenue growth.

Role & Responsibilities:
- Design, develop, and optimize BigQuery data warehouses for petabyte-scale analytics.
- Build ingestion pipelines using Dataflow, Pub/Sub, and Cloud Storage to ensure reliable, low-latency data availability.
- Implement ELT/ETL workflows in Python and SQL, applying best practices for partitioning, clustering, and cost control.
- Create and orchestrate DAGs in Cloud Composer/Airflow to automate data processing and quality checks.
- Collaborate with analysts and business stakeholders to model datasets, define SLAs, and deliver high-impact dashboards.
- Harden production environments with CI/CD, Terraform, monitoring, and automated testing for zero-defect releases.

Skills & Qualifications
Must-Have:
- 3+ years building data pipelines on Google Cloud Platform.
- Expert hands-on experience with BigQuery optimisation and SQL performance tuning.
- Proficiency in Python scripting for data engineering tasks.
- Deep understanding of Dataflow or Apache Beam streaming and batch paradigms.
- Solid grasp of data warehousing principles, partitioning, and metadata management.
- Version control, containerisation, and CI/CD exposure (Git, Docker, Cloud Build).
Preferred:
- Terraform/IaC for infrastructure provisioning.
- Experience migrating on-prem warehouses (Teradata, Netezza) to BigQuery.
- Knowledge of data governance, DLP, and security best practices in GCP.

Benefits & Culture Highlights:
- Industry-leading GCP certifications paid and supported.
- Product-grade engineering culture with peer mentorship and hackathons.
- On-site, collaboration-rich workplace designed for learning and innovation.

Skills: apache beam,ci/cd,sql,python,git,cloud storage,dataflow,docker,bigquery,airflow,cloud build,terraform,data warehousing,gcp data engineer (bigquery)
Posted 1 month ago
3.0 years
6 - 27 Lacs
Hyderabad, Telangana, India
On-site
About The Opportunity
A fast-growing player in the Data & Analytics consulting sector, we build cloud-native data platforms and real-time reporting solutions for enterprises in retail, BFSI, and healthcare. Leveraging Google Cloud’s advanced analytics stack, we turn high-volume data into actionable insights that accelerate digital transformation and revenue growth.

Role & Responsibilities:
- Design, develop, and optimize BigQuery data warehouses for petabyte-scale analytics.
- Build ingestion pipelines using Dataflow, Pub/Sub, and Cloud Storage to ensure reliable, low-latency data availability.
- Implement ELT/ETL workflows in Python and SQL, applying best practices for partitioning, clustering, and cost control.
- Create and orchestrate DAGs in Cloud Composer/Airflow to automate data processing and quality checks.
- Collaborate with analysts and business stakeholders to model datasets, define SLAs, and deliver high-impact dashboards.
- Harden production environments with CI/CD, Terraform, monitoring, and automated testing for zero-defect releases.

Skills & Qualifications
Must-Have:
- 3+ years building data pipelines on Google Cloud Platform.
- Expert hands-on experience with BigQuery optimisation and SQL performance tuning.
- Proficiency in Python scripting for data engineering tasks.
- Deep understanding of Dataflow or Apache Beam streaming and batch paradigms.
- Solid grasp of data warehousing principles, partitioning, and metadata management.
- Version control, containerisation, and CI/CD exposure (Git, Docker, Cloud Build).
Preferred:
- Terraform/IaC for infrastructure provisioning.
- Experience migrating on-prem warehouses (Teradata, Netezza) to BigQuery.
- Knowledge of data governance, DLP, and security best practices in GCP.

Benefits & Culture Highlights:
- Industry-leading GCP certifications paid and supported.
- Product-grade engineering culture with peer mentorship and hackathons.
- On-site, collaboration-rich workplace designed for learning and innovation.

Skills: apache beam,ci/cd,sql,python,git,cloud storage,dataflow,docker,bigquery,airflow,cloud build,terraform,data warehousing,gcp data engineer (bigquery)
Posted 1 month ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together.

Description
United's Digital Technology team is comprised of many talented individuals all working together with cutting-edge technology to build the best airline in the history of aviation. Our team designs, develops and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions.

Job Overview and Responsibilities
United Airlines’ Enterprise Data Analytics department partners with business and technology leaders across the company to transform data analytics into a competitive advantage. An offshore team based in Delhi, India will work closely with this group and support it with complementary skills and capabilities. The key objectives are to improve operating performance, boost customer experience and drive incremental revenue by embedding data in decision making across all levels of the organization. The team is currently looking for a leader who has a passion for data and analytics, the willingness to dig deep into details, and the ability to assess the big picture. Developing and maintaining strong relationships with key stakeholders in the US, as well as training and retaining key talent within the offshore team, are keys to success in this role. This role will require strategic thinking and strong client focus.

- Manage a team of data analysts by guiding them on modeling techniques, approaches and methodologies
- Execute solutions to business problems using data analysis, data mining, optimization tools, statistical modeling and machine learning techniques
- Continuously develop and demonstrate improved analysis methodologies
- Ensure alignment and prioritization with business objectives and initiatives – help business owners make faster, smarter decisions
- Create and develop presentations for United leadership and external stakeholders
- Encourage development and sharing of internal best practices and foster collaboration with internal and external teams

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc.
Qualifications

What’s needed to succeed (Minimum Qualifications):
- Bachelor's degree required
- At least 6+ years of experience in analytics required
- At least 2+ years of experience in a supervisory role
- Experienced in manipulating and analyzing complex, high-volume, high-dimensionality data from various sources to highlight patterns and relationships
- Proven comfort and an intellectual curiosity for working with very large sets of data, pulling in relevant team members to address identified – and sometimes undiscovered – issues
- Able to communicate complex quantitative analysis and algorithms in a clear, precise and actionable manner
- Adept at juggling several projects and initiatives simultaneously through appropriate prioritization
- Proficient in using database querying tools and able to write complex queries and procedures using Teradata SQL and/or Microsoft TSQL
- Familiar with one or more reporting tools – Spotfire / Slate
- Must be legally authorized to work in India for any employer without sponsorship
- Must be fluent in English (written and spoken)
- Successful completion of interview required to meet job qualification
- Reliable, punctual attendance is an essential function of the position

What will help you propel from the pack (Preferred Qualifications):
- Master's Degree in a quantitative field like Math, Statistics and/or an MBA preferred
- Hands-on experience in setting up and using Big Data ecosystems like Hadoop/Spark
- Extensive knowledge of predictive modeling, test design and database querying
- Strong knowledge of either R or Python
- Basic programming skills for web scraping and experience working with non-structured data will be a plus
- Deep technical experience in distributed computing, machine learning, and statistics related work

GGN00002100
Posted 1 month ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together.

Description
United's Digital Technology team is comprised of many talented individuals all working together with cutting-edge technology to build the best airline in the history of aviation. Our team designs, develops and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions.

Job Overview and Responsibilities
United Airlines’ Enterprise Data Analytics department partners with business and technology leaders across the company to transform data analytics into a competitive advantage. An offshore team based in Delhi, India will work closely with this group and support it with complementary skills and capabilities. The key objectives are to improve operating performance, boost customer experience and drive incremental revenue by embedding data in decision making across all levels of the organization. The team is currently looking for a leader who has a passion for data and analytics, the willingness to dig deep into details, and the ability to assess the big picture. Developing and maintaining strong relationships with key stakeholders in the US, as well as training and retaining key talent within the offshore team, are keys to success in this role. This role will require strategic thinking and strong client focus.

High-level responsibilities of the role include:
- Execute solutions to business problems using data analysis, data mining, optimization tools, statistical modeling and machine learning techniques
- Continuously develop and demonstrate improved analysis methodologies
- Ensure alignment and prioritization with business objectives and initiatives – help business owners make faster, smarter decisions
- Share internal best practices and foster collaboration with internal and external teams
- Create and develop presentations for United leadership and external stakeholders

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc.
Qualifications

What’s needed to succeed (Minimum Qualifications):
- Bachelor's degree required
- At least 2+ years of experience in analytics required
- Proven comfort and an intellectual curiosity for working with very large sets of data, pulling in relevant team members to address identified – and sometimes undiscovered – issues
- Strong knowledge of either R or Python
- Proficient in using database querying tools and able to write complex queries and procedures using Teradata SQL and/or Microsoft TSQL
- Experienced in manipulating and analyzing complex, high-volume, high-dimensionality data from various sources to highlight patterns and relationships
- Familiar with one or more reporting tools – Spotfire / Tableau
- Able to communicate complex quantitative analysis and algorithms in a clear, precise and actionable manner
- Written and spoken English fluency
- Must be legally authorized to work in India for any employer without sponsorship
- Successful completion of interview required to meet job qualification
- Reliable, punctual attendance is an essential function of the position

What will help you propel from the pack (Preferred Qualifications):
- Master's Degree in a quantitative field like Math, Statistics and/or an MBA
- Hands-on experience with Big Data products will be a big plus
- Basic programming skills for web scraping and experience working with non-structured data will be a plus

GGN00002102
Posted 1 month ago
3.0 years
6 - 27 Lacs
Pune, Maharashtra, India
On-site
About The Opportunity
A fast-growing player in the Data & Analytics consulting sector, we build cloud-native data platforms and real-time reporting solutions for enterprises in retail, BFSI, and healthcare. Leveraging Google Cloud’s advanced analytics stack, we turn high-volume data into actionable insights that accelerate digital transformation and revenue growth.

Role & Responsibilities:
- Design, develop, and optimize BigQuery data warehouses for petabyte-scale analytics.
- Build ingestion pipelines using Dataflow, Pub/Sub, and Cloud Storage to ensure reliable, low-latency data availability.
- Implement ELT/ETL workflows in Python and SQL, applying best practices for partitioning, clustering, and cost control.
- Create and orchestrate DAGs in Cloud Composer/Airflow to automate data processing and quality checks.
- Collaborate with analysts and business stakeholders to model datasets, define SLAs, and deliver high-impact dashboards.
- Harden production environments with CI/CD, Terraform, monitoring, and automated testing for zero-defect releases.

Skills & Qualifications
Must-Have:
- 3+ years building data pipelines on Google Cloud Platform.
- Expert hands-on experience with BigQuery optimisation and SQL performance tuning.
- Proficiency in Python scripting for data engineering tasks.
- Deep understanding of Dataflow or Apache Beam streaming and batch paradigms.
- Solid grasp of data warehousing principles, partitioning, and metadata management.
- Version control, containerisation, and CI/CD exposure (Git, Docker, Cloud Build).
Preferred:
- Terraform/IaC for infrastructure provisioning.
- Experience migrating on-prem warehouses (Teradata, Netezza) to BigQuery.
- Knowledge of data governance, DLP, and security best practices in GCP.

Benefits & Culture Highlights:
- Industry-leading GCP certifications paid and supported.
- Product-grade engineering culture with peer mentorship and hackathons.
- On-site, collaboration-rich workplace designed for learning and innovation.

Skills: apache beam,ci/cd,sql,python,git,cloud storage,dataflow,docker,bigquery,airflow,cloud build,terraform,data warehousing,gcp data engineer (bigquery)
Posted 1 month ago
0.0 years
6 - 9 Lacs
Hyderābād
On-site
Our vision is to transform how the world uses information to enrich life for all. Micron Technology is a world leader in innovating memory and storage solutions that accelerate the transformation of information into intelligence, inspiring the world to learn, communicate and advance faster than ever.

Responsibilities and Tasks:

Understand the Business Problem and the Relevant Data:
- Maintain an intimate understanding of company and department strategy
- Translate analysis requirements into data requirements
- Identify and understand the data sources that are relevant to the business problem
- Develop conceptual models that capture the relationships within the data
- Define the data-quality objectives for the solution
- Be a subject matter expert in data sources and reporting options

Architect Data Management Systems:
- Design and implement optimum data structures in the appropriate data management system (Hadoop, Teradata, SQL Server, etc.) to satisfy the data requirements
- Plan methods for archiving/deletion of information

Develop, Automate, and Orchestrate an Ecosystem of ETL Processes for Varying Volumes of Data (see the sketch after this posting):
- Identify and select the optimum methods of access for each data source (real-time/streaming, delayed, static)
- Determine transformation requirements and develop processes to bring structured and unstructured data from the source to a new physical data model
- Develop processes to efficiently load the transformed data into the data management system

Prepare Data to Meet Analysis Requirements:
- Work with data scientists to implement strategies for cleaning and preparing data for analysis (e.g., outliers, missing data, etc.)
- Develop and code data extracts
- Follow standard methodologies to ensure data quality and data integrity
- Ensure that the data is fit for use in data science applications

Qualifications and Experience:
- 0-7 years of experience developing, delivering, and/or supporting data engineering, advanced analytics or business intelligence solutions
- Ability to work with multiple operating systems (e.g., MS Office, Unix, Linux, etc.)
- Experienced in developing ETL/ELT processes using Apache NiFi and Snowflake
- Significant experience with big data processing and/or developing applications and data sources via Hadoop, YARN, Hive, Pig, Sqoop, MapReduce, HBase, Flume, etc.
- Understanding of how distributed systems work
- Familiarity with software architecture (data structures, data schemas, etc.)
- Strong working knowledge of databases (Oracle, MSSQL, etc.) including SQL and NoSQL
- Strong mathematics background; analytical, problem-solving, and organizational skills
- Strong communication skills (written, verbal and presentation)
- Experience working in a global, multi-functional environment
- Minimum of 2 years’ experience in any of the following: at least one high-level client, object-oriented language (e.g., C#, C++, Java, Python, Perl, etc.); one or more web programming languages (PHP, MySQL, Python, Perl, JavaScript, ASP, etc.); one or more data extraction tools (SSIS, Informatica, etc.); software development
- Ability to travel as needed

Education: B.S. degree in Computer Science, Software Engineering, Electrical Engineering, Applied Mathematics or a related field of study. M.S. degree preferred.

About Micron Technology, Inc.
We are an industry leader in innovative memory and storage solutions transforming how the world uses information to enrich life for all.
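A minimal sketch of a Spark-based ETL step of the kind described above; the paths and column names (sensor_readings, tool_id, reading, event_time) are hypothetical placeholders, and a Hadoop-backed cluster is assumed:

```python
# Minimal sketch: cleanse raw CSV landed on HDFS and write a partitioned
# Parquet layer for analytics. Paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sensor_etl").getOrCreate()

raw = spark.read.option("header", True).csv("hdfs:///landing/sensor_readings/")

clean = (
    raw.withColumn("reading", F.col("reading").cast("double"))
       .dropna(subset=["tool_id", "reading"])       # basic data-quality gate
       .withColumn("read_date", F.to_date("event_time"))
)

# Partitioning by date keeps downstream scans cheap and loads idempotent.
clean.write.mode("overwrite").partitionBy("read_date") \
     .parquet("hdfs:///curated/sensor_readings/")
```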
With a relentless focus on our customers, technology leadership, and manufacturing and operational excellence, Micron delivers a rich portfolio of high-performance DRAM, NAND, and NOR memory and storage products through our Micron® and Crucial® brands. Every day, the innovations that our people create fuel the data economy, enabling advances in artificial intelligence and 5G applications that unleash opportunities — from the data center to the intelligent edge and across the client and mobile user experience. To learn more, please visit micron.com/careers

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. To request assistance with the application process and/or for reasonable accommodations, please contact hrsupport_india@micron.com

Micron prohibits the use of child labor and complies with all applicable laws, rules, regulations, and other international and industry labor standards. Micron does not charge candidates any recruitment fees or unlawfully collect any other payment from candidates as consideration for their employment with Micron.

AI alert: Candidates are encouraged to use AI tools to enhance their resume and/or application materials. However, all information provided must be accurate and reflect the candidate's true skills and experiences. Misuse of AI to fabricate or misrepresent qualifications will result in immediate disqualification.

Fraud alert: Micron advises job seekers to be cautious of unsolicited job offers and to verify the authenticity of any communication claiming to be from Micron by checking the official Micron careers website.
Posted 1 month ago
4.0 years
3 - 4 Lacs
Hyderābād
Remote
Overview:
As an Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse while satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities:
- Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies.
- Govern data design/modeling – documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- & off-shore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security feature implementations performed by data engineers to drive data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create Source To Target Mappings for ETL and BI developers (see the sketch after this posting).
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in-transit.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
- Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis and productization.

Qualifications:
- 4+ years of overall technology experience that includes at least 2+ years of data modeling and systems architecture.
- Around 2+ years of experience with Data Lake infrastructure, Data Warehousing, and Data Analytics tools.
- 2+ years of experience developing enterprise data models.
- Experience in building solutions in the retail or supply chain space.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience with integration of multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with Azure Data Factory, Databricks and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).
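A minimal sketch of a Source-To-Target Mapping deliverable expressed in code; the source and target names, and the transformation rules, are hypothetical placeholders rather than PepsiCo artifacts:

```python
# Minimal sketch: represent a Source-To-Target Mapping (STM) and generate
# the starting SELECT an ETL developer would use. All names hypothetical.
from dataclasses import dataclass

@dataclass
class ColumnMap:
    source: str      # source column or expression
    target: str      # target column name
    rule: str = ""   # transformation/business rule, for documentation

stm = [
    ColumnMap("s.ord_id",          "order_key"),
    ColumnMap("UPPER(s.cntry_cd)", "country_code", "standardize to ISO-2"),
    ColumnMap("s.ord_amt * 1.0",   "order_amount", "cast to decimal"),
]

select_list = ",\n  ".join(f"{c.source} AS {c.target}" for c in stm)
sql = f"SELECT\n  {select_list}\nFROM src_sales.orders s"  # hypothetical source
print(sql)
```

Keeping the mapping in a structured form (rather than a spreadsheet alone) makes it reviewable in version control and usable by both ETL and BI developers.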
Posted 1 month ago
4.0 years
2 - 5 Lacs
Hyderābād
On-site
About this role:
Wells Fargo is seeking a Senior Software Engineer. We believe in the power of working together because great ideas can come from anyone. Through collaboration, any employee can have an impact and make a difference for the entire company. Explore opportunities with us for a career in a supportive environment where you can learn and grow.

In this role, you will:
- Lead moderately complex initiatives and deliverables within technical domain environments
- Contribute to large-scale planning of strategies
- Design, code, test, debug, and document for projects and programs associated with the technology domain, including upgrades and deployments
- Review moderately complex technical challenges that require an in-depth evaluation of technologies and procedures
- Resolve moderately complex issues and lead a team to meet existing client needs or potential new client needs while leveraging a solid understanding of the function, policies, procedures, or compliance requirements
- Collaborate and consult with peers, colleagues, and mid-level managers to resolve technical challenges and achieve goals
- Lead projects and act as an escalation point, providing guidance and direction to less experienced staff

Required Qualifications:
- 4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training or education.

Desired Qualifications:
- Analyze, code, test, debug, and document enhancements to data warehouse applications.
- Responsible for the implementation of data flows to connect information security data sources for analytics and business intelligence systems.
- Work within an agile team model to deliver application enhancement releases.
- Self-assign stories from a product backlog and collaborate with the Scrum Master / Kanban Lead, Product Owners, Engineers, and User Acceptance Testers to deliver on business stories.
- Relevant, solid experience in relational databases, querying, data warehousing, ETL processes, requirements gathering and/or decision support tools.
- Hands-on experience in SQL, especially in SQL Server and Teradata environments.
- Hands-on experience in ETL development (Ab Initio).
- Work with a variety of data ingestion patterns (NDM files, direct DB connections and APIs) to receive data in the warehouse.
- Collaborate with peers, colleagues, and managers to resolve technical challenges and achieve goals.
- Ability to provide adequate support for resolution of production issues.
- 3+ years of ETL development experience (Ab Initio)
- 2+ years of Teradata experience
- 2+ years of experience with SQL
- Experience designing and optimizing complex SQL and/or SAS queries
- Experience working in operational reporting
- Experience with Agile Scrum (Daily Standup, Planning and Retrospective meetings) and Kanban
- 2+ years of experience with modern software engineering technologies and tool sets

Posting End Date: 6 Jul 2025
*Job posting may come down early due to volume of applicants.

We Value Equal Opportunity
Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic.
Employees support our focus on building strong customer relationships balanced with a strong risk-mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants with Disabilities: To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy: Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment and Hiring Requirements:
a. Third-Party recordings are prohibited unless authorized by Wells Fargo.
b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.
Posted 1 month ago
3.0 years
6 - 10 Lacs
Hyderābād
On-site
Our company:
At Teradata, we believe that people thrive when empowered with better information. That’s why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers’ customers—to make better, more confident decisions. The world’s top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What You’ll Do:
- Develop and maintain fully automated tests to validate the Teradata Vantage product across various platforms, components, and functions, mimicking customer-like scenarios and workloads (see the sketch after this posting).
- Develop tests to reproduce and verify customer-reported issues and fixes.
- Identify test escapes and develop test scenarios to address gaps.
- Perform customer-centric validation of Vantage products at an integration level to ensure system integration, reliability, and resiliency.
- Independently analyze and report defects/issues encountered during testing.

Who You’ll Work With:
You will be part of the integration test team, focusing on strengthening product quality before release to customers. Our team comprises brilliant minds with deep expertise in the product and platforms from a customer perspective. Acting as a strong quality gate, we collaborate closely with various Product teams and Customer Experience teams. You will report to the Senior Manager, Software Engineering.

What Makes You a Qualified Candidate:
- 3+ years of industry experience in validating core database and analytics capabilities, preferably in integration and acceptance test phases
- Graduate or postgraduate degree in Computer Science or Electronics with knowledge of database concepts and SQL
- Experience in customer-centric testing on AWS, Azure, GCP or VMware platforms to validate a system’s reliability and resiliency
- Experience with Python, OOP concepts, and Unix/Linux, including system administration
- Knowledge of the CI/CD tool chain - Jenkins, GitHub, etc.
- Familiarity with AI/LLM and analytics
- Strong debugging skills and strong oral and written communication skills
- Ability to learn new technologies and tools quickly and to leverage that knowledge for results analysis and problem solving
- Strong understanding of test processes

What You’ll Bring:
- Experience in developing customer-centric automated test cases, preferably by leveraging AI capabilities
- Experience with test automation frameworks and CI/CD pipelines
- Experience in integration testing - analyzing test results, debugging, and cross-team collaboration to identify product defects
- Knowledge of software engineering practices and metrics
- Ability to impact and influence without authority
- Ability to follow documented specifications and plans

#LI-VB1
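A minimal sketch of an automated validation test of the kind described above, using pytest with the teradatasql driver; the connection details and the table under test (sales_agg) are hypothetical placeholders:

```python
# Minimal sketch: automated data-validation tests against a Teradata
# system. Credentials and table names are hypothetical placeholders.
import pytest
import teradatasql

@pytest.fixture(scope="module")
def conn():
    with teradatasql.connect(host="tdhost", user="qa_user", password="qa_pass") as c:
        yield c

def test_no_duplicate_keys(conn):
    cur = conn.cursor()
    cur.execute("""
        SELECT sale_id, COUNT(*)
        FROM sales_agg
        GROUP BY sale_id
        HAVING COUNT(*) > 1
    """)
    assert cur.fetchall() == [], "primary key sale_id must be unique"

def test_row_count_positive(conn):
    cur = conn.cursor()
    cur.execute("SELECT COUNT(*) FROM sales_agg")
    assert cur.fetchone()[0] > 0, "table must not be empty after load"
```

Tests like these slot directly into a Jenkins/GitHub CI pipeline, which is how a suite of them becomes the "quality gate" the posting describes.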
Posted 1 month ago
2.0 - 3.0 years
6 - 7 Lacs
Hyderābād
On-site
AI Engineer: Shape the Future of Autonomous Intelligence

About Teradata
At Teradata, we don’t just manage data—we unlock its full potential. Our ClearScape Analytics™ platform and pioneering Enterprise Vector Store empower the world’s largest organizations to extract transformative value from their most complex data. We’re at the forefront of innovation in Artificial Intelligence, especially in the dynamic space of autonomous and agentic systems. Ready to learn from industry leaders and make a real impact?

The Opportunity: Dive into Enterprise Agentic AI
Are you a curious, motivated individual passionate about the next frontier of AI? Have you developed intelligent systems that can reason, learn, and act autonomously? Join us as an AI Engineer and help shape the future of enterprise intelligence using agentic AI. You’ll collaborate with other experienced AI engineers and data scientists on cutting-edge projects that redefine how businesses harness data. This is a hands-on opportunity to apply your expertise and contribute to the development of intelligent agents and agentic AI workflows that drive insights and automate complex engineering flows.

What You’ll Do:
- Improve Engineering Efficiency: Collaborate with a passionate team of engineers to explore diverse engineering processes and identify opportunities to enhance efficiency using agentic AI.
- Build Agentic Systems: Contribute to the design, implementation, and testing of components for robust AI agents and multi-agent orchestration (see the sketch after this posting).
- Leverage Vector Stores: Work with Teradata’s Enterprise Vector Store to develop intelligent retrieval-augmented generation (RAG) pipelines.
- Work with Real-World Data: Gain experience processing large-scale, complex datasets within the Teradata ecosystem.
- Research & Prototype: Engage with the latest research in agentic AI, prompt engineering, and autonomous systems to prototype innovative ideas.
- System Integration: Help integrate LLM-based agents with retrieval tools, structured/unstructured inputs, and downstream Teradata products.

Who You Are:
- 2-3 years of experience in developing solutions leveraging LLMs for complex business processes.
- Hands-on experience in developing and deploying agentic AI systems that automate and optimize manual engineering workflows by leveraging orchestration frameworks or multi-step reasoning workflows.
- Master’s or Ph.D. in Data Science, Artificial Intelligence, or a related field.
- Hands-on experience with LLM APIs (e.g., OpenAI, Claude, Gemini) and agent toolkits (e.g., AgentBuilder, AutoGen, LangGraph, CrewAI).
- Understanding of chain-of-thought reasoning, prompt tuning, or context window management.
- Knowledge of evaluation metrics for agent performance, latency, and reliability.
- Hands-on with Python and cloud automation.
- Experience in developing full-stack applications is a plus.
- Passionate about innovation and advancing the state of AI.
- Curious about text, data, workflows, and multimodal reasoning.

Why Join Teradata?
- Real-World Impact: Work on meaningful projects that address complex enterprise challenges.
- Innovative Technology: Gain deeper experience with agentic AI, generative AI, and large-scale data platforms.
- Mentorship & Development: Learn from experienced professionals in AI, machine learning, and data engineering.
- Collaborative Culture: Join a supportive, inclusive team that values creativity and continuous learning.
- Career Progression: Build a strong portfolio and skill set for a future in advanced AI.

#LI-VB1
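A minimal sketch of a single-tool agent loop using the OpenAI Python SDK (v1+); the model name, tool, prompt, and SQL backend stub are hypothetical placeholders, and error handling is omitted for brevity:

```python
# Minimal sketch: one tool-calling round trip with an LLM. Assumes
# OPENAI_API_KEY is set; model/tool/table names are hypothetical, and
# the model is assumed to elect to call the tool.
import json
from openai import OpenAI

client = OpenAI()

def run_sql_stub(query: str) -> str:
    # Placeholder for a real execution backend (e.g., a Teradata connection).
    return json.dumps({"rows": 42, "query": query})

tools = [{
    "type": "function",
    "function": {
        "name": "run_sql",
        "description": "Run a read-only SQL query and return the result",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}]

messages = [{"role": "user", "content": "How many rows are in sales_agg?"}]
resp = client.chat.completions.create(
    model="gpt-4o-mini", messages=messages, tools=tools
)
call = resp.choices[0].message.tool_calls[0]

# Execute the requested tool, feed the result back, get the final answer.
result = run_sql_stub(json.loads(call.function.arguments)["query"])
messages += [resp.choices[0].message,
             {"role": "tool", "tool_call_id": call.id, "content": result}]
final = client.chat.completions.create(
    model="gpt-4o-mini", messages=messages, tools=tools
)
print(final.choices[0].message.content)
```

Production agent frameworks (AutoGen, LangGraph, CrewAI) wrap this same request-tool-respond cycle in loops with planning, memory, and multi-agent orchestration.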
Posted 1 month ago
0.0 - 1.0 years
0 Lacs
Hyderābād
On-site
Accounting Specialist – Teradata

What You’ll Do:
The Accounting Specialist will:
- Ensure the monthly accounts for your assigned country are submitted accurately and on time
- Ensure all adjustments are submitted and processed in a timely manner
- Ensure all subledgers are interfaced correctly with the General Ledger
- Research and resolve accounting issues for your assigned country
- Perform account reconciliation for your assigned country

Who You’ll Work With:
As an accountant within the Indian Finance Centre, you will play a key role in providing accounting services to our Teradata organisations in APAC or America (which may change to other regions as per business requirements). Working with the Team Leader and other team members, you will be responsible for the delivery of accurate and timely accounting information. This will often involve close collaboration with other specialized departments.

Minimum Requirements:
- Bachelor’s degree in Commerce
- 0 to 1 year of experience within any Indian company
- Strong in accounting concepts & principles
- Solid familiarity with Microsoft Office products and Outlook
- Intermediate to advanced Excel skills
- Experience in bank reconciliations preferred
- Excellent organization & time management skills
- Strong attention to detail & ability to multi-task
- Solid problem-solving and decision-making skills
- Good command of English
- Energetic, results-oriented, with a “can do” attitude
- Ready to work in any shift, especially US shifts

What You’ll Bring:
- Ability to collaborate and partner with other team members and BUs to provide an overall superior level of service
- Ability to “take the lead” in researching and resolving issues, as needed
- Ability to take ownership of special projects and effectively deliver positive results
- Technical and comprehensive knowledge of Finance & Accounting systems and processing

Why We Think You’ll Love Teradata:
- We prioritize a people-first culture because we know our people are at the very heart of our success.
- We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work.
- We focus on well-being because we care about our people and their ability to thrive both personally and professionally.
- We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

#LI-NT1
Posted 1 month ago
0 years
0 Lacs
India
On-site
Greetings: This is an IMMEDIATE NEED for a SQL/SSIS developer; we need folks who can start immediately. Locals preferred.

Must have at least 8+ years of prior hands-on experience developing SQL Server 2012-based applications using industry standards, specifically SQL Server Integration Services (SSIS).

Primary responsibilities:
- Writing packages, stored procedures, triggers, ad-hoc queries, and process documentation.
- Troubleshooting and optimizing the performance of packages and stored procedures (see the sketch after this posting).
- Must be detail-oriented to succeed in this role.

Required Skills:
- Must have extensive experience with SQL Server Integration Services.
- Should have worked with DB2 or Teradata.
- Strong design skills; able to understand structures.
- Expertise in data mappings.
- Excellent knowledge of databases, data warehouse/modeling concepts, and understanding of business intelligence.
- Strong knowledge of creating SSIS packages using transformations.
- Must have extensive experience with stored procedures, triggers, views, and functions.
- Experience building complex reports using Query/Report.
- Query performance analysis and tuning.
- Excellent problem-solving and critical-thinking skills.
- Excellent analytical skills, documentation, and ability to work in teams and/or independently.
- Fast learner with excellent documentation habits.

Job Type: Full-time
Schedule: Day shift, Monday to Friday
Work Location: In person
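A minimal sketch of exercising a SQL Server stored procedure from Python via pyodbc, of the kind a package developer might use when troubleshooting; the DSN, procedure name, and tables are hypothetical placeholders:

```python
# Minimal sketch: call a SQL Server stored procedure and verify its
# effect. Connection string, proc, and table names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=sqlhost;"
    "DATABASE=staging;Trusted_Connection=yes;"
)
cur = conn.cursor()

# Parameterized EXEC avoids string concatenation and SQL injection.
cur.execute("EXEC dbo.usp_load_customers @batch_date = ?", "2024-06-01")
conn.commit()

cur.execute("SELECT COUNT(*) FROM dbo.customers_stg")
print("rows staged:", cur.fetchone()[0])
```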
Posted 1 month ago
2.0 years
7 - 9 Lacs
Hyderābād
On-site
Our Company
At Teradata, we believe that people thrive when empowered with better information. That’s why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers—and our customers’ customers—to make better, more confident decisions. The world’s top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What You’ll Do:
In this role, you will engage in designing, coding, testing, and documenting fixes for major software component bugs, following a standard development process. You will gain in-depth skills in the existing product by reverse engineering the code and developing a solid understanding of its design, to support both enhancements and bug fixes. Projects will involve multiple team members, and the candidate will be expected to collaborate closely within and across teams. Primary responsibilities include triaging reported issues and resolving bugs across various Teradata database areas.

Who You’ll Work With:
Our team consists of approximately ten engineers with exceptional coding and debugging skills in Teradata. You will be required to collaborate closely within and between groups. This team plays a crucial role in diminishing defects/bugs in various areas of the Teradata database core, as well as triaging and fixing database frontline activities. All team members report to the senior engineering manager.

What Makes You a Qualified Candidate:
- B.E/B.Tech/M.Tech in Computer Science
- Minimum of 2 to 4 years of experience in sustaining/development of medium to large projects
- Strong experience in programming with C, C++ and data structures

What You'll Bring:
- Strong experience in programming with C, C++ and data structures
- Excellent creative, analytical, problem-solving and strong debugging skills
- Experience with GDB debugging
- Ability to independently come up with algorithmic solutions based on complexity and performance considerations
- RDBMS knowledge and SQL
- Good working knowledge of Linux
- Experience with or knowledge of the design and construction of database engine software, including SQL
- Understanding of and experience with complex parallel software or system programming is an added advantage
- Works well in a team environment; good software development skills including design specification, coding, and testing

#LI-SK3
Posted 1 month ago
2.0 - 4.0 years
6 - 8 Lacs
Gurgaon
On-site
Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together. Description United's Digital Technology team is comprised of many talented individuals all working together with cutting-edge technology to build the best airline in the history of aviation. Our team designs, develops and maintains massively scaling technology solutions brought to life with innovative architectures, data analytics, and digital solutions. Job overview and responsibilities Provides data-driven audit engagements within the Internal Audit department. This position will primarily work with Audit management and Project Leads to develop risk-based audit programs, ensure audit teams obtain data related to scope, develop testing and data analytics procedures that provide more efficient and extensive population coverage, and design data visualization dashboards to accompany published audit reports. Plan, coordinate, conduct, and document audits/reviews in accordance with Internal Audit and IPPF Standards. Demonstrates project management skills using available tools and technology, and escalates project management issues as needed. Drives the innovative use of analytics through direct participation in all phases of audits (planning, fieldwork and reporting). Independently lead internal and external stakeholder meetings; identify and document risks and the control environment. Ability to independently manage key internal and external stakeholders. Design and support digital dashboards that visualize audit results and findings. Participates in report writing and suggests remediation plans for identified risks through collaborative discussion with stakeholders. Prepare and present to IA leadership the results, recommendations, and conclusions of any analytics performed. Maintain functionality and manage departmental access to analytics-specific software and hardware, including the Spotfire library and private SQL servers. This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc. Qualifications What’s needed to succeed (Minimum Qualifications): Bachelor’s degree. Project management experience and the ability to manage multiple projects in parallel. Familiar with data analytics and visualization tools such as Power BI, Alteryx, Python (including Pandas and Jupyter), Power Query. Knowledge of and skill in applying auditing principles and practices. Experience using database querying tools and able to write queries and procedures using Teradata SQL and/or Microsoft TSQL. 2-4 years of extensive experience in risk management, data analytics and technology functions. Ability to quickly develop an understanding of business processes, risks and controls, and drive useful insights and analytics. Ability to identify and develop solutions to streamline or automate testing procedures. Ability to document and communicate control deficiencies in a clear, precise, and actionable manner. Willing to accommodate U.S. work hours when needed. Willingness and ability to travel globally when required. Must be legally authorized to work in India for any employer without sponsorship. Must be fluent in English (written and spoken). Successful completion of interview required to meet job qualification. Reliable, punctual attendance is an essential function of the position. What will help you propel from the pack (Preferred Qualifications): Quantitative field like Math, Statistics, Operations Research and/or MBA preferred. Experience in modeling/machine learning.
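As one concrete illustration of the Teradata SQL querying this role involves, a full-population audit test that flags potential duplicate payments might look like the following; schema and column names are hypothetical:

-- Flag potential duplicate payments across the full population
-- (hypothetical schema; illustrative sketch only).
SELECT vendor_id,
       invoice_amt,
       payment_dt,
       payment_id
FROM   ap.payments
QUALIFY COUNT(*) OVER (PARTITION BY vendor_id, invoice_amt, payment_dt) > 1
ORDER  BY vendor_id, payment_dt;

Teradata's QUALIFY clause filters on the windowed count directly, which avoids a self-join and keeps the test readable when covering an entire population rather than a sample.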
Posted 1 month ago
0 years
2 - 4 Lacs
Noida
Remote
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Principal Consultant – Production Support As an IT Operations analyst, you will be involved in 24x7 L2 production support in an onshore-offshore model, receiving, analyzing, and identifying solutions for all types of priority production issues/tickets. You will be the primary technical go-to resource, work in shifts on a rotational basis, and may have to provide on-call support in a 24x7 environment, including during non-business hours when needed. You should be able to handle technical issues independently, and must be hands-on and proactive, with a flexible schedule and the ability to adapt quickly to organizational needs. Specific responsibilities include (but are not limited to) supporting the production support team, their work assignments, and service levels; designing and executing process improvements; and undertaking tasks as needed to allow the production support manager to focus on overall service improvements. You will be key in identifying remediation tasks and implementing solutions to reduce the number of production incidents, interacting with business users for requirement clarification and resolution of reported incidents, and performing root cause analysis for major incidents. You will also provide workaround solutions for production batch issues, assign tickets to team members, track fixes and verify break-fixes, perform root cause analysis to arrive at permanent fixes that prevent recurrence, and drive process improvements for system stability and performance. Ensure compliance with corporate standards, policies and regulations (SOX, PHI, PII, etc.). This role will collaborate with teams spanning multiple business units. Offshore (all on a rotational basis): Monday-Friday, rotation between two 9-hour shifts (6:30 AM-3:30 PM IST or 12:30 PM-9:30 PM IST; work from office, India). Saturday-Sunday and US Eastern holidays: rotation among three shifts (6:30 AM-3:30 PM IST, 12:30 PM-9:30 PM IST, or 9:15 PM-6:30 AM IST; work from office, India). Responsibilities Troubleshoot batch events in the technologies listed below. Perform analysis and code deployment for recurring problems, provide recommendations, and work with outside teams to identify/implement solutions. Perform analysis and provide recommendations to enhance support processes such as documentation and/or monitoring. Experience in production support/maintenance in an onshore-offshore model environment. Must have proven strong analytical skills and a positive attitude toward analyzing, recovering, and fixing issues. Must have hands-on development experience and the ability to deep dive into issues in the related technologies: Unix/Linux shell scripting, PL/SQL, Informatica 10.4 (PowerCenter); additionally: SQL Developer/Toad, PuTTY, Control-M scheduling, Teradata SQL Assistant, Teradata Viewpoint, SAP, etc. Release/deployment, operational readiness, and production governance experience is a must. Assure quality, security and compliance requirements are met for supported areas. Must be a team player; this position will work closely with other vendors and internal technical partners. Analyze performance trends and recommend process improvements to ensure SLAs are met. Must be a self-starter and a quick learner/adopter of new technologies required to meet project needs and support day-to-day activities. Ability to understand complex problems, identify root causes and remain goal-oriented within a dynamic environment. Working experience in the Retail/Pharmacy area is a plus. Willing to learn new technologies as needed. Proven high performer, demonstrated by consistent high-performance reviews and exceptional customer service management. Basic understanding of application development/architecture, change management, incident and problem management is a plus. No remote/work from home preferred. Qualifications we seek in you! Minimum Qualifications/Skills Bachelor's degree or foreign equivalent in Computer Science, Information Systems, Engineering, or related field is required. Working experience in the Retail/Pharmacy area is a plus. Willing to learn new technologies as needed. Preferred Qualifications/Skills Computer competency and literacy in Microsoft Windows applications, especially Excel, Word, PowerPoint, and Outlook, is a must. Strong interpersonal and communication skills, including the ability to interact at all levels of the organization in person and over email (senior leadership, business leadership, technical resources, 3rd-party vendors). Commitment to the shift schedule is a must. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com Follow us on X, Facebook, LinkedIn, and YouTube. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training. Job Principal Consultant Primary Location India-Noida Schedule Full-time Education Level Bachelor's / Graduation / Equivalent Job Posting Jun 27, 2025, 5:11:24 AM Unposting Date Ongoing Master Skills List Consulting Job Category Full Time
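To give a flavor of the batch triage described above, a support engineer might start from Teradata's query log to see what ran heaviest during a failed batch window. A minimal sketch, assuming DBQL query logging is enabled (the timestamps below are example values):

-- Surface the most expensive queries in a batch window from DBQL
-- (requires query logging to be enabled; example timestamps).
SELECT username,
       starttime,
       ampcputime,
       totaliocount
FROM   dbc.qrylogv
WHERE  starttime BETWEEN TIMESTAMP '2024-01-15 01:00:00'
                     AND TIMESTAMP '2024-01-15 06:00:00'
ORDER  BY ampcputime DESC;

From there, the usual path is correlating the offending query with the Control-M job that submitted it and checking Teradata Viewpoint for concurrent workload.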
Posted 1 month ago
7.0 years
0 Lacs
Noida
On-site
Req ID: 323472 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a MicroStrategy Developer to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). BI Sr. Developer The BI Sr. Developer is responsible for solution delivery, which includes understanding business requirements, analysis, design, development, documentation, training, and deployment of BI applications using tools like MicroStrategy and Alteryx. Job Responsibilities include: Participate in requirement gathering sessions. Design and develop reports and dashboards using MicroStrategy and Alteryx. Perform data analysis, including data collection, synthesis, and translation of results into concrete actionable solutions using Alteryx. Apply advanced skills, knowledge, and experience to design and develop efficient Alteryx workflows to meet customer needs. Maintain current BI processes and effectively address production issues/defects encountered. As part of unit and regression testing, validate reports and provide a detailed analysis of data. Participate in technology governance groups that define policies and best practices and make design decisions in ongoing projects. Mentor junior developers regarding best practices and technology stacks used to build the application. Work in collaborative environments that follow agile project management methodologies like XP and Scrum. Work closely with BI and EDW system administrators for code migrations and production support. Debug data quality issues by analyzing the upstream sources and provide guidance on resolutions. Work closely with DBAs to fix performance bottlenecks. Find and implement innovation and optimization opportunities, including automations. Provide Level 2 and Level 3 production support for various BI applications. Required Skills: 7+ years of experience in designing and developing business intelligence dashboards using MicroStrategy. 5+ years of experience developing production-grade Alteryx workflows. Ability to write, debug and optimize moderately complex SQL queries. Working experience in Teradata and Oracle, including performance tuning and debugging performance bottlenecks. Conceptual understanding of logical and physical data models is a must, and working experience is a plus. Ability to work in a 24x7 setting as either on-call or escalation contact. Excellent written and verbal communication skills with the ability to interact with senior business and technical IT management. Strong analytical and problem-solving abilities. Nice to have skills: MicroStrategy certifications Alteryx certifications SAS Tableau About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world.
NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
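Tying back to the SQL-tuning skill in the requirements above, one routine optimization is rewriting a subquery the optimizer handles poorly into a set-based join, then refreshing statistics. A minimal Teradata sketch with hypothetical tables:

-- Before: a subquery form that can produce a weak plan
-- (hypothetical tables; illustrative sketch only).
SELECT c.cust_id, c.cust_name
FROM   dw.customers c
WHERE  c.cust_id IN (SELECT o.cust_id
                     FROM   dw.orders o
                     WHERE  o.order_dt >= DATE '2024-01-01');

-- After: a distinct join the optimizer can hash or merge efficiently
SELECT DISTINCT c.cust_id, c.cust_name
FROM   dw.customers c
JOIN   dw.orders    o
  ON   o.cust_id  = c.cust_id
WHERE  o.order_dt >= DATE '2024-01-01';

-- Refresh stats on the join column so the optimizer picks a good plan
COLLECT STATISTICS COLUMN (cust_id) ON dw.orders;

Verifying the rewrite with EXPLAIN before and after is the usual way to confirm the plan actually improved.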
Posted 1 month ago
0 years
0 Lacs
Noida
On-site
Req ID: 328474 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an Informatica / PL/SQL / PowerCenter / IICS developer to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). Work as an ETL developer using Oracle SQL, PL/SQL, Informatica, Linux scripting, and Tidal. ETL code development, unit testing, source code control, technical specification writing and production implementation. Develop and deliver ETL applications using Informatica, Teradata, Oracle, Tidal and Linux. Interact with BAs and other teams for requirement clarification, query resolution, testing and sign-offs. Develop software that conforms to a design model within its constraints. Prepare documentation for design, source code, and unit test plans. Ability to work as part of a global development team. Should have good knowledge of the healthcare domain and data warehousing concepts. About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
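As a small sketch of the ETL work described above, here is the kind of upsert an Informatica mapping commonly pushes down to Oracle; table and column names are hypothetical, loosely themed on the healthcare domain the posting mentions:

-- Oracle SQL: typical ETL upsert from a staging table into a target
-- (hypothetical names; illustrative sketch only).
MERGE INTO dw.member_claims tgt
USING stg.member_claims src
   ON (tgt.claim_id = src.claim_id)
WHEN MATCHED THEN
  UPDATE SET tgt.claim_amt = src.claim_amt,
             tgt.update_dt = SYSDATE
WHEN NOT MATCHED THEN
  INSERT (claim_id, member_id, claim_amt, load_dt)
  VALUES (src.claim_id, src.member_id, src.claim_amt, SYSDATE);

In practice the same logic can live in the Informatica mapping itself or be executed as a post-session SQL task, depending on whether pushdown optimization is in play.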
Posted 1 month ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description: About Us* At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities, and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We are devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us! Global Business Services* Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services. Process Overview* Global Finance was set up in 2007 as a part of the CFO Global Delivery strategy to provide offshore delivery to Line of Business and Enterprise Finance functions. The capabilities hosted include Legal Entity Controllership, General Accounting & Reconciliations, Regulatory Reporting, Operational Risk and Control Oversight, Finance Systems Support, etc. Our group supports the Finance function for the Consumer Banking, Wealth and Investment Management business teams across Financial Planning and Analysis, period end close, management reporting and data analytics. Job Description* The candidate will be responsible for developing and validating dashboards and business reports using emerging technology tools like Tableau, Alteryx, etc. In addition, SQL experience is required for data management. The candidate will deliver complex and time-critical data mining and analytical projects for the Secured Lending product, and will also be responsible for analysis of data for decision making by senior leadership. The role covers data management, data extraction and upload, data validation, scheduling and process automation, report preparation, etc. The individual will play a key role in the team responsible for financial data reporting, ad-hoc reporting and data requirements, and data analytics and business analysis, and will manage multiple projects in parallel, ensuring adequate understanding of the requirements and delivering data-driven insights and solutions to complex business problems. These projects are time-critical, requiring the candidate to comprehend and evaluate the strategic business drivers and bring in efficiencies through automation of existing reporting packages and code.
The work would be a mix of standard and ad-hoc deliverables based on dynamic business requirements. Technical competency is essential to build processes which ensure data quality and completeness across all projects / requests related to business. The core responsibility of this individual is process management to achieve sustainable, accurate and well controlled results. Candidate should have a clear understanding of the end-to-end process (including its purpose) and a discipline of managing and continuously improving those processes. Responsibilities* Preparation and maintenance of various KPI reporting for the Secured Lending business, including performing data- or business-driven deep dive analysis. Understand business requirements and translate those into deliverables. Support the business on periodic and ad-hoc projects related to Secured Lending products. Develop and maintain code for data extraction, manipulation, and summarization on tools such as SQL and emerging technologies like Tableau and Alteryx. Design solutions, generate actionable insights, optimize existing processes, build tool-based automations, and ensure overall program governance. Managing and improving the work: develop full understanding of the work processes, continuous focus on process improvement through simplification, innovation, and use of emerging technology tools, and understanding data sourcing and transformation. Managing risk: managing and reducing risk, proactively identify risks, issues and concerns and manage controls to help drive responsible growth (ex: compliance, procedures, data management, etc.), establish a risk culture to encourage early escalation and self-identification of issues. Effective communication: deliver transparent, concise, and consistent messaging while influencing change in the teams. Extremely good with numbers, with the ability to present various business/finance metrics, detailed analysis, and key observations to senior business leaders. Requirements* Education* - Master’s/Bachelor’s degree in Information Technology/Computer Science/MCA with 10+ years of relevant work experience. MBA would be a preferred qualification. Experience Range* 10+ years of relevant work experience in data analytics & reporting, business analysis & financial reporting in the banking industry. Exposure to consumer banking businesses would be an added advantage. Experience around Secured Lending reporting & analytics would be preferable. Foundational skills* Strong abilities in data extraction, data manipulation and business analysis, and strong financial acumen. Strong computer skills, including MS Excel, Teradata SQL, and emerging technologies like Alteryx and Tableau. Prior banking and financial services industry experience, preferably retail banking/wealth management. Strong business problem-solving skills, and ability to deliver on analytics projects independently, from initial structuring to final presentation. Strong communication skills (both verbal and written), interpersonal skills, and relationship management skills to navigate the complexities of aligning stakeholders, building consensus, and resolving conflicts. Querying data from multiple sources. Experience in data extraction, transformation & loading using SQL, including troubleshooting. Proven ability to manage multiple and often competing priorities in a global environment. Manages operational risk by building strong processes and quality control routines. Data Quality and Governance: ability to clean, validate and ensure data accuracy and integrity.
Desired skills* Ability to effectively manage multiple priorities under pressure, deliver, and adapt to change. Able to work in a fast-paced, deadline-oriented environment. Management of multiple stakeholders. Attention to detail: strong focus on data accuracy and documentation. Work Timings* 1.30 pm to 10.30 pm (will require stretching 7-8 days in a month to meet critical deadlines) Job Location* Mumbai
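To make the data-quality and validation expectations concrete, here is a simple completeness check a reporting analyst might run before refreshing a KPI dashboard; a sketch in Teradata SQL with hypothetical staging and mart tables:

-- Basic completeness check before publishing KPI data
-- (hypothetical schema; illustrative sketch only).
SELECT s.load_dt,
       s.src_rows,
       t.tgt_rows,
       s.src_rows - COALESCE(t.tgt_rows, 0) AS row_gap
FROM   (SELECT load_dt, COUNT(*) AS src_rows
        FROM   stage.secured_lending_txn
        GROUP  BY load_dt) s
LEFT JOIN
       (SELECT load_dt, COUNT(*) AS tgt_rows
        FROM   mart.secured_lending_txn
        GROUP  BY load_dt) t
  ON   t.load_dt = s.load_dt
WHERE  s.src_rows <> COALESCE(t.tgt_rows, 0);

Any nonzero row_gap flags a load date to investigate before the numbers reach a dashboard or senior leadership.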
Posted 1 month ago