
1357 Teradata Jobs - Page 50

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Greater Kolkata Area

On-site

Summary

Position Summary: Strategy & Analytics, AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
- Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
- Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions
- Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements

AI & Data Testing Consultant

The position suits individuals who have demonstrated the ability to work effectively in a fast-paced, high-volume, deadline-driven environment. An ETL tester is responsible for testing and validating the accuracy and completeness of data extracted, transformed, and loaded (ETL) from various sources into target systems. ETL testers work closely with ETL developers, data analysts, and other stakeholders to ensure the quality of data and the reliability of ETL processes.

Education and Experience

Education: B.Tech/M.Tech/MCA/MS/MBA. We require experienced ETL testers (Informatica PowerCenter) with 2-5 years of experience and the skills below.

Required Skills
- Strong in data warehouse testing, both ETL and BI
- Strong database knowledge: Oracle, SQL Server, Teradata, and Snowflake
- Strong SQL skills, with experience writing complex data validation queries (see the sketch after this listing)
- Experience working in an Agile environment
- Experience creating test strategies, release-level test plans, and test cases
- Developing and maintaining test data for ETL testing
- Designing and executing test cases for ETL processes and data integration
- Good knowledge of Rally, Jira, and HP ALM
- Experience in ETL automation and data validation using Python
- Documenting test results and communicating ETL testing status to stakeholders

Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits: At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose: Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development: From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 300067
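For context on the kind of work the Required Skills describe, here is a minimal, hypothetical sketch of a Python-driven data validation check of the sort an ETL tester might automate: it reconciles a row count and an aggregate between a staging table and its warehouse target over any DB-API connection (Oracle, SQL Server, Teradata, or Snowflake drivers all expose this interface). All table and column names are invented for illustration.

```python
# Minimal ETL validation sketch: compare source vs. target measures.
# Table/column names are hypothetical; `conn` is any DB-API connection.

CHECKS = {
    "row_count":  ("SELECT COUNT(*) FROM stg_orders",
                   "SELECT COUNT(*) FROM dw_fact_orders"),
    "amount_sum": ("SELECT SUM(order_amount) FROM stg_orders",
                   "SELECT SUM(order_amount) FROM dw_fact_orders"),
}

def run_checks(conn) -> bool:
    """Run each source/target query pair and report PASS/FAIL.

    Returns True only if every measure matches exactly.
    """
    cur = conn.cursor()
    all_ok = True
    for name, (src_sql, tgt_sql) in CHECKS.items():
        cur.execute(src_sql)
        src = cur.fetchone()[0]
        cur.execute(tgt_sql)
        tgt = cur.fetchone()[0]
        ok = src == tgt
        all_ok = all_ok and ok
        print(f"{name}: source={src} target={tgt} {'PASS' if ok else 'FAIL'}")
    return all_ok
```

In practice such checks are usually parameterized per mapping document and wired into the test framework (e.g., run from Rally/Jira-tracked test cases), but the reconcile-and-compare core stays the same.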

Posted 2 months ago

Apply

0 years

0 Lacs

Greater Kolkata Area

On-site

Summary

Position Summary: Strategy & Analytics, AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
- Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
- Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions
- Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements

AI & Data Testing Consultant

The position suits individuals who have demonstrated the ability to work effectively in a fast-paced, high-volume, deadline-driven environment. An ETL tester is responsible for testing and validating the accuracy and completeness of data extracted, transformed, and loaded (ETL) from various sources into target systems. ETL testers work closely with ETL developers, data analysts, and other stakeholders to ensure the quality of data and the reliability of ETL processes.

Education and Experience

Education: B.Tech/M.Tech/MCA/MS/MBA. We require experienced ETL testers (Informatica PowerCenter) with 2-5 years of experience and the skills below.

Required Skills
- Strong in data warehouse testing, both ETL and BI
- Strong database knowledge: Oracle, SQL Server, Teradata, and Snowflake
- Strong SQL skills, with experience writing complex data validation queries
- Experience working in an Agile environment
- Experience creating test strategies, release-level test plans, and test cases
- Developing and maintaining test data for ETL testing
- Designing and executing test cases for ETL processes and data integration
- Good knowledge of Rally, Jira, and HP ALM
- Experience in ETL automation and data validation using Python
- Documenting test results and communicating ETL testing status to stakeholders

Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits: At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose: Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development: From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 300070

Posted 2 months ago

Apply

4.0 - 6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Foreword: At iGreenTree, we're passionate about empowering energy and utility providers with innovative IT solutions. With deep domain knowledge and a dedication to innovation, we help our clients stay ahead of the curve in a rapidly changing industry. Whether you need IT consulting, application development, system integration, or digital transformation services, our team of experts can deliver the right solution for your business. Partner with iGreenTree to unleash the power of technology and achieve sustainable growth in today's dynamic landscape.

Who We Are Looking For: An ideal candidate who demonstrates in-depth knowledge of RDBMS concepts and is experienced in writing complex queries and data integration processes in SQL/T-SQL and NoSQL. This individual will help design, develop, and implement new and existing applications.

Roles and Responsibilities:
- Review existing database designs and data management procedures and provide recommendations for improvement.
- Provide subject matter expertise in the design of database schemas and perform data modeling (logical and physical models) for product feature enhancements as well as extending analytical capabilities.
- Develop technical documentation as needed.
- Architect, develop, validate, and communicate Business Intelligence (BI) solutions such as dashboards, reports, KPIs, instrumentation, and alerting tools.
- Define data architecture requirements for cross-product integration within and across cloud-based platforms.
- Analyze, architect, develop, validate, and support integrating data into SaaS platforms (ERP, CRM, etc.) from external data sources: files (XML, CSV, XLS, etc.), APIs (REST, SOAP), and RDBMS (see the sketch after this listing).
- Perform thorough analysis of complex data and recommend actionable strategies.
- Effectively translate data modeling and BI requirements into the design process.
- Design Big Data platforms, i.e., tool selection, data integration, and data preparation for predictive modeling.

Required Skills:
- Minimum of 4-6 years of experience in data modeling (conceptual, logical, and physical data models).
- 2-3 years of experience in extraction, transformation, and loading (ETL) work using data migration tools such as Talend, Informatica, or DataStage.
- 4-6 years of experience as a database developer in Oracle, MS SQL, or another enterprise database, with a focus on building data integration processes.
- Exposure to a NoSQL technology, preferably MongoDB.
- Experience processing large data volumes, indicated by experience with Big Data platforms (Teradata, Netezza, Vertica, Cloudera, Hortonworks, SAP HANA, Cassandra, etc.).
- Understanding of data warehousing concepts and decision support systems.
- Ability to handle sensitive and confidential material and adhere to worldwide data security requirements.
- Experience writing documentation for design and feature requirements.
- Experience developing data-intensive applications on cloud-based architectures and infrastructures such as AWS and Azure.
- Excellent communication and collaboration skills.
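As a minimal illustration of the file-to-SaaS integration work this listing describes, the sketch below reads a CSV extract and posts each row to a hypothetical CRM REST endpoint. The URL, field names, and auth scheme are all assumptions for illustration, not any particular vendor's API.

```python
import csv
import requests

API_URL = "https://example-crm.invalid/api/v1/accounts"  # hypothetical endpoint

def load_accounts(path: str, token: str) -> None:
    """Read a CSV extract and push each row into a (hypothetical) CRM REST API."""
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            resp = requests.post(
                API_URL,
                json={
                    "external_id": row["account_id"],    # hypothetical CSV columns
                    "name": row["account_name"],
                },
                headers={"Authorization": f"Bearer {token}"},
                timeout=30,
            )
            resp.raise_for_status()  # fail fast on integration errors
```

A production integration would add batching, retries, and reconciliation counts, but the read-transform-post loop is the core pattern.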

Posted 2 months ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Summary

Position Summary: Strategy & Analytics, AI & Data

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
- Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
- Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions
- Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements

AI & Data Testing Consultant

The position suits individuals who have demonstrated the ability to work effectively in a fast-paced, high-volume, deadline-driven environment. An ETL tester is responsible for testing and validating the accuracy and completeness of data extracted, transformed, and loaded (ETL) from various sources into target systems. ETL testers work closely with ETL developers, data analysts, and other stakeholders to ensure the quality of data and the reliability of ETL processes.

Education and Experience

Education: B.Tech/M.Tech/MCA/MS/MBA. We require experienced ETL testers (Informatica PowerCenter) with 2-5 years of experience and the skills below.

Required Skills
- Strong in data warehouse testing, both ETL and BI
- Strong database knowledge: Oracle, SQL Server, Teradata, and Snowflake
- Strong SQL skills, with experience writing complex data validation queries
- Experience working in an Agile environment
- Experience creating test strategies, release-level test plans, and test cases
- Developing and maintaining test data for ETL testing
- Designing and executing test cases for ETL processes and data integration
- Good knowledge of Rally, Jira, and HP ALM
- Experience in ETL automation and data validation using Python
- Documenting test results and communicating ETL testing status to stakeholders

Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits: At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose: Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development: From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 300070

Posted 2 months ago

Apply

8.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Overview

The Data Analyst will partner closely with business and S&T teams to prepare final analysis reports for stakeholders, enabling them to make important decisions based on various facts and trends, and will lead data requirement, source analysis, data analysis, data transformation, and reconciliation activities. This role interacts with the DG, DPM, EA, DE, EDF, PO, and D&AI teams on historical data requirements and sources the data for the Mosaic AI program to scale the solution to new markets.

Responsibilities
- Lead data requirement, source analysis, data analysis, data transformation, and reconciliation activities.
- Partner with the FP&A Product Owner and associated business SMEs to understand and document business requirements and associated needs.
- Analyze business data requirements and translate them into a data design that satisfies local, sector, and global requirements.
- Use automated tools to extract data from primary and secondary sources.
- Use statistical tools to identify, analyse, and interpret patterns and trends in complex data sets to support diagnosis and prediction.
- Work with engineers and business teams to identify process improvement opportunities and propose system modifications.
- Proactively identify impediments and look for pragmatic and constructive solutions to mitigate risk.
- Be a champion for continuous improvement and drive efficiency.

Preference will be given to candidates having a functional understanding of financial concepts (P&L, balance sheet, cash flow, operating expense) and experience modelling data and designing data flows.

Qualifications
- Bachelor of Technology from a reputed college.
- Minimum 8-10 years of relevant work experience in data modelling/analytics.
- Minimum 5-6 years of experience navigating data in Azure Databricks, Synapse, Teradata, or similar database technologies.
- Expertise in Azure (Databricks, Data Factory, Data Lake Storage Gen2).
- Proficiency in SQL and PySpark to analyse data, for both development validation and operational support, is critical (see the sketch after this listing).
- Exposure to GenAI.
- Good communication and presentation skills are a must for this role.
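Since the role calls out SQL and PySpark for development validation, here is a brief, hypothetical PySpark sketch of the kind of trend summary it implies, as it might run in an Azure Databricks notebook. The table and column names are invented for illustration.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession is already provided; this is a no-op there.
spark = SparkSession.builder.getOrCreate()

# Hypothetical sales history table registered in the metastore.
sales = spark.table("finance.sales_history")

monthly = (
    sales
    .withColumn("month", F.date_trunc("month", F.col("invoice_date")))
    .groupBy("market", "month")
    .agg(
        F.sum("net_revenue").alias("net_revenue"),        # trend measure
        F.countDistinct("invoice_id").alias("invoices"),  # volume measure
    )
    .orderBy("market", "month")
)
monthly.show(20, truncate=False)
```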

Posted 2 months ago

Apply

2.0 - 4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what's next. Let's define tomorrow, together.

Description

At United, we care about our customers. To be the best airline in aviation history, we need to deliver the best service to our customers. And it takes a whole team of dedicated customer-focused advocates to make it happen! From our Contact Center to customer analytics, insights, innovation, and everything in between, the Customer Experience team delivers outstanding service and helps us run a more customer-centric and dependable airline.

Job Overview and Responsibilities

The team is looking for a well-rounded individual with a passion for data and analytics. The role supports the team by gathering data, conducting analyses, verifying reports, and assisting in ad-hoc decision support. Excellent time management, strong analytical capabilities, and communication skills are key to success in this role.
- Extract and analyze data from relational databases and reporting systems (see the sketch after this listing)
- Proactively identify problems and opportunities and perform root cause analysis/diagnosis leading to business impact
- Identify issues, create hypotheses, and translate data into meaningful insights; present recommendations to key decision makers
- Develop and maintain reports, analyses, and dashboards to drive key business decisions
- Build predictive models and analyze results for dissemination of insights to leadership of the numerous operations groups at United
- Prepare presentations that summarize data and help facilitate decision-making for business partners and the senior leadership team

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd, a wholly owned subsidiary of United Airlines Inc.

Qualifications

What's needed to succeed (Minimum Qualifications):
- Bachelor's degree required
- 2-4 years of analytics-related experience
- Proficiency in Microsoft Excel and PowerPoint
- Competence in querying and manipulating relational databases via Teradata SQL Assistant, Microsoft SQL Server, or Oracle SQL Developer
- Proficiency in at least one quantitative analysis tool: Python or R
- Familiarity with one or more reporting tools: Tableau, Oracle OBIEE, or TIBCO Spotfire
- Detail-oriented, thorough, and analytical, with a desire for continuous improvement
- Adept at juggling several projects and initiatives simultaneously through appropriate prioritization
- Fluency in written and spoken English
- Must be legally authorized to work in India for any employer without sponsorship
- Successful completion of an interview is required to meet job qualifications
- Reliable, punctual attendance is an essential function of the position

What will help you propel from the pack (Preferred Qualifications): MBA preferred

GGN00002020
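To ground the Teradata querying requirement, here is a short, hypothetical sketch of pulling a summary from Teradata into pandas using the teradatasql driver (Teradata's DB-API driver for Python). The host, credential handling, and table are placeholders.

```python
import os

import pandas as pd
import teradatasql  # pip install teradatasql

QUERY = """
SELECT flight_date, COUNT(*) AS bookings   -- hypothetical table/columns
FROM   cust.bookings
GROUP  BY flight_date
ORDER  BY flight_date
"""

with teradatasql.connect(
    host="tdprod.example.com",           # placeholder host
    user="analyst",
    password=os.environ["TD_PASSWORD"],  # never hard-code credentials
) as con:
    df = pd.read_sql(QUERY, con)

print(df.describe())  # quick distribution check before deeper analysis
```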

Posted 2 months ago

Apply

8.0 - 10.0 years

0 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Overview

The Data Analyst will partner closely with business and S&T teams to prepare final analysis reports for stakeholders, enabling them to make important decisions based on various facts and trends, and will lead data requirement, source analysis, data analysis, data transformation, and reconciliation activities. This role interacts with the DG, DPM, EA, DE, EDF, PO, and D&AI teams on historical data requirements and sources the data for the Mosaic AI program to scale the solution to new markets.

Responsibilities
- Lead data requirement, source analysis, data analysis, data transformation, and reconciliation activities.
- Partner with the FP&A Product Owner and associated business SMEs to understand and document business requirements and associated needs.
- Analyze business data requirements and translate them into a data design that satisfies local, sector, and global requirements.
- Use automated tools to extract data from primary and secondary sources.
- Use statistical tools to identify, analyse, and interpret patterns and trends in complex data sets to support diagnosis and prediction.
- Work with engineers and business teams to identify process improvement opportunities and propose system modifications.
- Proactively identify impediments and look for pragmatic and constructive solutions to mitigate risk.
- Be a champion for continuous improvement and drive efficiency.

Preference will be given to candidates having a functional understanding of financial concepts (P&L, balance sheet, cash flow, operating expense) and experience modelling data and designing data flows.

Qualifications
- Bachelor of Technology from a reputed college.
- Minimum 8-10 years of relevant work experience in data modelling/analytics.
- Minimum 5-6 years of experience navigating data in Azure Databricks, Synapse, Teradata, or similar database technologies.
- Expertise in Azure (Databricks, Data Factory, Data Lake Storage Gen2).
- Proficiency in SQL and PySpark to analyse data, for both development validation and operational support, is critical.
- Exposure to GenAI.
- Good communication and presentation skills are a must for this role.

Posted 2 months ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About This Role

Wells Fargo is seeking a Financial Crimes Associate Manager.

In This Role, You Will
- Supervise entry- to mid-level roles in transactional or tactical, less complex tasks and processes to ensure timely completion, quality, and compliance
- Manage the implementation of procedures, controls, analytics, and trend analysis to ensure identification, prevention, execution, detection, investigation, recovery, and government and internal reporting of financial crime activity
- Maintain awareness of financial crimes activity companywide and ensure all issues are proactively addressed and escalated where necessary
- Ensure compliance with regulatory requirements such as the Bank Secrecy Act, USA PATRIOT Act, and FATCA
- Identify opportunities for process improvement and risk control development in less complex functions
- Manage a risk-based financial crimes program or functional area with low to moderate risk and complexity
- Lead implementation of multiple complex initiatives with low to moderate risk
- Make supervisory and tactical decisions and resolve issues related to team supervision, work allocation, and daily operations under the direction of functional area management
- Leverage interpretation of policies, procedures, and compliance requirements
- Collaborate and consult with peers, colleagues, and managers
- Ensure coordination with the team, line of business, other business units, Audit, and regulators on risk-related topics
- Manage allocation of people and financial resources for Financial Crimes
- Mentor and guide talent development of direct reports and assist in hiring talent

Required Qualifications:
- 2+ years of Financial Crimes, Operational Risk, Fraud, Sanctions, Anti-Bribery, or Corruption experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
- 1+ years of leadership experience

Desired Qualifications:
- Hands-on experience as a people manager in a financial institution managing a team of financial crime data quality, model development, and financial reporting analysts
- Experience managing procedures and controls for teams that support data analytics, data platforms, and reporting
- Demonstrated experience working in Anti-Money Laundering (AML) programs and/or data platform management for large financial institutions
- Experience mentoring and guiding talent development of direct reports and assisting in hiring talent
- Hands-on experience with BSA/AML/OFAC laws and regulations and with financial crimes, regulatory, and fraud-specific data
- Knowledge of the financial crime data quality activities and controls required for large financial institutions
- Demonstrated experience creating reports and dashboards from large data sets, including non-standard data (desired but not mandatory)
- Team-handling experience for UAT/regression testing on data outputs involving complex data mapping designs
- Hands-on experience as a people manager leading a team of 15+ data analysts responsible for conducting data quality analysis to support financial crime data modelling (see the sketch after this listing)
- Prior experience enhancing AML monitoring models and systems, including Oracle/Actimize, using tools such as advanced SQL, SAS, and Python
- Experience managing a team of technical analysts working on AML technology leveraging SAS/SQL/Python/Teradata, technical data validation tools, and relevant AML technologies (Norkom, Actimize, Oracle FCCM, etc.) to support technical project deliveries
- Experience supporting AML technology initiatives during new AML product implementations and during technology migrations for transaction monitoring and fraud detection
- Experience handling large technology transformation programs with phased delivery of technical deliverables/features of a transaction monitoring and fraud detection system, the associated data validation/transformation logic, and MIS reporting using Power BI/Tableau
- Experience managing a team to deliver AML/BSA technology projects, including AML model development, transaction monitoring model validation and enhancement, critical data element identification, data quality/data validation, threshold testing, MIS reporting using Tableau/Power BI, and AI/ML-based AML technology development and testing

Posting End Date: 28 May 2025. The job posting may come down early due to the volume of applicants.

We Value Equal Opportunity

Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic.

Employees support our focus on building strong customer relationships balanced with a strong risk-mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples, and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants With Disabilities: To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy: Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment and Hiring Requirements: Third-party recordings are prohibited unless authorized by Wells Fargo. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Reference Number: R-448974
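As a small illustration of the hands-on data quality analysis this listing references, here is a hypothetical pandas sketch that applies a few completeness and validity rules to a transactions extract. The column names and rules are invented for illustration and are not drawn from any Wells Fargo system.

```python
import pandas as pd

def dq_checks(txns: pd.DataFrame) -> pd.DataFrame:
    """Run simple completeness/validity rules on a transactions extract
    (hypothetical columns) and return one PASS/FAIL row per rule."""
    rules = {
        "txn_id is unique":         txns["txn_id"].is_unique,
        "amount is non-null":       txns["amount"].notna().all(),
        "amount is positive":       (txns["amount"] > 0).all(),
        "currency is 3 characters": txns["currency"].str.len().eq(3).all(),
    }
    return pd.DataFrame(
        {"rule": list(rules),
         "status": ["PASS" if ok else "FAIL" for ok in rules.values()]}
    )

# Example: df = pd.read_csv("txn_extract.csv"); print(dq_checks(df))
```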

Posted 2 months ago

Apply

5.0 - 14.0 years

0 Lacs

Kochi, Kerala, India

On-site

Skill: Ab Initio
Experience: 5 to 14 years
Location: Kochi (walk-in on 14th June)

Responsibilities
- Ab Initio skills: graph development, Ab Initio standard environment parameters, GDE (PDL, MFS concepts), EME basics, SDLC, data analysis
- Database: proficient in SQL; expert in DB load/unload utilities; relevant experience in Oracle, DB2, or Teradata (preferred)
- UNIX: shell scripting (a must); Unix utilities such as sed, awk, perl, and python
- Scheduling knowledge (Control-M, Autosys, Maestro, TWS, ESP)
- Project profile: at least 2-3 source systems, multiple targets, simple business transformations with daily and monthly schedules
- Expected to produce LLDs, work with testers, work with the PMO, develop graphs and schedules, and provide 3rd-level support
- Hands-on development experience with various Ab Initio components such as Rollup, Scan, Join, Partition by Key, Partition by Round-robin, Gather, Merge, Interleave, and Lookup
- Experience in finance, ideally with capital markets products
- Experience in development and support of complex frameworks to handle multiple data ingestion patterns (e.g., messaging, files, hierarchical/polymorphic XML structures), conformance of data to a canonical model, and curation and distribution of data

QA Resource
- Data modeling experience creating CDMs, LDMs, and PDMs using tools such as ERWIN, PowerDesigner, or MagicDraw
- Detailed knowledge of capital markets, including derivatives products (IRS, CDS, options), structured products, and fixed income products
- Knowledge of Jenkins and CI/CD concepts
- Knowledge of scheduling tools such as Autosys and Control Center
- Demonstrated understanding of how Ab Initio applications and systems interact with the underlying hardware ecosystem
- Experience working in an agile project development lifecycle
- Strong in-depth knowledge of databases and database concepts; DB2 knowledge is a plus

Posted 2 months ago

Apply

3.0 - 4.0 years

5 - 6 Lacs

Hyderabad

Work from Office

Overview

This role serves as an Associate Analyst on the GTM Data Analytics COE project development team. It is a go-to resource for building and maintaining the key reports, data pipelines, and advanced analytics needed to bring insights to light for senior leaders and for sector and field end users.

Responsibilities

The COE's core competencies are mastery of data visualization, data engineering, data transformation, and predictive and prescriptive analytics.
- Enhance data discovery, processes, testing, and data acquisition from multiple platforms.
- Apply detailed knowledge of PepsiCo's applications for root-cause problem-solving.
- Ensure compliance with PepsiCo IT governance rules and design best practices.
- Participate in project planning with stakeholders to analyze business opportunities and define end-to-end processes.
- Translate operational requirements into actionable data presentations.
- Support data recovery and resolution of data integrity issues between the business and PepsiCo IT.
- Provide performance reporting for the GTM function, including ad-hoc requests, using internal shipment data systems.
- Develop on-demand reports and scorecards for improved agility and visualization.
- Collate and analyze large data sets to extract meaningful insights on performance trends and opportunities.
- Present insights and recommendations to the GTM leadership team regularly.
- Manage expectations through effective communication with headquarters partners.
- Ensure timely and accurate data delivery per service level agreements (SLAs).
- Collaborate across functions to gather insights for action-oriented analysis.
- Identify and act on opportunities to improve work delivery.
- Implement process improvements, reporting standardization, and optimal technology use.
- Foster an inclusive and collaborative environment.
- Provide baseline support for monitoring SPA mailboxes, work intake, and other ad-hoc requests and queries.

Qualifications
- Undergraduate degree in Business or a related technology field
- 3-4 years of working experience in Power BI
- 1-2 years of working experience in SQL and Python

Preferred qualifications:
- Information technology or analytics experience is a plus
- Familiarity with Power BI/Tableau, Python, SQL, Teradata, Azure, and MS Fabric
- Analytical, critical thinking, and problem-solving skills, as well as great attention to detail
- Strong time management skills and the ability to multitask, set priorities, and plan

Posted 2 months ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Overview

As Senior Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse while satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy that supports future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities
- Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies.
- Govern data design/modeling: documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers (see the sketch after this listing).
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data structures and cleansing.
- Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and the mapping of source system data to canonical data stores for research, analysis, and productization.

Qualifications
- 8+ years of overall technology experience, including at least 4+ years of data modeling and systems architecture.
- 3+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
- 4+ years of experience developing enterprise data models.
- Experience building solutions in the retail or supply chain space.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience with integration of multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and with deployment and CI tools.
- Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).
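A minimal, hypothetical sketch of a source-to-target mapping (STM) of the kind mentioned above, captured as plain data so that a SELECT can be generated for ETL developers. All table and column names are invented and do not reflect any PepsiCo model.

```python
# A source-to-target mapping captured as data. In practice STMs live in
# spreadsheets or modeling tools; representing one as a dict makes the
# translation to ETL SQL mechanical. Names are hypothetical.
STM = {
    "target_table": "dw.dim_customer",
    "source_table": "stg.customers",
    "columns": {
        # target column   -> source expression
        "customer_key":  "customer_id",
        "customer_name": "TRIM(cust_name)",
        "country_code":  "UPPER(country)",
    },
}

def build_select(stm: dict) -> str:
    """Generate the extract SELECT implied by the mapping."""
    cols = ",\n       ".join(
        f"{expr} AS {target}" for target, expr in stm["columns"].items()
    )
    return f"SELECT {cols}\nFROM   {stm['source_table']}"

print(build_select(STM))
```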

Posted 2 months ago

Apply

15.0 years

0 Lacs

India

Remote

Job Title: Data Engineer Lead - AEP
Location: Remote
Experience Required: 12-15 years overall; 8+ years in data engineering; 5+ years leading data engineering teams; cloud migration and consulting experience (GCP preferred)

Job Summary:
We are seeking a highly experienced and strategic Lead Data Engineer with a strong background in leading data engineering teams, modernizing data platforms, and migrating ETL pipelines and data warehouses to Google Cloud Platform (GCP). You will work directly with enterprise clients, architecting scalable data solutions and ensuring successful delivery in high-impact environments.

Key Responsibilities:
- Lead end-to-end data engineering projects, including cloud migration of legacy ETL pipelines and data warehouses to GCP (BigQuery); see the sketch after this listing.
- Design and implement modern ELT/ETL architectures using Dataform, Dataplex, and other GCP-native services.
- Provide strategic consulting to clients on data platform modernization, governance, and data quality frameworks.
- Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders.
- Define and enforce data engineering best practices, coding standards, and CI/CD processes.
- Mentor and manage a team of data engineers; foster a high-performance, collaborative team culture.
- Monitor project progress, ensure delivery timelines, and manage client expectations.
- Engage in technical pre-sales and solutioning, driving excellence in consulting delivery.

Technical Skills & Tools:
- Cloud platforms: strong experience with Google Cloud Platform (GCP), particularly BigQuery, Dataform, Dataplex, Cloud Composer, Cloud Storage, and Pub/Sub.
- ETL/ELT tools: Apache Airflow, Dataform, dbt (if applicable).
- Languages: Python, SQL, shell scripting.
- Data warehousing: BigQuery, Snowflake (optional), traditional DWs (e.g., Teradata, Oracle).
- DevOps: Git, CI/CD pipelines, Docker.
- Data modeling: dimensional modeling, Data Vault, star/snowflake schemas.
- Data governance and lineage: Dataplex, Collibra, or equivalent tools.
- Monitoring and logging: Stackdriver, DataDog, or similar.

Preferred Qualifications:
- Proven consulting experience with premium clients or Tier 1 consulting firms.
- Hands-on experience leading large-scale cloud migration projects.
- GCP certification(s) (e.g., Professional Data Engineer, Cloud Architect).
- Strong client communication, stakeholder management, and leadership skills.
- Experience with agile methodologies and project management tools like JIRA.
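To make the BigQuery migration work concrete, here is a brief, hypothetical sketch using the google-cloud-bigquery Python client to load a CSV extract from Cloud Storage into a staging table, the first hop in a typical legacy-warehouse migration. The project, bucket, and table names are placeholders.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

# Load a CSV extract (e.g., exported from a legacy Teradata/Oracle DW)
# into a BigQuery staging table; downstream transformation can then be
# handled by Dataform or scheduled SQL.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,                 # header row in the extract
    autodetect=True,                     # infer the staging schema
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/orders.csv",        # hypothetical bucket
    "my-analytics-project.staging.orders",      # hypothetical table
    job_config=job_config,
)
load_job.result()  # block until the load completes (raises on failure)
```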

Posted 2 months ago

Apply

10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Description

The leader must demonstrate an ability to anticipate, understand, and act on evolving customer needs, both stated and unstated. Through this, the candidate must create a customer-centric organization and use innovative thinking frameworks to foster value-added relations. With the right balance of bold initiatives, continuous improvement, and governance, the leader must adhere to the delivery standards set by the client and eClerx by leveraging knowledge of market drivers and competition to effectively anticipate trends and opportunities. Besides, the leader must demonstrate a capacity to transform, align, and energize organization resources, and take appropriate risks to lead the organization in a new direction. As a leader, the candidate must build engaged and high-impact direct, virtual, and cross-functional teams, and take the lead in raising the performance bar, building capability, and bringing out the best in their teams. By collaborating and forging partnerships both within and outside the functional area, the leader must work towards a shared vision and achieve positive business outcomes.

Associate Program Manager Role and Responsibilities
- Represent eClerx in client pitches and external forums.
- Own platform and expertise through various COE activities and content generation to promote practice and business development.
- Lead continuous research and assessments to explore the best and latest platforms, approaches, and methodologies.
- Contribute to developing the practice area through best practices, ideas, and points of view in the form of white papers and micro-articles.
- Lead/partner in multi-discipline assessments and workshops at client sites to identify new opportunities.
- Lead key projects and provide development/technical leadership to junior resources.
- Drive solution design and build to ensure scalability, performance, and reuse.
- Design robust data architectures, considering performance, data quality, scalability, and data latency requirements.
- Recommend and drive consensus around preferred data integration and platform approaches, including Azure and Snowflake.
- Anticipate data bottlenecks (latency, quality, speed) and recommend appropriate remediation strategies.

This is a hands-on position with a significant development component, and the ideal candidate is expected to lead the technical development and delivery of highly visible and strategic projects.

Technical and Functional Skills
- Bachelor's degree, with at least 2-3 large-scale cloud implementations within the retail, manufacturing, or technology industries.
- 10+ years of overall experience with data management and cloud engineering.
- Expertise in Azure Cloud, Azure Data Lake, Databricks, Snowflake, Teradata, and compatible ETL technologies.
- Strong attention to detail and the ability to collaborate with multiple parties, including analysts, data subject matter experts, external labs, etc.

About Us

At eClerx, we serve some of the largest global companies, including 50 of the Fortune 500. Our clients call upon us to solve their most complex problems and deliver transformative insights. Across roles and levels, you get the opportunity to build expertise, challenge the status quo, think bolder, and help our clients seize value.

About The Team

eClerx is a global leader in productized services, bringing together people, technology, and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics, and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.

eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.

Posted 2 months ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Essential Qualifications: Bachelor of Engineering/Technology, preferably in IT, Computer Engineering, or Electronics & Communications, or ME/MTech/MS; an MBA from a top institute is preferred.

Job Overview: PGP Glass Pvt Ltd is looking for a Head of Analytics to join our growing team of data analytics experts and manage the processes and people responsible for accurate data collection, processing, modeling, and analysis. The ideal candidate has a knack for seeing solutions in sprawling data sets and the business mindset to convert insights into strategic opportunities for our company. The Head of Analytics will work closely with leaders across Product (Engineering, Manufacturing/Integrated Supply Chain), Finance, Sales, and Marketing to support and implement high-quality, data-driven decisions, and will ensure data accuracy and consistent reporting by designing and creating optimal processes and procedures for analytics employees to follow. They will use advanced data modeling, predictive modeling, and analytical techniques to interpret key findings from company data and leverage these insights into initiatives that support business outcomes. The right person for the job will apply their exhaustive knowledge of data analysis to solving real-world problems faced by our company and finding opportunities for improvement across multiple projects, teams, business units, and external partners/vendors.

Responsibilities for Head of Analytics
- Lead cross-functional projects using advanced data modeling and analysis techniques to discover insights that guide strategic decisions and uncover optimization opportunities
- Build, develop, and maintain data models, reporting systems, data automation systems, dashboards, and performance metrics that support key business decisions
- Design and build technical processes to address business issues
- Oversee the design and delivery of reports and insights that analyze business functions and key operations and performance metrics
- Recruit, train, develop, and supervise analyst-level employees
- Ensure accuracy of data and deliverables of reporting employees with comprehensive policies and processes
- Manage and optimize processes for data intake, validation, mining, and engineering, as well as modeling, visualization, and communication deliverables
- Examine, interpret, and report results of analytical initiatives to stakeholders in leadership, technology, sales, marketing, and product teams
- Oversee the data/report request process: tracking requests submitted, prioritization, approval, etc.
- Develop and implement quality controls and departmental standards to meet quality standards, organizational expectations, and regulatory requirements
- Anticipate future demands of initiatives related to people, technology, budget, and business within the department, and design/implement solutions to meet these needs
- Organize and drive successful completion of data insight initiatives through effective management of analyst and data employees and effective collaboration with stakeholders
- Communicate results and business impacts of insight initiatives to stakeholders within and outside of the company

Travel: Required to travel to sites as per business needs.

Qualifications for Head of Analytics
- Working knowledge of data mining principles: predictive analytics, mapping, and collecting data from multiple data systems, both on-premises and cloud-based
- Strong SQL skills, with the ability to perform effective querying involving multiple tables and subqueries
- Understanding of and experience using analytical concepts and statistical techniques: hypothesis development, designing tests/experiments, analyzing data, drawing conclusions, and developing actionable recommendations for business units
- Experience and knowledge of statistical modeling techniques: GLM multiple regression, logistic regression, log-linear regression, variable selection, etc. (see the regression sketch after this listing)
- Experience writing advanced SAS code statements, models, and macros
- Experience working with and creating databases and dashboards using all relevant data to inform decisions
- Experience using analytics techniques to contribute to company growth efforts, increasing revenue and other key business outcomes
- Strong problem-solving, quantitative, and analytical abilities
- Strong ability to plan and manage numerous processes, people, and projects simultaneously
- Excellent communication, collaboration, and delegation skills

We're looking for someone with at least 5 years of experience in a position monitoring, managing, manipulating, and drawing insights from data, and at least 3 years of experience leading a team. The right candidate will also be proficient and experienced with the following tools/programs:
- Strong programming skills with querying languages: SQL, SAS, etc.
- Experience with big data tools: Teradata, Aster, Hadoop, Spark, etc.
- Experience with data visualization tools: Tableau, RAW, Chart.js, etc.
- Experience with Adobe Analytics and other analytics tools
- Experience with structured and unstructured data sets and databases
- Python, C, C++, Java, or other programming languages
- Experience with Excel, Word, and PowerPoint
- Cloud analytical platforms: AWS Redshift, Azure Synapse, Azure Stream Analytics
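For the statistical techniques the qualifications name, here is a brief, hypothetical logistic regression sketch in Python with scikit-learn. The dataset, file name, and columns are invented for illustration; the listing's SAS equivalent would use PROC LOGISTIC.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical defect-risk dataset: numeric process features plus a
# binary outcome column named "defect".
df = pd.read_csv("production_batches.csv")
X = df.drop(columns=["defect"])
y = df["defect"]

# Hold out a stratified test set so the evaluation is honest.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out ROC AUC: {auc:.3f}")
```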

Posted 2 months ago

Apply

12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Experience: 12+ years
Location: Pune & Hyderabad
Notice period: immediate to 30 days

Your responsibilities will include:
- Defining the right migration architectures to demise First Direct legacy systems and migrate to the interim or target strategic solutions.
- Assuring the interim and/or target solutions are fit for purpose, cost effective, and aligned with bank standards.
- Gaining formal approval for any interim solutions through the appropriate governance, documenting the architecture, associated risks, and technical debt as necessary.
- Effectively communicating the interim/strategic solution architectures to the business and technology colleagues responsible for materialising the design.
- Working with Enterprise/Domain Architecture colleagues to identify the interim/target solution to support legacy system demise demand.
- Collaborating with solution analysts and legacy subject matter experts to work out the cost required for system demise and uplift.

The ideal candidate for this role will have the following experience and qualifications:
- Proven experience as a solution architect or in a similar role
- The ability to communicate with a wide range of IT specialists, suppliers, and business stakeholders
- The capability to drive, govern, and support establishing architecture functions
- The capability to develop system documentation by studying the source code
- Strong understanding of architecture and IT governance frameworks
- In-depth understanding of the following technologies: DB2, Teradata, BigQuery on GCP, and IBM z/OS Mainframe

Posted 2 months ago

Apply

12.0 years

0 Lacs

Delhi, India

On-site

Experience: 12+ years
Location: Pune & Hyderabad
Notice period: immediate to 30 days

Your responsibilities will include:
- Defining the right migration architectures to demise First Direct legacy systems and migrate to the interim or target strategic solutions.
- Assuring the interim and/or target solutions are fit for purpose, cost effective, and aligned with bank standards.
- Gaining formal approval for any interim solutions through the appropriate governance, documenting the architecture, associated risks, and technical debt as necessary.
- Effectively communicating the interim/strategic solution architectures to the business and technology colleagues responsible for materialising the design.
- Working with Enterprise/Domain Architecture colleagues to identify the interim/target solution to support legacy system demise demand.
- Collaborating with solution analysts and legacy subject matter experts to work out the cost required for system demise and uplift.

The ideal candidate for this role will have the following experience and qualifications:
- Proven experience as a solution architect or in a similar role
- The ability to communicate with a wide range of IT specialists, suppliers, and business stakeholders
- The capability to drive, govern, and support establishing architecture functions
- The capability to develop system documentation by studying the source code
- Strong understanding of architecture and IT governance frameworks
- In-depth understanding of the following technologies: DB2, Teradata, BigQuery on GCP, and IBM z/OS Mainframe

Posted 2 months ago

Apply

12.0 years

0 Lacs

Greater Kolkata Area

On-site

Experience: 12+ years
Location: Pune & Hyderabad
Notice period: immediate to 30 days

Your responsibilities will include:
- Defining the right migration architectures to demise First Direct legacy systems and migrate to the interim or target strategic solutions.
- Assuring the interim and/or target solutions are fit for purpose, cost effective, and aligned with bank standards.
- Gaining formal approval for any interim solutions through the appropriate governance, documenting the architecture, associated risks, and technical debt as necessary.
- Effectively communicating the interim/strategic solution architectures to the business and technology colleagues responsible for materialising the design.
- Working with Enterprise/Domain Architecture colleagues to identify the interim/target solution to support legacy system demise demand.
- Collaborating with solution analysts and legacy subject matter experts to work out the cost required for system demise and uplift.

The ideal candidate for this role will have the following experience and qualifications:
- Proven experience as a solution architect or in a similar role
- The ability to communicate with a wide range of IT specialists, suppliers, and business stakeholders
- The capability to drive, govern, and support establishing architecture functions
- The capability to develop system documentation by studying the source code
- Strong understanding of architecture and IT governance frameworks
- In-depth understanding of the following technologies: DB2, Teradata, BigQuery on GCP, and IBM z/OS Mainframe

Posted 2 months ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Experience: 12+ Years
Location: Pune & Hyderabad
Notice Period: Immediate to 30 days

Your responsibilities will include:
Defining the right migration architectures to demise First Direct legacy systems and migrate them to interim or target strategic solutions
Assuring the interim and/or target solutions are fit for purpose, cost effective, and aligned with bank standards
Gaining formal approval for any interim solutions through the appropriate governance, documenting the architecture, associated risks, and technical debt as necessary
Effectively communicating the interim/strategic solution architectures to business and technology colleagues responsible for materialising the design
Working with Enterprise/Domain Architecture colleagues to identify the interim/target solution to support legacy system demise demand
Collaborating with solution analysts and legacy subject matter experts to estimate the cost required for system demise and uplift

The ideal candidate for this role will have the following experience and qualifications:
Proven experience as a solution architect or in a similar role
The ability to communicate with a wide range of IT specialists, suppliers, and business stakeholders
The capability to drive, govern, and support the establishment of architecture functions
The capability to develop system documentation by studying the source code
Strong understanding of architecture and IT governance frameworks
In-depth understanding of the following technologies: DB2, Teradata, BigQuery on GCP, and IBM z/OS Mainframe

Posted 2 months ago

Apply

12.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Experience: 12+ Years
Location: Pune & Hyderabad
Notice Period: Immediate to 30 days

Your responsibilities will include:
Defining the right migration architectures to demise First Direct legacy systems and migrate them to interim or target strategic solutions
Assuring the interim and/or target solutions are fit for purpose, cost effective, and aligned with bank standards
Gaining formal approval for any interim solutions through the appropriate governance, documenting the architecture, associated risks, and technical debt as necessary
Effectively communicating the interim/strategic solution architectures to business and technology colleagues responsible for materialising the design
Working with Enterprise/Domain Architecture colleagues to identify the interim/target solution to support legacy system demise demand
Collaborating with solution analysts and legacy subject matter experts to estimate the cost required for system demise and uplift

The ideal candidate for this role will have the following experience and qualifications:
Proven experience as a solution architect or in a similar role
The ability to communicate with a wide range of IT specialists, suppliers, and business stakeholders
The capability to drive, govern, and support the establishment of architecture functions
The capability to develop system documentation by studying the source code
Strong understanding of architecture and IT governance frameworks
In-depth understanding of the following technologies: DB2, Teradata, BigQuery on GCP, and IBM z/OS Mainframe

Posted 2 months ago

Apply

12.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Experience: 12+ Years
Location: Pune & Hyderabad
Notice Period: Immediate to 30 days

Your responsibilities will include:
Defining the right migration architectures to demise First Direct legacy systems and migrate them to interim or target strategic solutions
Assuring the interim and/or target solutions are fit for purpose, cost effective, and aligned with bank standards
Gaining formal approval for any interim solutions through the appropriate governance, documenting the architecture, associated risks, and technical debt as necessary
Effectively communicating the interim/strategic solution architectures to business and technology colleagues responsible for materialising the design
Working with Enterprise/Domain Architecture colleagues to identify the interim/target solution to support legacy system demise demand
Collaborating with solution analysts and legacy subject matter experts to estimate the cost required for system demise and uplift

The ideal candidate for this role will have the following experience and qualifications:
Proven experience as a solution architect or in a similar role
The ability to communicate with a wide range of IT specialists, suppliers, and business stakeholders
The capability to drive, govern, and support the establishment of architecture functions
The capability to develop system documentation by studying the source code
Strong understanding of architecture and IT governance frameworks
In-depth understanding of the following technologies: DB2, Teradata, BigQuery on GCP, and IBM z/OS Mainframe

Posted 2 months ago

Apply

5.5 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Title: Associate
Key Skills: Mainframe, Teradata, DataStage

Mainframe and Teradata DataStage Associate

Minimum Degree Required: Bachelor's degree in Computer Science/IT or a relevant field
Degree Preferred: Master's degree in Computer Science/IT or a relevant field
Minimum Years of Experience: 2 - 5.5 years
Certifications Required: NA

Job Summary:
We are seeking a skilled and experienced IT professional to join our team as a Mainframe and Teradata DataStage Associate. The successful candidate will be responsible for developing, maintaining, and optimizing ETL processes using IBM DataStage, as well as managing and supporting data operations on Mainframe and Teradata platforms.

Key Responsibilities:
Design, develop, and implement ETL processes using IBM DataStage to support data integration and transformation requirements
Manage and maintain data on Mainframe and Teradata systems, ensuring data integrity and performance optimization
Collaborate with business analysts and stakeholders to understand data requirements and translate them into technical specifications
Troubleshoot and resolve issues related to ETL processes and data management on Mainframe and Teradata platforms
Monitor and tune the performance of ETL jobs and database queries to ensure optimal performance
Develop and maintain documentation related to ETL processes, data flows, and system configurations
Participate in code reviews and ensure adherence to best practices and coding standards
Provide support for data migration and integration projects, ensuring timely and accurate data delivery
Stay updated with the latest developments in Mainframe, Teradata, and DataStage technologies and recommend improvements

Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field
Experience working with Mainframe systems, including COBOL, JCL, and VSAM
Experience with Teradata, including SQL development and performance tuning
Strong proficiency in IBM DataStage, with experience designing and implementing complex ETL processes
Solid understanding of data warehousing concepts and database design principles
Excellent problem-solving skills and the ability to work under pressure in a fast-paced environment
Strong communication and collaboration skills, with the ability to work effectively in cross-functional teams
Experience with version control systems and agile development practices is a plus

Preferred Qualifications:
Experience with additional ETL tools and data integration technologies
Knowledge of other database systems such as Oracle, SQL Server, or DB2
Experience with cloud data solutions and platforms
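To illustrate the "ensuring data integrity" responsibility above, here is a minimal sketch of a post-load check a Teradata-side tester or developer might run after a DataStage job completes. It assumes the `teradatasql` Python driver; the host, credentials, and the staging table and key column are hypothetical.

```python
# Minimal post-load integrity check: fail loudly if the load produced
# duplicate business keys. All connection details and names are placeholders.
import teradatasql

DUPLICATE_CHECK = """
SELECT order_id, COUNT(*) AS n
FROM stg.orders          -- hypothetical staging table loaded by DataStage
GROUP BY order_id
HAVING COUNT(*) > 1
"""

with teradatasql.connect(host="td.example.com",
                         user="etl_qa", password="***") as con:
    with con.cursor() as cur:
        cur.execute(DUPLICATE_CHECK)
        dupes = cur.fetchall()

if dupes:
    # Surfacing the failure lets the scheduler mark the job red for triage
    raise SystemExit(f"{len(dupes)} duplicate keys found, e.g. {dupes[0]}")
print("No duplicate keys - load looks consistent")
```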

Posted 2 months ago

Apply

0.0 - 4.0 years

0 Lacs

Gurugram, Haryana

On-site

Location: Gurugram, Haryana, India
Category: Corporate
Job Id: GGN00002020
Procurement / Strategic Sourcing / Purchasing
Job Type: Full-Time
Posted Date: 05/26/2025

Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what's next. Let's define tomorrow, together.

Description
At United, we care about our customers. To be the best airline in aviation history, we need to deliver the best service to our customers. And it takes a whole team of dedicated customer-focused advocates to make it happen! From our Contact Center to customer analytics, insights, innovation, and everything in between, the Customer Experience team delivers outstanding service and helps us run a more customer-centric and dependable airline.

Job overview and responsibilities
The team is currently looking for a well-rounded individual who has a passion for data and analytics. The role involves supporting the team by gathering data, conducting analyses, verifying reports, and assisting in ad-hoc decision support. Excellent time management, strong analytical capabilities, and communication skills are key to success in this role.
Extract and analyze data from relational databases and reporting systems
Proactively identify problems and opportunities and perform root cause analysis/diagnosis leading to business impact
Identify issues, create hypotheses, and translate data into meaningful insights; present recommendations to key decision makers
Develop and maintain reports, analyses, and dashboards to drive key business decisions
Build predictive models and analyze results for dissemination of insights to leadership of the numerous operations groups at United
Prepare presentations that summarize data and help facilitate decision-making for business partners and the senior leadership team

This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc.

Qualifications
What's needed to succeed (Minimum Qualifications):
Bachelor's degree required
2-4 years of analytics-related experience
Must be proficient in Microsoft Excel and PowerPoint
Must be competent in querying and manipulating relational databases via Teradata SQL Assistant, Microsoft SQL Server, or Oracle SQL Developer
Must be proficient in at least one quantitative analysis tool - Python / R
Must be familiar with one or more reporting tools - Tableau / Oracle OBIEE / TIBCO Spotfire
Must be detail-oriented, thorough, and analytical with a desire for continuous improvement
Must be adept at juggling several projects and initiatives simultaneously through appropriate prioritization
Must be fluent in English (written and spoken)
Must be legally authorized to work in India for any employer without sponsorship
Successful completion of the interview is required to meet job qualifications
Reliable, punctual attendance is an essential function of the position

What will help you propel from the pack (Preferred Qualifications):
MBA preferred
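The core loop in this role is "extract with SQL, summarize, report". The hedged sketch below shows that loop in Python with pandas; sqlite3 stands in for the warehouse connection, and the table and columns are invented for illustration only.

```python
# Extract-and-summarize sketch: SQL pull into pandas, then a report-ready
# aggregate. sqlite3 is a self-contained stand-in for the real warehouse.
import sqlite3
import pandas as pd

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE bookings (route TEXT, revenue REAL);
    INSERT INTO bookings VALUES
        ('DEL-EWR', 1200.0), ('DEL-EWR', 950.0), ('BOM-ORD', 780.0);
""")

# Extract with SQL, then aggregate into a frame ready for a dashboard or deck
df = pd.read_sql("SELECT route, revenue FROM bookings", con)
report = df.groupby("route")["revenue"].agg(["sum", "mean", "count"]).reset_index()
print(report)
```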

Posted 2 months ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

The Applications Development Intermediate Programmer Analyst is an intermediate-level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Ab Initio Data Engineer
We are looking for an Ab Initio Data Engineer to design and build Ab Initio-based applications across the Data Integration, Governance & Quality domains for Compliance Risk programs. The individual will work with Technical Leads, Senior Solution Engineers, and prospective Application Managers to build applications, roll out and support production environments leveraging the Ab Initio tech stack, and ensure the overall success of their programs. The programs are high-visibility, fast-paced key initiatives that generally aim to acquire and curate data and metadata across internal and external sources, provide analytical insights, and integrate with other Citi systems.

Technical Stack:
Ab Initio 4.0.x software suite - Co>Op, GDE, EME, BRE, Conduct>It, Express>It, Metadata>Hub, Query>It, Control>Center, Easy>Graph
Big Data - Cloudera Hadoop, Hive, Yarn
Databases - Oracle 11G/12C, Teradata, MongoDB, Snowflake
Others - JIRA, ServiceNow, Linux, SQL Developer, AutoSys, and Microsoft Office

Responsibilities:
Design and build Ab Initio graphs (both continuous and batch) and Conduct>It plans, and integrate with the portfolio of Ab Initio software
Build web-service and RESTful graphs and create RAML or Swagger documentation
Demonstrate complete understanding and analytical ability of the Metadata Hub metamodel
Apply strong hands-on multifile-system-level programming, debugging, and optimization skills
Develop complex ETL applications
Apply good knowledge of RDBMS - Oracle - with the ability to write the complex SQL needed to investigate and analyze data issues
Write UNIX shell and Perl scripts
Build graphs interfacing with heterogeneous data sources - Oracle, Snowflake, Hadoop, Hive, AWS S3
Build application configurations for Express>It frameworks - Acquire>It, Spec-To-Graph, Data Quality Assessment
Build automation pipelines for Continuous Integration & Delivery (CI/CD), leveraging Testing Framework and JUnit modules, integrating with Jenkins, JIRA, and/or ServiceNow
Build Query>It data sources for cataloguing data from different sources
Parse XML, JSON, and YAML documents, including hierarchical models
Build and implement data acquisition and transformation/curation requirements in a data lake or warehouse environment, leveraging various Ab Initio components
Build AutoSys or Control>Center jobs and schedules for process orchestration
Build BRE rulesets for reformat, rollup, and validation use cases
Build SQL scripts on the database, perform performance tuning and relational model analysis, and perform data migrations
Identify performance bottlenecks in graphs and optimize them
Ensure the Ab Initio code base is appropriately engineered to maintain current functionality and development that adheres to performance optimization, interoperability standards and requirements, and compliance with client IT governance policies
Build regression and functional test cases, and write user manuals for various projects
Conduct bug fixing, code reviews, and unit, functional, and integration testing
Participate in the agile development process, and document and communicate issues and bugs relative to data standards
Pair up with other data engineers to develop analytic applications leveraging Big Data technologies: Hadoop, NoSQL, and in-memory data grids
Challenge and inspire team members to achieve business results in a fast-paced and quickly changing environment
Perform other duties and/or special projects as assigned

Qualifications:
Bachelor's degree in a quantitative field (such as Engineering, Computer Science, Statistics, Econometrics) and a minimum of 5 years of experience
Minimum 5 years of extensive experience in the design, build, and deployment of Ab Initio-based applications
Expertise in handling complex large-scale Data Lake and Warehouse environments
Hands-on experience writing complex SQL queries and exporting and importing large amounts of data using utilities

Education:
Bachelor's degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time

Citi is an equal opportunity and affirmative action employer. Qualified applicants will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Citigroup Inc. and its subsidiaries ("Citi") invite all qualified interested applicants to apply for career opportunities. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View the "EEO is the Law" poster, the EEO is the Law Supplement, the EEO Policy Statement, and the Pay Transparency Posting.
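One responsibility above is parsing hierarchical XML/JSON/YAML models into flat, curated records. Ab Initio graphs handle this with their own components; as a language-neutral illustration of the underlying denormalization step, here is the same idea in plain Python with a made-up payload (the field names and structure are invented).

```python
# Hierarchical-to-flat denormalization sketch: one output row per transaction,
# carrying the parent account attributes. The payload is entirely fictitious.
import json

payload = json.loads("""
{
  "account": {"id": "A-1", "owner": {"name": "Jane", "country": "US"}},
  "transactions": [
    {"ts": "2024-01-05", "amount": 120.5},
    {"ts": "2024-01-09", "amount": -40.0}
  ]
}
""")

def flatten(doc: dict) -> list[dict]:
    """Denormalize the nested document into one row per transaction."""
    base = {
        "account_id": doc["account"]["id"],
        "owner_name": doc["account"]["owner"]["name"],
    }
    return [{**base, **txn} for txn in doc["transactions"]]

for row in flatten(payload):
    print(row)
```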

Posted 2 months ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Years of Experience: Candidates with 4+ years of experience in developing and delivering scalable big data pipelines using Apache Spark and Databricks on AWS.

Position Requirements
Must Have:
Build and maintain scalable data pipelines using Databricks and Apache Spark
Develop and optimize ETL/ELT processes for structured and unstructured data
Knowledge of Lakehouse architecture for efficient data storage, processing, and analytics
Orchestrating ETL/ELT pipelines: design and manage data workflows using Databricks Workflows and the Jobs API
Work with AWS data services (S3, Lambda, CloudWatch) for seamless integration
Performance optimization: optimize queries using pushdown capabilities and indexing strategies
Work on data governance with Unity Catalog, security policies, and access controls
Monitor, troubleshoot, and improve Databricks jobs and clusters
Exposure to end-to-end implementation of migration projects to AWS Cloud
AWS and Python expertise with hands-on cloud development
Orchestration: Airflow
Code repositories: Git, GitHub
Strong SQL-writing skills
Cloud data migration: deep understanding of the process
Strong analytical, problem-solving, and communication skills

Good to Have Knowledge / Skills:
Experience in Teradata, DataStage, SSIS
Knowledge of Databricks Delta Live Tables
Knowledge of Delta Lake
Streaming: Kafka, Spark Streaming
CI/CD: Jenkins
IaC & Automation: Terraform for Databricks deployment

Professional and Educational Background:
BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA
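The pipeline pattern this listing describes - read raw data from S3, cleanse it with Spark, and persist a Delta table - can be sketched in a few lines of PySpark. The bucket names, paths, and columns below are placeholders; on a Databricks cluster the Delta format and a `spark` session are available out of the box, while elsewhere you would need the delta-lake package and S3 connector configured.

```python
# Minimal PySpark ETL sketch: raw JSON in S3 -> cleansed, partitioned Delta table.
# All paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/orders/")   # placeholder path

cleaned = (raw
           .filter(F.col("order_id").isNotNull())          # drop malformed rows
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .dropDuplicates(["order_id"]))                  # idempotent reloads

(cleaned.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("order_date")                         # prune by date at read time
        .save("s3://example-curated-bucket/orders_delta/"))
```

Partitioning by the date column is a common choice here because downstream queries in this kind of pipeline are usually date-bounded, letting Delta skip whole partitions.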

Posted 2 months ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Years of Experience: Candidates with 8+ years of experience in architecting and delivering scalable big data pipelines using Apache Spark and Databricks on AWS.

Position Requirements
Must Have:
Design, build, and maintain scalable data pipelines using Databricks and Apache Spark
Good knowledge of the Medallion Architecture in the Databricks Lakehouse
Develop and optimize ETL/ELT processes for structured and unstructured data
Implement Lakehouse architecture for efficient data storage, processing, and analytics
Orchestrating ETL/ELT pipelines: design and manage data workflows using Databricks Workflows and the Jobs API
Work with AWS data services (S3, Lambda, CloudWatch) for seamless integration
Performance optimization: optimize queries using pushdown capabilities and indexing strategies
Implement data governance with Unity Catalog, security policies, and access controls
Collaborate with data scientists, analysts, and engineers to enable advanced analytics
Monitor, troubleshoot, and improve Databricks jobs and clusters
Strong expertise in end-to-end implementation of migration projects to AWS Cloud
Awareness of data management concepts and data modelling
AWS and Python expertise with hands-on cloud development
Spark performance tuning: Core, SQL, and Streaming
Orchestration: Airflow
Code repositories: Git, GitHub
Strong SQL-writing skills
Cloud data migration: deep understanding of the process
Strong analytical, problem-solving, and communication skills

Good to Have Knowledge / Skills:
Experience in Teradata, DataStage, SSIS, Mainframe (COBOL, JCL, Zeke Scheduler)
Knowledge of Lakehouse Federation
Knowledge of Delta Lake
Knowledge of Databricks Delta Live Tables
Streaming: Kafka, Spark Streaming
CI/CD: Jenkins
IaC & Automation: Terraform for Databricks deployment
Knowledge of integrating third-party APIs with Databricks
Knowledge of the Transport & Mobility domain

Professional and Educational Background:
BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA
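Since this senior listing calls out the Medallion Architecture explicitly, here is a hedged sketch of the bronze -> silver -> gold flow it refers to: raw data is landed as-is, then cleansed and conformed, then aggregated for analytics. Table and path names are illustrative, and the three-part table names assume a Unity Catalog-style setup on a Databricks runtime where `spark` is provided.

```python
# Medallion-architecture sketch: bronze (raw), silver (cleansed), gold (aggregated).
# Paths and table names are placeholders for illustration only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # pre-created on Databricks

# Bronze: land raw events unmodified (append-only, schema-on-read)
bronze = spark.read.json("s3://example-landing/events/")
bronze.write.format("delta").mode("append").saveAsTable("main.bronze.events")

# Silver: cleanse, conform types, and deduplicate
silver = (spark.table("main.bronze.events")
          .filter(F.col("event_id").isNotNull())
          .withColumn("event_ts", F.to_timestamp("event_ts"))
          .dropDuplicates(["event_id"]))
silver.write.format("delta").mode("overwrite").saveAsTable("main.silver.events")

# Gold: business-level aggregate ready for BI and analytics
gold = (spark.table("main.silver.events")
        .groupBy(F.to_date("event_ts").alias("event_date"))
        .agg(F.count("*").alias("events")))
gold.write.format("delta").mode("overwrite").saveAsTable("main.gold.daily_events")
```

The layering matters for governance as much as performance: Unity Catalog permissions can then expose only the gold tables to analysts while restricting bronze and silver to the engineering team.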

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies