
1346 Teradata Jobs - Page 6

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

6.0 - 13.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Domain: Investment Banking (asset management, wealth management, capital markets, equity, fixed income). Experience: 6-13 years.
Job description:
- 7-9 years of experience as a Data Analyst, with at least 5 years supporting investment functions within the industry
- Hands-on experience with Vertica/Teradata for querying, performance optimization, and large-scale data analysis (see the sketch below)
- Advanced SQL skills; proficiency in Python is a strong plus
- Proven ability to write detailed source-to-target mapping documents and collaborate with technical teams on data integration
- Experience working in hybrid onshore-offshore team environments
- Deep understanding of data modeling concepts and experience working with relational and dimensional models
- Strong communication skills, with the ability to clearly explain technical concepts to non-technical audiences
- A strong understanding of statistical concepts, probability, accounting standards, financial statements (balance sheet, income statement, cash flow statement), and financial ratios
- Strong understanding of life insurance products and business processes across the policy lifecycle
- Investment principles: knowledge of different asset classes, investment strategies, and financial markets
- Quantitative finance: understanding of financial modeling, risk management, and derivatives
- Regulatory framework: awareness of relevant financial regulations and compliance requirements
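As an illustration of the Teradata querying and Python work described above, here is a minimal, hedged sketch using the open-source teradatasql driver; the host, credentials, schema and table names are hypothetical placeholders invented for this example, not details from the posting.

```python
# Minimal sketch: pull position-level data from Teradata and profile it in Python.
# Host, credentials, and table/column names below are hypothetical placeholders.
import teradatasql
import pandas as pd

SQL = """
SELECT portfolio_id,
       asset_class,
       SUM(market_value) AS total_mv
FROM   invest_dm.positions          -- hypothetical dimensional-model table
WHERE  as_of_date = DATE '2024-06-28'
GROUP  BY portfolio_id, asset_class
"""

with teradatasql.connect(host="td.example.com", user="analyst", password="***") as con:
    cur = con.cursor()
    cur.execute(SQL)
    rows = cur.fetchall()
    cols = [d[0] for d in cur.description]
    cur.close()

df = pd.DataFrame(rows, columns=cols)

# Quick sanity check of the result set before it feeds any source-to-target mapping work.
print(df.groupby("asset_class")["total_mv"].sum().sort_values(ascending=False))
```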

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

Chennai

On-site

Job Description: About us At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities, and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us! Global Business Services Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services. Process Overview Global Markets Technology & Operations provides end-to-end technology solutions for Markets business including Equity, Prime Brokerage, Interest Rates, Currencies, Commodities, Derivatives and Structured Products. Across all these products, solutions include architecture, design, development, change management, implementation and support using various enterprise technologies. In addition, GMT&O provides Sales, Electronic Trading, Trade Workflow, Pricing, and Market Risk, Middle office, Collateral Management, Credit Risk, Post-trade confirmation, Settlement and Client service processes for Trading, Capital Markets, and Wealth Management businesses. ERTF – CFO is responsible for the technology solutions and platforms that support Chief Financial Officer (CFO) Group, including Global Financial Control, Corporate Treasury, Financial Forecasting, Enterprise Cost Management, Investor Relations, and Line of Business Finance functions (BFO). Increased demand for integrated and streamlined Business Finance management solutions has resulted in a few initiatives. The initiatives span across Subledger Simplification. AML Detection Channel Platform(ADCP) application is an AML monitoring tool used by GFCC to identify any suspicious activity, like - Money laundering & Fraud which requires compliance review. AML Alert Reconciliation Process (ARL) is one other application used by GFCC which receives alerts from various Detection Channels (AML, Fraud, etc), removes alert noise, enriches alerts, obtains attributes, decides (rule based) if the alert meets criteria and transforms the alert into an event. Job Description: This job is responsible for developing and delivering complex requirements to accomplish business goals. 
Key responsibilities of the job include ensuring that software is developed to meet functional, non-functional and compliance requirements, and that solutions are well designed with maintainability, ease of integration and testing built in from the outset. Job expectations include a strong knowledge of development and testing practices common to the industry, as well as design and architectural patterns.
Responsibilities:
- Codes solutions and unit tests to deliver a requirement/story per the defined acceptance criteria and compliance requirements
- Designs, develops, and modifies architecture components, application interfaces, and solution enablers while ensuring principal architecture integrity is maintained
- Mentors other software engineers and coaches the team on Continuous Integration and Continuous Delivery (CI/CD) practices and the automation tool stack
- Executes story refinement, definition of requirements, and estimation of work necessary to realize a story through the delivery lifecycle
- Performs spikes/proofs of concept as necessary to mitigate risk or implement new ideas
- Automates manual release activities
- Designs, develops, and maintains automated test suites (integration, regression, performance)
Leadership expectations:
- Manager of Process & Data: Demonstrates and expects process knowledge, data-driven decisions, simplicity and continuous improvement.
- Enterprise Advocate & Communicator: Delivers clear and concise messages that motivate, convey the "why" and connect contributions to business results.
- Risk Manager: Leads and encourages the identification, escalation and resolution of potential risks.
- People Manager & Coach: Knows and develops team members through coaching and feedback.
- Financial Steward: Manages expenses and demonstrates an owner's mindset.
- Enterprise Talent Leader: Recruits, onboards and develops talent, and supports talent mobility for career growth.
- Driver of Business Outcomes: Delivers results through effective team management, structure, and routines.
Requirements:
- Education: BE/BTech/ME/MTech
- Certifications (if any): NA
- Experience range: 4-6 years
Foundational skills:
- Ability to work on multi-technology projects: PySpark, Teradata, SQL, Unix shell scripting (see the sketch below)
- Excellent oral/written communication and project management skills
Desired skills:
- Organized and able to multi-task in a fast-paced environment
- Highly motivated, able to work independently; a self-starter with strong problem-solving/analytical skills
- Excellent interpersonal skills; positive attitude; team player; flexible
- Willingness to learn and adapt to changes
Shift timings: 11:00 am to 8:00 pm. Location: Chennai
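The foundational skills above (PySpark, Teradata, SQL, Unix shell scripting) typically come together in pipeline code along the following lines. This is a hedged, illustrative sketch only: the JDBC URL, credentials, table names and output path are invented placeholders, not anything from the bank's actual AML/ARL systems.

```python
# Illustrative only: read a Teradata table into Spark over JDBC, aggregate it, and land the
# result as Parquet. Connection details, tables, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("alert-recon-sketch").getOrCreate()

alerts = (
    spark.read.format("jdbc")
    .option("url", "jdbc:teradata://td.example.com/DATABASE=aml_dm")  # placeholder host/db
    .option("driver", "com.teradata.jdbc.TeraDriver")                 # Teradata JDBC driver must be on the classpath
    .option("dbtable", "alerts")                                      # placeholder table
    .option("user", "svc_user")
    .option("password", "***")
    .load()
)

# De-duplicate noisy alerts and count them per detection channel and day.
daily = (
    alerts.dropDuplicates(["alert_id"])
    .groupBy("detection_channel", F.to_date("created_ts").alias("alert_date"))
    .count()
)

daily.write.mode("overwrite").parquet("/data/out/daily_alert_counts")  # placeholder path
spark.stop()
```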

Posted 1 week ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job description:
Role Purpose: The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.
Do:
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
Handle technical escalations:
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve the issues, escalate them to TA & SES in a timely manner
- Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance to contract SLAs
Build people capability:
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks
Deliver (performance parameters and measures):
1. Process - No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2. Team Management - Productivity, efficiency, absenteeism
3. Capability development - Triages completed, technical test performance
Mandatory Skills: Teradata. Experience: 5-8 years.
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills.
We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 week ago

Apply

0 years

6 - 9 Lacs

Calcutta

On-site

About EY GDS
Global Delivery Services (GDS) is a dynamic and truly global delivery network. Across our six locations, we work with teams from all EY service lines, geographies and sectors, and play a vital role in the delivery of the EY growth strategy. We operate from six countries and sixteen cities: Argentina (Buenos Aires), China (Dalian), India (Bangalore, Chennai, Gurgaon, Hyderabad, Kochi, Kolkata, Mumbai, Noida, Trivandrum), Philippines (Manila), Poland (Warsaw and Wroclaw) and UK (Manchester, Liverpool).
Careers in EY Global Delivery Services: Join a team of over 50,000 people, working across borders, to provide innovative and strategic business solutions to EY member firms around the world. Join one of our dynamic teams. From accountants to coders, we offer a wide variety of fulfilling career opportunities that span all business disciplines.
Our Consulting practice provides a differentiated focus on the key business themes to help our clients solve better questions around technology. Our vision is to be recognized as a leading provider of differentiated technology consulting services, harnessing new disruptive technology and alliances and attracting talented people to solve our clients' issues. It's an exciting time to join us and grow your career as a technology professional. A technology career is about far more than leading-edge innovations. It's about the application of these technologies in the real world to make a real, meaningful impact. We are looking for highly motivated, articulate individuals who have the skills to work across the technology lifecycle and are passionate about designing innovative solutions to solve complex business problems. Your career in Consulting can span these technology areas/service lines:
Digital Technologies: We are a globally integrated digital architecture and engineering team. Our mission is to deliver tailored, custom-built end-to-end solutions to our customers that are Digital, Cloud Native and Open Source. Our skills include experience design, UI development, Design Thinking, Architecture & Design, full-stack development (.Net/Java/SharePoint/Power Platform), and emerging technologies like Blockchain, IoT, AR/VR, Drones, Cloud and DevSecOps. We use industrialized techniques, built on top of agile methods, utilizing our global teams to deliver end-to-end solutions at the best unit-cost proposition.
Testing Services: We are the yardstick of quality software products. We break things to make the product stronger and more successful. We provide the entire gamut of testing services, including business/user acceptance testing. Hence this is a team with all-round skills such as functional, technical and process.
Data & Analytics: Data and Analytics is amongst the largest and most versatile practices within EY. Our sector and domain expertise combined with technical skills in data, cloud, advanced analytics and artificial intelligence differentiates us in the industry. Our talented team possesses cross-sector and cross-domain expertise and a wide array of skills in Information Management (IM), Business Intelligence (BI), Advanced Analytics (AA) and Artificial Intelligence (AI).
Oracle: We provide a one-stop solution for end-to-end project implementation enabled by Oracle and IBM products. We use proven methodologies, tools and accelerators to jumpstart and support large Risk and Finance Transformations. We develop solutions using various languages such as SQL or PL/SQL, Java, JavaScript, Python, IBM Maximo and other Oracle utilities.
We also provide consulting services for streamlining the current reporting process using various Enterprise Performance Management tools. SAP: By building on SAP’s S/4HANA digital core and cloud services, EY and SAP are working to help organizations leverage industry-leading technologies to improve operational performance. This collaboration helps drive digital transformation for our clients across areas including finance, human resources, supply chain and procurement. Our goal is to support clients as they initiate or undergo major transformation. Our capabilities span end-to-end solution implementation services from strategy and architecture to production deployment. EY supports clients in three main areas, Technology implementation support, Enterprise and Industry application implementation, Governance Risk Compliance (GRC) Technology. Banking and Capital Market Services: Banking and Capital Market Services companies are transforming their complex tax and finance functions with technologies such as AI and ML. With the right blend of core competencies, tax and finance personnel will shift to data, process and technology skills to service global clients on their Core Banking Platforms and support their business / digital transformation like Deposit system replacements, lending / leasing modernization, Cloud–native architecture (Containerization) etc. Wealth and Asset Management: We help our clients thrive in a transformative age by providing innovative services to global and domestic asset management clients to increase efficiency, effectiveness and manage the overall impact on bottom line profitability by leveraging the technology, data and digital teams. We do many operational efficiency programs and Technology Enabled Transformation to re-platform their front and Back offices with emerging technologies like AI, ML, Blockchain etc. Insurance Transformation: The current changing Macroeconomic trends continue to challenge Insurers globally. However, with disruptive technologies – including IoT, autonomous vehicles, Blockchain etc, we help companies through these challenges and create innovative strategies to transform their business through technology enabled transformation programs. We provide end to end services to Global P&C (General), Life and Health Insurers, Reinsurers and Insurance brokers. Cyber Security : The ever-increasing risk and complexity surrounding cybersecurity and privacy has put cybersecurity at the top of the agenda for senior management, the Board of Directors, and regulators. We help our clients to understand and quantify their cyber risk, prioritize investments, and embed security, privacy and resilience into every digitally-enabled initiative – from day one. Technology Risk: A practice that is a unique, industry-focused business unit that provides a broad range of integrated services where you’ll contribute technically to IT Risk and Assurance client engagements and internal projects. An important part of your role will be to actively establish, maintain and strengthen internal and external relationships. You’ll also identify potential business opportunities for EY within existing engagements and escalate these as appropriate. Similarly, you’ll anticipate and identify risks within engagements and share any issues with senior members of the team. 
Behavioral Competencies:
- Adaptive to the team and fosters a collaborative approach
- Innovative approach to the project, when required
- Shows passion and curiosity, a desire to learn, and can think digital
- Agile mindset and ability to multi-task
- Must have an eye for detail
Skills needed:
- Understanding and/or experience of software development best practices and the software development life cycle
- Understanding of one or more programming languages such as Java/.Net/Python, data analytics, or databases such as SQL/Oracle/Teradata
- An internship in a relevant technology domain will be an added advantage
Qualification:
- BE/B.Tech (IT/Computer Science/Circuit branches)
- Should have secured 60% and above
- No active backlogs

Posted 1 week ago

Apply

7.0 years

20 - 21 Lacs

Chennai, Tamil Nadu, India

On-site

🔹 Job Title: Senior Software Engineer 34258 📍 Location: Chennai (Hybrid) 🕒 Experience: 5–7 Years 💼 Employment Type: Full-Time Duration: 12 Months Work Type: Onsite 💰 Salary: Up to ₹21 LPA ✅ Mandatory Skills Strong experience in Java/J2EE with GCP (minimum 2+ years in GCP) GCP, CI/CD, Docker PostgreSQL or other relational databases 🛠️ Key Responsibilities Develop REST-based microservices using Java, Spring Boot, Spring Cloud, Spring MVC, and related technologies Work in Java/J2EE development environments (e.g., IntelliJ, Eclipse) Develop rich web frontends using JavaScript frameworks such as Angular or React Build, test, and deploy containerized applications using Docker/Kubernetes and CI/CD tools (e.g., Jenkins, Tekton, GitHub Actions) Work extensively with cloud environments, especially Google Cloud Platform (GCP) (preferred), Azure, and OpenShift Implement infrastructure as code using Terraform Perform SQL and NoSQL data manipulation using tools like PostgreSQL, SQL Server, BigQuery, and Teradata Integrate real-time data streaming tools like Kafka and MQTT Apply Extreme Programming (XP) practices including paired programming, TDD (Test Driven Development), and lean methodologies Participate in source code management using GIT Ensure code quality by following Clean Code, Agile, and DevOps best practices 🌟 Preferred Qualifications Experience with Test-First Development (TDD) and eXtreme Programming (XP) Familiarity with Spring Boot microservices architecture Exposure to DevOps and Agile product delivery environments 🎓 Education Required: Bachelor's Degree in Computer Science or related field Preferred: Master’s Degree Skills: nosql,cloud,spring,tdd,angular,spring cloud,postgresql,j2ee,java,docker,kubernetes,terraform,mqtt,javascript,git,sql,kafka,gcp,react,spring boot,spring mvc,ci/cd

Posted 1 week ago

Apply

1.0 - 2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About the role: Analyse complex datasets and make them consumable using visual storytelling and visualization tools such as reports and dashboards built using approved tools (Tableau, PyDash).
You will be responsible for following our Business Code of Conduct and always acting with integrity and due diligence, and have these specific risk responsibilities:
- Identifying operational improvements and finding solutions by applying CI tools and techniques
- Responsible for completing tasks and transactions within agreed KPIs
- Knows and applies fundamental work theories/concepts/processes in own areas of work
- Engage with market partners to understand problems to be solved, translate the business problems to analytical problems, take ownership of specified analysis and translate the answers back to decision makers in business
- Manipulating, analyzing and synthesizing large complex data sets using different sources and ensuring data quality and integrity
- Responsible for high-quality and timely completion of specified work deliverables
- Support automation of repeatable tasks, report generation or dashboard refresh (see the sketch after this listing)
- Think beyond the ask and develop analysis and reports that will contribute beyond basic asks
- Write code that is well detailed, structured, and compute efficient
- Contribute to development of knowledge assets and reusable modules on GitHub/Wiki
- Ability to generate practical insights that drive decisions in our business operations
- Understands business needs and has an in-depth understanding of Tesco processes
You will need:
- Basic understanding of business decisions
- Basic skills to develop visualizations, self-service dashboards and reports using Tableau, plus basic statistical concepts (correlation analysis and hypothesis testing)
- Basic skills to analyze data using Advanced Excel, SQL and Hive; basic DW concepts on Hadoop and Teradata; automation using Alteryx/Python is good to have
- 1-2 years' experience preferred in analytics delivery in any one of domains like retail, CPG, telecom or hospitality, and in one of the following functional areas: marketing, supply chain, customer, merchandising, operations, finance or digital
What's in it for you? At Tesco, we are committed to providing the best for you. As a result, our colleagues enjoy a unique, differentiated, market-competitive reward package, based on current industry practices, for all the work they put into serving our customers, communities and planet a little better every day. Our Tesco Rewards framework consists of pillars - Fixed Pay, Incentives, and Benefits. Total Rewards offered at Tesco are determined by four principles - simple, fair, competitive, and sustainable. Salary - Your fixed pay is the guaranteed pay as per your contract of employment. Performance Bonus - Opportunity to earn additional compensation bonus based on performance, paid annually. Leave & Time-off - Colleagues are entitled to 30 days of leave (18 days of Earned Leave, 12 days of Casual/Sick Leave) and 10 national and festival holidays, as per the company's policy. Making Retirement Tension-Free - In addition to statutory retirement benefits, Tesco enables colleagues to participate in voluntary programmes like NPS and VPF. Health is Wealth - Tesco promotes programmes that support a culture of health and wellness including insurance for colleagues and their family. Our medical insurance provides coverage for dependents including parents or in-laws.
Mental Wellbeing - We offer mental health support through self-help tools, community groups, ally networks, face-to-face counselling, and more for both colleagues and dependents. Financial Wellbeing - Through our financial literacy partner, we offer one-to-one financial coaching at discounted rates, as well as salary advances on earned wages upon request. Save As You Earn (SAYE) - Our SAYE programme allows colleagues to transition from being employees to Tesco shareholders through a structured 3-year savings plan. Physical Wellbeing - Our green campus promotes physical wellbeing with facilities that include a cricket pitch, football field, badminton and volleyball courts, along with indoor games, encouraging a healthier lifestyle. About Us Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers. With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers. Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 3,30,000 colleagues. Tesco Business Solutions: Established in 2017, Tesco Business Solutions (TBS) has evolved from a single entity traditional shared services in Bengaluru, India (from 2004) to a global, purpose-driven solutions-focused organisation. TBS is committed to driving scale at speed and delivering value to the Tesco Group through the power of decision science. With over 4,400 highly skilled colleagues globally, TBS supports markets and business units across four locations in the UK, India, Hungary, and the Republic of Ireland. The organisation underpins everything that the Tesco Group does, bringing innovation, a solutions mindset, and agility to its operations and support functions, building winning partnerships across the business. TBS's focus is on adding value and creating impactful outcomes that shape the future of the business. TBS creates a sustainable competitive advantage for the Tesco Group by becoming the partner of choice for talent, transformation, and value creation
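To make the "automation of repeatable tasks, report generation or dashboard refresh" responsibility above concrete, here is a small, hedged Python sketch that aggregates a raw sales extract into a dashboard-ready file. The file paths and column names are invented for illustration and are not Tesco's actual data model.

```python
# Illustrative report-refresh automation: aggregate a raw extract into a tidy summary
# that a Tableau workbook (or similar) can point at. Paths and columns are hypothetical.
import pandas as pd

def refresh_sales_summary(extract_path: str, output_path: str) -> pd.DataFrame:
    raw = pd.read_csv(extract_path, parse_dates=["transaction_date"])

    # Basic data-quality gate before publishing anything downstream.
    raw = raw.dropna(subset=["store_id", "category", "sales_value"])

    summary = (
        raw.assign(week=raw["transaction_date"].dt.to_period("W").dt.start_time)
           .groupby(["week", "store_id", "category"], as_index=False)
           .agg(total_sales=("sales_value", "sum"),
                transactions=("transaction_id", "nunique"))
    )
    summary.to_csv(output_path, index=False)  # the dashboard refreshes from this file
    return summary

if __name__ == "__main__":
    refresh_sales_summary("raw_sales_extract.csv", "weekly_sales_summary.csv")
```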

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

NTT DATA is looking for an Ab Initio developer to join the team in Hyderabad, Telangana (IN-TG), India. As part of the team, you will be responsible for working with ETL tools such as Spark and Ab Initio, databases including Hive, Teradata, and Oracle, and operating systems like UNIX. NTT DATA is a $30 billion global leader in business and technology services, serving 75% of the Fortune Global 100 companies. Committed to innovation and long-term success, NTT DATA provides consulting services, data and artificial intelligence solutions, industry-specific expertise, and application development, implementation, and management services. With a presence in over 50 countries and a diverse team of experts, NTT DATA is dedicated to helping clients optimize and transform their operations. As an Ab Initio developer at NTT DATA, you will be part of a dynamic and inclusive organization that values innovation and forward thinking. If you are passionate about technology and looking to grow your career with a global leader, apply now to join our team in Hyderabad and contribute to our mission of helping organizations navigate the digital future confidently and sustainably.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Panchkula, Haryana

On-site

We are seeking a skilled and experienced Lead/Senior ETL Engineer with 4-8 years of experience to join our dynamic data engineering team. As a Lead/Sr. ETL Engineer, you will play a crucial role in designing and developing high-performing ETL solutions, managing data pipelines, and ensuring seamless integration across systems. Your expertise in ETL tools, cloud platforms, scripting, and data modeling principles will be pivotal in building efficient, scalable, and reliable data solutions for enterprise-level implementations. Key Skills: - Proficiency in ETL tools such as SSIS, DataStage, Informatica, or Talend. - In-depth understanding of Data Warehousing concepts, including Data Marts, Star/Snowflake schemas, and Fact & Dimension tables. - Strong experience with relational databases like SQL Server, Oracle, Teradata, DB2, or MySQL. - Solid scripting/programming skills in Python. - Hands-on experience with cloud platforms like AWS or Azure. - Knowledge of middleware architecture and enterprise data integration strategies. - Familiarity with reporting/BI tools such as Tableau and Power BI. - Ability to write and review high and low-level design documents. - Excellent communication skills and the ability to work effectively with cross-cultural, distributed teams. Roles and Responsibilities: - Design and develop ETL workflows and data integration strategies. - Collaborate with cross-functional teams to deliver enterprise-grade middleware solutions. - Coach and mentor junior engineers to support skill development and performance. - Ensure timely delivery, escalate issues proactively, and manage QA and validation processes. - Participate in planning, estimations, and recruitment activities. - Work on multiple projects simultaneously, ensuring quality and consistency in delivery. - Experience in Sales and Marketing data domains. - Strong problem-solving abilities with a data-driven mindset. - Ability to work independently and collaboratively in a fast-paced environment. - Prior experience in global implementations and managing multi-location teams is a plus. If you are a passionate Lead/Sr. ETL Engineer looking to make a significant impact in a dynamic environment, we encourage you to apply for this exciting opportunity. Thank you for considering a career with us. We look forward to receiving your application! For further inquiries, please contact us at careers@grazitti.com. Location: Panchkula, India,
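As an illustration of the data-warehousing concepts listed above (fact and dimension tables in a star schema loaded by a scripted ETL step), here is a small, hedged Python sketch using the standard library's sqlite3 as a stand-in target. A real implementation would use one of the ETL tools and databases named in the posting, and every table and column name here is hypothetical.

```python
# Minimal star-schema load: stage a flat sales feed, upsert a dimension, insert facts.
# sqlite3 is used only as a stand-in warehouse; all names are hypothetical.
import csv
import sqlite3

DDL = """
CREATE TABLE IF NOT EXISTS dim_product (
    product_key  INTEGER PRIMARY KEY AUTOINCREMENT,
    product_code TEXT UNIQUE,
    product_name TEXT
);
CREATE TABLE IF NOT EXISTS fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    sale_date   TEXT,
    quantity    INTEGER,
    amount      REAL
);
"""

def load_feed(csv_path: str, db_path: str = "warehouse.db") -> None:
    con = sqlite3.connect(db_path)
    con.executescript(DDL)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Dimension upsert: insert the product only if we have not seen it before.
            con.execute(
                "INSERT OR IGNORE INTO dim_product (product_code, product_name) VALUES (?, ?)",
                (row["product_code"], row["product_name"]),
            )
            (product_key,) = con.execute(
                "SELECT product_key FROM dim_product WHERE product_code = ?",
                (row["product_code"],),
            ).fetchone()
            # Fact rows reference the surrogate key, never the natural key.
            con.execute(
                "INSERT INTO fact_sales (product_key, sale_date, quantity, amount) VALUES (?, ?, ?, ?)",
                (product_key, row["sale_date"], int(row["quantity"]), float(row["amount"])),
            )
    con.commit()
    con.close()
```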

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About VOIS VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group’s partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone. About VOIS India In 2009, VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations and HR Operations and more. Job Description 4 to 6 years of experience in Test Management across the full SDLC in Telco Billing, Online and Mediation environments. ISTQB foundation certification in Software testing (or equivalent) as a minimum Experience of testing Billing/Mediation, CRM and / or Online Media web / WAP portals in a Telco environment Experience of testing Unix Software with Oracle databases, Web applications, IBM Message-Queues, Teradata applications and any Middleware Understanding of the basic CRM journeys in Telecom domain for the customer. Thorough knowledge and understanding of Unix Familiar in working with the XMLs and its structuring Thorough knowledge and understanding of SQL. Must have hands on experience with basic and intermediate SQL Efficient in understanding and maintenance of the Perl Scripts Preferably should be familiar with the Amdocs/Customer Billing skills Troubleshooting skills for error handling Experience in Test management Tools like ALM and RQM. Knowledge of structured test methods and processes Experience of System, Integration and Regression testing of Telco systems with complex user interfaces. VOIS Equal Opportunity Employer Commitment India VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees’ growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. 
We have been also highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM and 14th Overall Best Workplaces in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we’ll be in touch!

Posted 1 week ago

Apply

2.0 - 5.0 years

12 - 13 Lacs

Noida

Work from Office

Department: Emergency Response / Trauma Care Coordination Location: Central Command Centre, NHAI HQ or Designated Regional Centre Job Type: Full-time / Contractual (based on project) Job Purpose: To provide medical expertise and support for managing trauma care coordination across the National Highways network. The role involves real-time monitoring, triage support, emergency coordination with ambulances and hospitals, and supporting the implementation of NHAI's trauma care response system. Key Responsibilities: Command Centre Operations: Monitor and manage real-time data from highway accident alert systems. Coordinate with ambulance networks, local hospitals, and traffic police for emergency response. Ensure appropriate triage and patient routing to nearest suitable medical facility. Medical Triage and Advisory: Provide initial medical triage over calls or software dashboard. Support ambulance staff or first responders with medical guidance, if required. Data & Incident Management: Maintain records of incidents, response times, patient status, and follow-up outcomes. Identify patterns in accident data and provide input for preventive strategies. Coordination & Liaison: Coordinate with state health departments, AIIMS trauma centers, district hospitals, and NHAI field staff. Support the implementation of Standard Operating Procedures (SOPs) for trauma response. Training & Capacity Building: Train and support call center executives and ambulance staff in basic trauma protocols. Assist in simulation drills and mock exercises Qualifications: Essential: MBBS degree from a recognized institution. Valid registration with Medical Council of India (MCI) or State Medical Council. Desirable: Experience in Emergency Medicine / Trauma Care / ICU. Certification in Basic Life Support (BLS) / Advanced Trauma Life Support (ATLS) preferred. Experience: Minimum 2–5 years of clinical experience, preferably in emergency services or trauma care settings. Experience working in a command/control center or telemedicine setup is an advantage.

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Madurai, Chennai

Work from Office

Dear Candidate, Greetings of the day!! I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn at https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/ or by email: kanthasanmugam.m@techmango.net
Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies. Its primary objective is delivering strategic solutions aligned with the technology goals of its business partners. We are a full-scale, leading software and mobile app development company. Techmango is driven by the mantra "Client's Vision is our Mission", and we firmly stand by this statement. Our aim is to be the technologically advanced and most loved organization, providing prime-quality, cost-efficient services with a long-term client relationship strategy. We are operational in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy).
Job Title: GCP Data Engineer
Location: Madurai
Experience: 5+ Years
Notice Period: Immediate
Job Summary: We are seeking a hands-on GCP Data Engineer with deep expertise in real-time streaming data architectures to help design, build, and optimize data pipelines in our Google Cloud Platform (GCP) environment. The ideal candidate will have strong architectural vision and be comfortable rolling up their sleeves to build scalable, low-latency streaming data pipelines using Pub/Sub, Dataflow (Apache Beam), and BigQuery.
Key Responsibilities:
- Architect and implement end-to-end streaming data solutions on GCP using Pub/Sub, Dataflow, and BigQuery (see the sketch below)
- Design real-time ingestion, enrichment, and transformation pipelines for high-volume event data
- Work closely with stakeholders to understand data requirements and translate them into scalable designs
- Optimize streaming pipeline performance, latency, and throughput
- Build and manage orchestration workflows using Cloud Composer (Airflow)
- Drive schema design, partitioning, and clustering strategies in BigQuery for both real-time and batch datasets
- Define SLAs, monitoring, logging, and alerting for streaming jobs using Cloud Monitoring, Error Reporting, and Stackdriver
- Experience with data modeling
- Ensure robust security, encryption, and access controls across all data layers
- Collaborate with DevOps on CI/CD automation of data workflows using Terraform, Cloud Build, and Git
- Document streaming architecture, data lineage, and deployment runbooks
Required Skills & Experience:
- 5+ years of experience in data engineering or architecture
- 3+ years of hands-on GCP data engineering experience
- Strong expertise in: Google Pub/Sub, Dataflow (Apache Beam), BigQuery (including streaming inserts), Cloud Composer (Airflow), Cloud Storage (GCS)
- Solid understanding of streaming design patterns, exactly-once delivery, and event-driven architecture
- Deep knowledge of SQL and NoSQL data modeling
- Hands-on experience with monitoring and performance tuning of streaming jobs
- Experience using Terraform or equivalent for infrastructure as code
- Familiarity with CI/CD pipelines for data workflows
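Here is a minimal, hedged sketch of the Pub/Sub to Dataflow (Apache Beam) to BigQuery pattern the responsibilities above describe. The project, subscription, table, and schema names are invented placeholders, and a real Dataflow deployment would add runner options (project, region, temp location), error handling, and dead-letter outputs.

```python
# Minimal streaming pipeline sketch: Pub/Sub -> parse -> fixed windows -> BigQuery.
# Project, subscription, table, and schema below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.window import FixedWindows

def run() -> None:
    options = PipelineOptions(streaming=True)  # add runner/project/region/temp_location for Dataflow

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WindowInto1Min" >> beam.WindowInto(FixedWindows(60))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,user_id:STRING,event_ts:TIMESTAMP,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED)
        )

if __name__ == "__main__":
    run()
```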

Posted 1 week ago

Apply

4.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description: We are committed to creating a workplace for the industry's best talent. The Smart Cube (a WNS company) is proud to be certified as a 'Great Place to Work' for the fifth year running. The Smart Cube is also recognized by Great Place to Work as one of India's Best Workplaces for Women 2021. The Smart Cube, a global provider of strategic research and analytics solutions, was rated a "Seasoned Vendor" in Analytics India Magazine's (AIM) 2022 Penetration and Maturity Quadrant report of Top Data Science Providers, amongst the leading analytics service providers based out of India. We are listed among the top 50 data science organizations, and The Smart Cube was shortlisted for two awards at the British Data Awards. Our clients include a third of the companies in the FTSE and Fortune 100, primarily in the CPG, Life Sciences, Energy, Chemicals, Industrials, Financial Services, Professional Services, and Retail sectors.
Roles and responsibilities - specifically, Assistant Managers should:
- Understand the client objectives, and work with the Project Lead (PL) to design the analytical solution/framework
- Be able to translate the client objectives/analytical plan into clear deliverables with associated priorities and constraints
- Organize/prepare/manage data and conduct quality checks to ensure that the analysis dataset is ready
- Explore and implement various statistical and analytical techniques (including machine learning) like linear/non-linear regression, decision trees, segmentation and time series forecasting, as well as machine learning algorithms like Random Forest, SVM, ANN, etc. (see the sketch after this listing)
- Conduct sanity checks of the analysis output based on reasoning and common sense, and be able to do a rigorous self QC, as well as QC of the work assigned to junior analysts, to ensure an error-free output
- Interpret the output in the context of the client's business and industry to identify trends and actionable insights
- Be able to take client calls relatively independently, and interact with onsite leads (if applicable) on a daily basis
- Discuss queries/certain sections of the deliverable report over client calls or video conferences
- Oversee the entire project lifecycle, from initiation to closure, ensuring timely and within-budget delivery
- Collaborate with stakeholders to gather and refine business requirements, translating them into technical specifications
- Manage a team of data analysts and developers, providing guidance, mentorship, and performance evaluations
- Ensure data integrity and accuracy through rigorous data validation and quality checks
- Facilitate effective communication between technical teams and business stakeholders to align project goals and expectations
- Drive continuous improvement initiatives to enhance data analytics processes and methodologies
- Act as a project lead, coordinating cross-functional teams and managing project timelines and deliverables
Client Management:
- Act as client lead and maintain the client relationship; make independent key decisions related to client management
- Be a part of deliverable discussions with clients over telephone calls, and guide the project team on the next steps and way forward
Technical Requirements:
- Knowledge of how to connect databases (e.g. Snowflake, SQL databases) with Knime, along with SQL concepts like types of joins/unions of data
- Read data from a DB and write it back to a database
- Working knowledge of macros to avoid repetition of tasks, and of enabling schedulers to run workflow(s)
- Design and develop ETL workflows and datasets in Knime to be used by the BI reporting tool
- Perform end-to-end data validation and prepare technical specifications and documentation for Knime workflows supporting BI reports
- Develop and maintain interactive dashboards and reports using PowerBI to support data-driven decision-making
- Lead and manage data analytics projects utilizing PowerBI, Python, and SQL to guide and deliver actionable business insights
- Be able to succinctly visualize the findings through a PPT or a BI dashboard (Tableau, QlikView, etc.) and highlight the key takeaways from a business perspective
Ideal Candidate:
- 4-7 years of relevant advanced analytics experience in Marketing, CRM or Pricing in either the Retail or CPG industries; other B2C domains can be considered
- Experience in managing, cleaning and analyzing large datasets using tools like Python, R or SAS
- Experience in using multiple advanced analytics techniques or machine learning algorithms
- Experience in handling client calls and working independently with clients
- Understanding of consumer businesses such as Retail, CPG or Telecom
- Knowledge of working across multiple data types and files like flat files and RDBMS files; Knime workflows, Knime Server, and multiple data platforms (SQL Server, Teradata, Hadoop, Spark); on premise or on the cloud
- Basic knowledge of advanced statistical techniques like decision trees, different types of regressions, clustering, forecasting (ARIMA/X), ML, etc.
Other Skills:
- Excellent communication skills (both written and oral)
- Ability to create client-ready deliverables in Excel and PowerPoint
- Optimization techniques (linear, non-linear) and knowledge of supply chain
- VBA, Excel Macro programming, Tableau, QlikView
Education:
- Engineers from top-tier institutes (IITs, DCE/NSIT, NITs) or Post Graduates in Maths/Statistics/OR from top-tier colleges/universities
- MBA from top-tier B-schools
If interested, please share your updated CV at kiran.meghani@wns.com or apply at https://smrtr.io/sz4-S. Looking for immediate or early joiners.
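To ground the modelling techniques mentioned in this listing (regression, Random Forest) in code, here is a small, hedged scikit-learn sketch on synthetic data. The feature set ("spend on three channels") and all numbers are invented for illustration and do not reflect any client dataset.

```python
# Illustrative model-fitting sketch: compare a linear regression and a random forest
# on synthetic "marketing spend vs. sales" data, with a simple hold-out sanity check.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500
X = rng.uniform(0, 100, size=(n, 3))                 # hypothetical spend on three channels
y = 2.5 * X[:, 0] + 1.2 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0, 10, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

for name, model in [("linear", LinearRegression()),
                    ("random_forest", RandomForestRegressor(n_estimators=200, random_state=0))]:
    model.fit(X_train, y_train)
    print(name, "hold-out R^2:", round(r2_score(y_test, model.predict(X_test)), 3))
```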

Posted 1 week ago

Apply

13.0 - 20.0 years

35 - 70 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office

Required Skills and Experience:
- 13+ years of overall experience is a must, with 7+ years of relevant experience working on Big Data Platform technologies.
- Proven technical experience with Cloudera, Teradata, Databricks, MS Data Fabric, Apache Hadoop, BigQuery, and AWS Big Data solutions (EMR, Redshift, Kinesis, Qlik).
- Good domain experience in the BFSI or Manufacturing area.
- Excellent communication skills to engage with clients and influence decisions.
- High level of competence in preparing architectural documentation and presentations.
- Must be organized, self-sufficient and able to manage multiple initiatives simultaneously.
- Must have the ability to coordinate with other teams independently.
- Work with both internal and external stakeholders to identify business requirements and develop solutions to meet those requirements / build the opportunity.
Note: If you have experience in the BFSI domain, the location will be Mumbai only. If you have experience in the Manufacturing domain, the location will be Mumbai or Bangalore only.
Interested candidates can share their updated resumes at shradha.madali@sdnaglobal.com

Posted 1 week ago

Apply

6.0 years

0 Lacs

Delhi, India

On-site

Job Title: Lead Azure Data Engineer
Experience Level: Mid to Senior
Location: Delhi
Duration: Full-time
Experience Required: 6-8+ years
Description: We are seeking a highly skilled and experienced Lead Azure Data Engineer to join our team. The ideal candidate will have a strong background in data engineering, with a focus on Databricks, PySpark, Scala-Spark, and advanced SQL. This role requires hands-on experience in implementing or migrating projects to Unity Catalog, optimizing performance on Databricks Spark, and orchestrating workflows using various tools.
Must-Have Skills: MS Fabric, ADF (Azure Data Factory), Azure Synapse
Key Responsibilities:
- Data engineering and analytics project delivery experience of at least 6 years
- At least 2 past Databricks migration projects (e.g. Hadoop to Databricks, Teradata to Databricks, Oracle to Databricks, Talend to Databricks)
- Hands-on with advanced SQL and PySpark and/or Scala-Spark
- At least 3 past Databricks projects that included performance optimization work (see the sketch below)
- Design, develop, and optimize data pipelines and ETL processes using Databricks and Apache Spark
- Implement and optimize performance on Databricks Spark, ensuring efficient data processing and management
- Develop and validate data formulation and data delivery for Big Data projects
- Collaborate with cross-functional teams to define, design, and implement data solutions that meet business requirements
- Conduct performance tuning and optimization of complex queries and data models
- Manage and orchestrate data workflows using tools such as Databricks Workflows, Azure Data Factory (ADF), Apache Airflow, and/or AWS Glue
- Maintain and ensure data security, quality, and governance throughout the data lifecycle
Technical Skills:
- Extensive experience with PySpark and Scala-Spark
- Advanced SQL skills for complex data manipulation and querying
- Proven experience in performance optimization on Databricks Spark across at least three projects
- Hands-on experience with data formulation and data delivery validation in Big Data projects
- Experience in data orchestration using at least two of the following: Databricks Workflows, Azure Data Factory (ADF), Apache Airflow, AWS Glue
- Experience with Azure Synapse
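A hedged illustration of the kind of Databricks/Spark performance work referenced above: broadcasting a small dimension into a join, caching a reused DataFrame, and controlling output partitioning. The table paths and column names are placeholders, and on Databricks the inputs would typically be Unity Catalog tables rather than raw Parquet paths.

```python
# Illustrative Spark tuning sketch: broadcast join a small dimension, cache a reused
# DataFrame, and control output partitioning. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("perf-tuning-sketch").getOrCreate()

facts = spark.read.parquet("/mnt/raw/transactions")         # large fact data (placeholder path)
dim_customer = spark.read.parquet("/mnt/raw/dim_customer")  # small dimension (placeholder path)

# Broadcast the small dimension so the join avoids a full shuffle of the fact table.
enriched = facts.join(F.broadcast(dim_customer), on="customer_id", how="left")

# Cache only because the frame is reused by two downstream aggregations below.
enriched.cache()

by_segment = enriched.groupBy("customer_segment").agg(F.sum("amount").alias("total_amount"))
by_day = enriched.groupBy(F.to_date("txn_ts").alias("txn_date")).agg(
    F.count(F.lit(1)).alias("txn_count"))

# Repartition by the column used downstream, then write partitioned output.
(enriched
 .repartition("txn_country")
 .write.mode("overwrite")
 .partitionBy("txn_country")
 .parquet("/mnt/curated/transactions_enriched"))

by_segment.show()
by_day.show()
enriched.unpersist()
```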

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Lowe’s is a FORTUNE® 100 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe’s operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe’s supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts and providing disaster relief to communities in need. For more information, visit Lowes.com.
Lowe’s India, the Global Capability Center of Lowe’s Companies Inc., is a hub for driving our technology, business, analytics, and shared services strategy. Based in Bengaluru with over 4,500 associates, it powers innovations across omnichannel retail, AI/ML, enterprise architecture, supply chain, and customer experience. From supporting and launching homegrown solutions to fostering innovation through its Catalyze platform, Lowe’s India plays a pivotal role in transforming home improvement retail while upholding a strong commitment to social impact and sustainability. For more information, visit Lowes India.
About The Team
The Financial Decision Support & Analytics (FDS&A) team is responsible for providing analyses, insights and reporting that help identify drivers of performance and guide appropriate tactical actions to improve financial outcomes. The team partners with Retail Finance teams in the US for functions such as Global FP&A, Store Operations, Merchandising and Supply Chain. We are seeking a dynamic and passionate Senior Analyst to support the Credit FP&A team within FDS&A. This role partners directly with Credit FP&A leaders on complex initiatives, including measurement of credit promotions and analysis that delivers insights on initiative performance. By working with functional partners, the role enables business partners to make informed decisions based on the root-cause drivers of performance. The successful candidate will be a creative thinker with a strong bias to action who can collaborate with our US partners. This role requires a self-starter with strong analytical and problem-solving skills and a proven track record of influencing and supporting decisions in a fast-paced, high-energy environment.
Key Responsibilities
Execute performance measurement frameworks for various credit promotions with minimal supervision
Proactively explore new ways to analyze enterprise credit initiatives by understanding business objectives
Quickly grasp the business problem and translate it into tangible, actionable outputs
Design, develop and deliver insightful measurement reporting and dashboards through automation
Perform function-specific deep dives to understand root causes
Provide tactical support for key executive meetings
Work with large, complex and disparate data sources to draw meaningful inferences and insights
Required Qualifications
Bachelor's/Master's in Business, Finance, Science, Engineering or related quantitative fields
Hands-on experience in ideating and executing analysis
4-6 years of prior experience in a finance or analytical role, preferably with 1-2 years in Retail
Advanced knowledge of Excel (Power Pivots, Power Query, VBA) and PowerPoint
Exceptional analytical and critical thinking skills; highly detail oriented
Experience with business intelligence and visualization tools (MicroStrategy, Teradata, PowerBI, TM1)
Strong data collection, collation and cleansing skills using tools like SQL, Python and BigQuery
Demonstrated ability to work independently in a fast-paced environment and manage multiple competing priorities
Excellent communication skills, with the ability to communicate effectively upward, with peers and with offshore teams
Lowe's is an equal opportunity employer and administers all personnel practices without regard to race, color, religious creed, sex, gender, age, ancestry, national origin, mental or physical disability or medical condition, sexual orientation, gender identity or expression, marital status, military or veteran status, genetic information, or any other category protected under federal, state, or local law. Starting rate of pay may vary based on factors including, but not limited to, position offered, location, education, training, and/or experience. For information regarding our benefit programs and eligibility, please visit https://talent.lowes.com/us/en/benefits.
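For the data collection and promotion-measurement work this listing describes, a small, hedged Python/pandas sketch of a simple pre/post promotion comparison is shown below; the DataFrame, column names and promotion date are entirely hypothetical, and in practice the data would be pulled via SQL or BigQuery.

    import pandas as pd

    # Hypothetical transaction-level data; in practice this would come from SQL/BigQuery.
    txns = pd.DataFrame({
        "txn_date": pd.to_datetime(["2024-03-01", "2024-03-02", "2024-04-01", "2024-04-02"]),
        "tender_type": ["store_card", "other", "store_card", "other"],
        "sales": [120.0, 80.0, 150.0, 90.0],
    })

    promo_start = pd.Timestamp("2024-04-01")  # assumed promotion launch date

    # Simple pre/post comparison of store-card penetration as a proxy for promo lift.
    txns["period"] = (txns["txn_date"] >= promo_start).map({True: "post", False: "pre"})
    summary = txns.pivot_table(index="period", columns="tender_type",
                               values="sales", aggfunc="sum", fill_value=0.0)
    summary["card_share"] = summary["store_card"] / summary.sum(axis=1)

    print(summary[["card_share"]])  # compare pre vs post share to estimate lift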

Posted 1 week ago

Apply

1.0 - 6.0 years

15 - 17 Lacs

Pune

Work from Office

Join us as a Data Engineer at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. As part of the team, you will deliver the technology stack, using strong analytical and problem-solving skills to understand the business requirements and deliver quality solutions. You'll be working on complex technical problems that involve detailed analysis, in conjunction with fellow engineers, business analysts and business stakeholders.
To be successful as a Data Engineer you should have experience with:
Strong experience with ETL tools such as Ab Initio, Glue, PySpark, Python, DBT, Databricks and the required AWS services/products
Advanced SQL knowledge across multiple database platforms (Teradata, Hadoop, SQL, etc.)
Experience with data warehousing concepts and dimensional modeling
Proficiency in scripting languages (Python, Perl, Shell scripting) for automation
Knowledge of big data technologies (Hadoop, Spark, Hive) is highly desirable
Bachelor's degree in Computer Science, Information Systems, or a related field
Experience in ETL development and data integration
Proven track record of implementing complex ETL solutions in enterprise environments
Experience with data quality monitoring and implementing data governance practices
Knowledge of cloud data platforms (AWS, Azure, GCP) and their ETL services
Some other highly valued skills include:
Strong analytical and problem-solving skills
Ability to work with large and complex datasets
Excellent documentation skills
Attention to detail and commitment to data quality
Ability to work independently and as part of a team
Strong communication skills to explain technical concepts to non-technical stakeholders
You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based in Pune.
Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, ensuring that all data is accurate, accessible, and secure.
Accountabilities
Build and maintain data architectures and pipelines that enable the transfer and processing of durable, complete and consistent data
Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures
Develop processing and analysis algorithms fit for the intended data complexity and volumes
Collaborate with data scientists to build and deploy machine learning models
Analyst Expectations
Perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. The role requires in-depth technical knowledge and experience in the assigned area of expertise and a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources.
If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. For an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate, and will have an impact on the work of related teams within the area. They partner with other functions and business areas and take responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge and Drive, the operating manual for how we behave.
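The Barclays listing above pairs ETL development with data quality monitoring. As a hedged, generic illustration (not Barclays' actual tooling), here is a minimal PySpark sketch of rule-based data-quality checks on a staged extract; the S3 path, columns and rules are hypothetical.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("dq-checks-sketch").getOrCreate()

    # Hypothetical staged extract awaiting load into the warehouse.
    df = spark.read.parquet("s3://example-bucket/staging/transactions/")

    total = df.count()
    checks = {
        # Primary-key column should never be null.
        "null_txn_id": df.filter(F.col("txn_id").isNull()).count(),
        # Amounts should be non-negative.
        "negative_amount": df.filter(F.col("amount") < 0).count(),
        # Duplicate business keys usually indicate an upstream replay.
        "duplicate_txn_id": total - df.select("txn_id").distinct().count(),
    }

    failed = {name: n for name, n in checks.items() if n > 0}
    if failed:
        # In a real pipeline this might quarantine records or alert an operator instead.
        raise ValueError(f"Data quality checks failed on {total} rows: {failed}")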

Posted 1 week ago

Apply

5.0 years

4 - 7 Lacs

Hyderābād

On-site

Welcome to Warner Bros. Discovery… the stuff dreams are made of.
Who We Are…
When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what’s next… From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive.
Your New Role:
The WBD Integration team is seeking a Senior Integration Developer who will be responsible for providing technical expertise for supporting and enhancing the Integration suite (Informatica PowerCenter, IICS). We are specifically looking for a candidate with solid technical skills and experience in integrating ERP applications and SaaS/PaaS platforms such as SAP, Salesforce and Workday, and data warehouses such as Teradata, Snowflake, and Redshift. Experience with the Informatica cloud platform would be ideal for this position. The candidate's primary job functions include, but are not limited to, the day-to-day configuration and development of the Informatica platform. The candidate must possess strong communication and analytical skills to effectively work with peers within the Enterprise Technology Group, various external partners/vendors, and business users to determine requirements and translate them into technical solutions. The candidate must have the ability to independently complete individual tasks in a dynamic environment to achieve departmental and company goals.
Qualifications & Experiences:
Leads the daily activities of a work group/team. Typically leads complex professional undertakings or teams. May be assigned to new work groups or teams. Interacts with peers and other internal and external stakeholders regularly. Demonstrates advanced proficiency in the full range of skills required to perform the role. Acts as a mentor.
Work on POCs with new connectors and work closely with the Network team.
Perform peer reviews of objects developed by other developers on the team as needed.
Work with Business and QA teams during various phases of deployment, i.e., requirements, QAST and SIT phases.
Report any functional gaps in the existing application, suggest business process improvements, and provide support for bug fixes and reported issues.
Coordinate activities between the different LOBs/teams.
Translate conceptual system requirements into technical data and integration requirements.
Proficiency in using Informatica Cloud application and data integrations is a must.
Proficient in developing custom APIs to handle bulk volumes, pagination, etc.
Design, develop, and implement integration solutions using Informatica Intelligent Cloud Services.
Configure data mappings, transformations, and workflows to ensure data consistency and accuracy.
Develop and maintain APIs and connectors to integrate with various data sources and applications.
Prepare data flow diagrams and/or process models.
Strong knowledge of integration protocols and technologies (e.g., REST, SOAP, JSON, XML).
Perform unit testing and debugging of applications to ensure the quality of the delivered requirements and the overall health of the system.
Develop standards and processes to support and facilitate integration projects and initiatives.
Educate other team members and govern tool usage.
Participate in research and make recommendations on integration products and services.
Monitor integration processes and proactively identify and resolve performance or data quality issues.
Provide ongoing maintenance and support for integration solutions. Perform regular updates and upgrades to keep integrations current.
Proficiency in Informatica Intelligent Cloud Services.
Experience with cloud platforms (e.g., AWS, Azure, Google Cloud).
Excellent problem-solving and troubleshooting skills.
Strong communication and teamwork skills.
Qualifications & Experiences:
5+ years' developer experience in Informatica IICS Application Integration and Data Integration.
Experience in PowerCenter along with IICS and complete knowledge of the SDLC process.
Experience in API development, including best practices, testing methods and deployment strategies.
Experience in designing, creating, refining, deploying, and managing the organization's data architecture, including the end-to-end vision for how data will flow from system to system, for multiple applications and across different territories.
Expertise with tools such as SOA, ETL, ERP, XML, etc.
Understanding of Python, AWS Redshift, Snowflake and relational databases.
Knowledge of UNIX shell scripts; should be able to write and debug shell scripts.
Ability to work well within an agile team environment and apply related working methods.
Able to analyze and understand complex customer scenarios, and thrives on difficult challenges.
Team player and multitasker with excellent communication skills (conveys highly technical information in business terms, writes clear emails) and the ability to mentor team members.
Preferred Qualifications:
Informatica certification in Informatica Intelligent Cloud Services.
Experience with other integration tools and middleware.
Knowledge of data governance and data quality best practices.
Not required but preferred: public speaking and presentation skills.
How We Get Things Done…
This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview.
Championing Inclusion at WBD
Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
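The WBD role calls for custom APIs that handle bulk volumes and pagination. As a hedged illustration only, here is a small Python sketch of a cursor-paginated REST extraction loop; the endpoint, authentication and pagination contract are all assumptions, not any vendor's actual API.

    import requests

    BASE_URL = "https://api.example.com/v1/records"  # hypothetical endpoint
    HEADERS = {"Authorization": "Bearer <token>"}    # placeholder credential

    def fetch_all(page_size: int = 500):
        """Pull every page from a cursor-paginated API and yield individual records."""
        params = {"limit": page_size}
        while True:
            resp = requests.get(BASE_URL, headers=HEADERS, params=params, timeout=30)
            resp.raise_for_status()
            payload = resp.json()
            yield from payload.get("items", [])
            cursor = payload.get("next_cursor")  # assumed pagination contract
            if not cursor:
                break
            params["cursor"] = cursor

    if __name__ == "__main__":
        records = list(fetch_all())
        print(f"Fetched {len(records)} records for downstream integration")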

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderābād

On-site

Design and build ETL solutions, with experience in data engineering and large-scale data modelling in both batch and real-time environments.
Skills required: Python, PySpark, Apache Spark, Unix shell scripting, GCP, BigQuery, MongoDB, Kafka event streaming, API development, CI/CD.
Additional skills: Teradata, SQL Server, Ab Initio.
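Since this listing pairs Kafka event streaming with GCP/BigQuery, a hedged Python sketch of a consumer that micro-batches events into a BigQuery table may help illustrate the stack; the topic, table and batch size are hypothetical, and it assumes the kafka-python and google-cloud-bigquery client libraries.

    import json
    from kafka import KafkaConsumer          # kafka-python
    from google.cloud import bigquery        # google-cloud-bigquery

    consumer = KafkaConsumer(
        "orders-events",                      # hypothetical topic
        bootstrap_servers=["localhost:9092"],
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        auto_offset_reset="earliest",
    )

    bq = bigquery.Client()
    table_id = "my-project.analytics.orders_events"  # hypothetical table

    batch = []
    for message in consumer:
        batch.append(message.value)            # each value is already a dict
        if len(batch) >= 100:                  # small micro-batch for illustration
            errors = bq.insert_rows_json(table_id, batch)  # streaming insert
            if errors:
                print("BigQuery insert errors:", errors)
            batch.clear()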

Posted 1 week ago

Apply

0 years

1 - 9 Lacs

Gurgaon

On-site

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
Design and test DataStage jobs
Write complex SQL queries; Teradata experience is advantageous
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
Required Qualifications:
Experience with Spark, Scala, and Kafka is beneficial
Experience with data governance and security protocols
Solid knowledge of Unix, Shell, Perl and Python scripting
Familiarity with Airflow for scheduling
Understanding of DevOps, Agile methodologies, and CI/CD pipelines
Knowledge of testing and automation processes
Preferred Qualification:
Experience in Azure and other cloud technologies along with Snowflake and Databricks
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
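The listing highlights complex SQL, with Teradata experience an advantage. As a hedged sketch, here is a Teradata-style deduplication query (QUALIFY with ROW_NUMBER) issued from Python; the table, columns and connection details are hypothetical, and it assumes the teradatasql driver.

    import teradatasql  # assumes the Teradata SQL Driver for Python is installed

    # Teradata-style dedup: keep the latest row per claim using QUALIFY + ROW_NUMBER().
    DEDUP_SQL = """
    SELECT claim_id, member_id, claim_amt, load_ts
    FROM   staging.claims              -- hypothetical staging table
    QUALIFY ROW_NUMBER() OVER (
              PARTITION BY claim_id
              ORDER BY load_ts DESC) = 1
    """

    with teradatasql.connect(host="tdhost.example.com",  # placeholder connection details
                             user="etl_user",
                             password="<secret>") as con:
        cur = con.cursor()
        cur.execute(DEDUP_SQL)
        for row in cur.fetchmany(10):   # peek at a few deduplicated rows
            print(row)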

Posted 1 week ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Job Description:
KPI Partners is seeking an experienced Senior Snowflake Administrator to join our dynamic team. In this role, you will be responsible for managing and optimizing our Snowflake environment to ensure performance, reliability, and scalability. Your expertise will contribute to designing and implementing best practices to facilitate efficient data warehousing solutions.
Key Responsibilities:
- Administer and manage the Snowflake platform, ensuring optimal performance and security.
- Monitor system performance, troubleshoot issues, and implement necessary solutions.
- Collaborate with data architects and engineers to design data models and optimal ETL processes.
- Conduct regular backups and recovery procedures to protect data integrity.
- Implement user access controls and security measures to safeguard data.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Participate in the planning and execution of data migration to Snowflake.
- Provide support for data governance and compliance initiatives.
- Stay updated with Snowflake features and best practices, and provide recommendations for continuous improvement.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in database administration, with a strong focus on Snowflake.
- Hands-on experience with SnowSQL, SQL, and data modeling.
- Familiarity with data ingestion tools and ETL processes.
- Strong problem-solving skills and the ability to work independently.
- Excellent communication skills and the ability to collaborate with technical and non-technical stakeholders.
- Relevant certifications in Snowflake or cloud data warehousing are a plus.
If you are a proactive, detail-oriented professional with a passion for data and experience in Snowflake administration, we would love to hear from you. Join KPI Partners and be part of a team that is dedicated to delivering exceptional data solutions for our clients.
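The administrator duties above centre on performance, cost control and user access. A hedged sketch of routine statements issued through the Snowflake Python connector is shown below; the account, warehouse, database and role names are placeholders, and it assumes the snowflake-connector-python package.

    import snowflake.connector  # snowflake-connector-python

    ADMIN_STATEMENTS = [
        # Right-size a warehouse and let it suspend when idle to control cost.
        "ALTER WAREHOUSE reporting_wh SET WAREHOUSE_SIZE = 'MEDIUM'",
        "ALTER WAREHOUSE reporting_wh SET AUTO_SUSPEND = 300",
        # Basic role-based access control for a read-only analyst role.
        "CREATE ROLE IF NOT EXISTS analyst_ro",
        "GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro",
        "GRANT USAGE ON SCHEMA analytics.public TO ROLE analyst_ro",
        "GRANT SELECT ON ALL TABLES IN SCHEMA analytics.public TO ROLE analyst_ro",
    ]

    conn = snowflake.connector.connect(
        account="xy12345",        # placeholder account locator
        user="admin_user",
        password="<secret>",
        role="ACCOUNTADMIN",      # placeholder; a narrower admin role is preferable in practice
    )
    try:
        cur = conn.cursor()
        for stmt in ADMIN_STATEMENTS:
            cur.execute(stmt)
    finally:
        conn.close()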

Posted 1 week ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka

On-site

- 2+ years of experience processing data with a massively parallel technology (such as Redshift, Teradata, Netezza, Spark or a Hadoop-based big data solution)
- 2+ years of experience with relational database technology (such as Redshift, Oracle, MySQL or MS SQL)
- 2+ years of experience developing and operating large-scale data structures for business intelligence analytics (using ETL/ELT processes)
- 5+ years of data engineering experience
- Experience managing a data or BI team
- Experience communicating to senior management and customers verbally and in writing
- Experience leading and influencing the data or BI strategy of your team or organization
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS
As a Data Engineering Manager, you will lead a team of data engineers, front-end engineers and business intelligence engineers. You will own our internal data products (Yoda), transform them to AI, build agents and scale them for IN and emerging stores. You will provide technical leadership, drive application and data engineering initiatives and build end-to-end data solutions that are highly available, scalable, stable, secure, and cost-effective. You strive for simplicity and demonstrate creativity with sound judgement. You deliver data and reporting solutions that are customer focused, easy to consume and create business impact. You are passionate about working with huge datasets and have experience with the organization and curation of data for analytics. You have a strategic and long-term view on the architecture of advanced data ecosystems. You are experienced in building efficient and scalable data services and have the ability to integrate data systems with AWS tools and services to support a variety of customer use cases/applications.
Key job responsibilities
• Lead a team of data engineers, front-end engineers and business intelligence engineers to deliver cross-functional data and application engineering projects for Databases, Analytics and AI/ML services
• Establish and clearly communicate organizational vision, goals and success measures
• Collaborate with business stakeholders to develop the roadmap and product requirements
• Build, own, prioritize, lead and deliver a roadmap of large and complex multi-functional projects and programs
• Manage AWS infrastructure, IMR cost and RDS/Dynamo instances
• Interface with other technology teams to extract, transform, and load data from a wide variety of data sources
• Own the design, development, and maintenance of metrics, reports, dashboards, etc. to drive key business decisions
About the team
CoBRA is the Central BI Reporting and Analytics org for IN stores and the AI partner for International emerging stores. CoBRA's mission is to empower Category and Seller orgs, including Brand, Account, Marketing and Product/Program teams, with self-service products using AI (Yoda and Bedrock agents), build actionable insights (QuickSight Q, custom agents, Q Business) and help them make faster and smarter decisions using science solutions across the Amazon flywheel on all inputs (Selection, Pricing and Speed).
Experience with big data technologies such as Hadoop, Hive, Spark and EMR
Experience with AWS tools and technologies (Redshift, S3, EC2)
Knowledge of building AI tools, AWS Bedrock agents, and LLM/foundation models
Experience in supporting ML models for data needs
Exposure to prompt engineering and upcoming AI technologies and their landscape
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
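For the Redshift/S3 loading work referenced in this listing, a hedged Python sketch using boto3's Redshift Data API to run a COPY from S3 is shown below; the cluster, IAM role, bucket and table names are placeholders, purely for illustration.

    import time
    import boto3

    client = boto3.client("redshift-data", region_name="ap-south-1")

    COPY_SQL = """
    COPY analytics.daily_orders                         -- hypothetical target table
    FROM 's3://example-bucket/exports/daily_orders/'    -- hypothetical S3 prefix
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
    FORMAT AS PARQUET
    """

    resp = client.execute_statement(
        ClusterIdentifier="analytics-cluster",  # placeholder cluster
        Database="dev",
        DbUser="etl_user",
        Sql=COPY_SQL,
    )

    # Poll until the statement finishes; real pipelines would typically orchestrate this
    # with Airflow or Step Functions instead of a blocking loop.
    while True:
        status = client.describe_statement(Id=resp["Id"])["Status"]
        if status in ("FINISHED", "FAILED", "ABORTED"):
            print("COPY status:", status)
            break
        time.sleep(5)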

Posted 1 week ago

Apply

10.0 - 15.0 years

20 - 35 Lacs

Noida, Bengaluru

Work from Office

Description:
We are looking for a Python Developer with working knowledge of ETL workflows. Experience in data extraction using APIs and writing queries in PostgreSQL is mandatory.
Requirements and Responsibilities:
A Python developer with strong experience in Python programming and problem solving
Good command of data structures and their implementation
Good command of databases, i.e., relational databases and SQL
Proficient in requirements analysis and implementation
A degree in Computer Science
Good communication, prioritization and organization skills
Keen on learning and upskilling
What We Offer:
Exciting Projects: We focus on industries like High-Tech, communication, media, healthcare, retail and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), a stress management program, professional certifications, and technical and soft-skill training.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidised rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks and a GL Club where you can have coffee or tea with colleagues over a game, along with discounts at popular stores and restaurants!
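To illustrate the mandatory skills in this listing, API-based data extraction and PostgreSQL, here is a hedged Python sketch that pulls records from a REST endpoint and upserts them with psycopg2; the endpoint, table and columns are hypothetical.

    import requests
    import psycopg2
    from psycopg2.extras import execute_values

    # Hypothetical REST endpoint exposing product records as JSON.
    rows = requests.get("https://api.example.com/products", timeout=30).json()

    conn = psycopg2.connect(host="localhost", dbname="inventory",
                            user="etl_user", password="<secret>")  # placeholder DSN
    with conn, conn.cursor() as cur:
        # Idempotent load: insert new products, update prices for existing ones.
        execute_values(
            cur,
            """
            INSERT INTO products (product_id, name, price)
            VALUES %s
            ON CONFLICT (product_id) DO UPDATE SET price = EXCLUDED.price
            """,
            [(r["id"], r["name"], r["price"]) for r in rows],
        )
    conn.close()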

Posted 1 week ago

Apply

6.0 - 11.0 years

40 - 45 Lacs

Hyderabad

Work from Office

We are seeking a Reference Data Management Senior Analyst who, as a member of the Reference Data Product team in the Enterprise Data Management organization, will be responsible for managing and promoting the use of reference data, partnering with business Subject Matter Experts on the creation of vocabularies/taxonomies and ontologies, and developing analytic solutions using semantic technologies.
Roles & Responsibilities:
Work with the Reference Data Product Owner, external resources and other engineers as part of the product team
Develop and maintain semantically appropriate concepts
Identify and address conceptual gaps in both content and taxonomy
Maintain ontology source vocabularies for new or edited codes
Support product teams to help them leverage taxonomic solutions
Analyze data from public and internal datasets
Develop a data model/schema for taxonomies
Create taxonomies in the Semaphore Ontology Editor
Bulk-import data templates into Semaphore to add or update terms in taxonomies
Prepare SPARQL queries to generate ad hoc reports
Perform gap analysis on current and updated data
Maintain taxonomies in Semaphore through the change management process
Develop and optimize automated data ingestion pipelines in Python/PySpark when APIs are available
Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
Identify and resolve complex data-related challenges
Participate in sprint planning meetings and provide estimates on technical implementation
Basic Qualifications and Experience:
Master's degree with 6 years of experience in Business, Engineering, IT or a related field OR Bachelor's degree with 8 years of experience in Business, Engineering, IT or a related field OR Diploma with 9+ years of experience in Business, Engineering, IT or a related field
Functional Skills:
Must-Have Skills:
Knowledge of controlled vocabularies, classification, ontology and taxonomy
Experience in ontology development using Semaphore or a similar tool
Hands-on experience writing SPARQL queries on graph data
Excellent problem-solving skills and the ability to work with large, complex datasets
Understanding of data modeling, data warehousing, and data integration concepts
Good-to-Have Skills:
Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.)
Experience using cloud services such as AWS, Azure or GCP
Experience working in a product team environment
Knowledge of Python/R, Databricks, and cloud data platforms
Knowledge of NLP (Natural Language Processing) and AI (Artificial Intelligence) for extracting and standardizing controlled vocabularies
Strong understanding of data governance frameworks, tools, and best practices
Professional Certifications:
Databricks certification preferred
SAFe Practitioner certification preferred
Any data analysis certification (SQL, Python)
Any cloud certification (AWS or Azure)
Soft Skills:
Strong analytical abilities to assess and improve master data processes and solutions
Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders
Effective problem-solving skills to address data-related issues and implement scalable solutions
Ability to work effectively with global, virtual teams
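The role's SPARQL requirement can be illustrated with a hedged Python sketch using rdflib to query a small SKOS taxonomy; the Turtle file and concept scheme are hypothetical, and this is not tied to Semaphore or any specific ontology tool.

    from rdflib import Graph

    g = Graph()
    # Hypothetical SKOS taxonomy export (Turtle) from an ontology editor.
    g.parse("therapeutic_areas.ttl", format="turtle")

    # Ad hoc report: list preferred labels of concepts with no broader concept,
    # i.e., the top-level terms of the taxonomy.
    TOP_CONCEPTS = """
    PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
    SELECT ?concept ?label
    WHERE {
      ?concept a skos:Concept ;
               skos:prefLabel ?label .
      FILTER NOT EXISTS { ?concept skos:broader ?parent }
    }
    ORDER BY ?label
    """

    for row in g.query(TOP_CONCEPTS):
        print(row.concept, row.label)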

Posted 1 week ago

Apply