Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Key Responsibilities Work closely with clients to understand their business requirements and design data solutions that meet their needs. Develop and implement end-to-end data solutions that include data ingestion, data storage, data processing, and data visualization components. Design and implement data architectures that are scalable, secure, and compliant with industry standards. Work with data engineers, data analysts, and other stakeholders to ensure the successful delivery of data solutions. Participate in presales activities, including solution design, proposal creation, and client presentations. Act as a technical liaison between the client and our internal teams, providing technical guidance and expertise throughout the project lifecycle. Stay up-to-date with industry trends and emerging technologies related to data architecture and engineering. Develop and maintain relationships with clients to ensure their ongoing satisfaction and identify opportunities for additional business. Understand the entire end-to-end AI lifecycle, from ingestion to inferencing, along with operations. Exposure to Gen AI and emerging technologies. Exposure to the Kubernetes platform, with hands-on experience deploying and containerizing applications. Good knowledge of data governance, data warehousing and data modelling. Requirements Bachelor's or Master's degree in Computer Science, Data Science, or related field. 10+ years of experience as a Data Solution Architect, with a proven track record of designing and implementing end-to-end data solutions. Strong technical background in data architecture, data engineering, and data management. Extensive experience working with any of the Hadoop distributions, preferably Data Fabric. Experience with presales activities such as solution design, proposal creation, and client presentations. Familiarity with cloud-based data platforms (e.g., AWS, Azure, Google Cloud) and related technologies such as data warehousing, data lakes, and data streaming. 
Experience with Kubernetes and Gen AI tools and tech stack. Excellent communication and interpersonal skills, with the ability to effectively communicate technical concepts to both technical and non-technical audiences. Strong problem-solving skills, with the ability to analyze complex data systems and identify areas for improvement. Strong project management skills, with the ability to manage multiple projects simultaneously and prioritize tasks effectively. Tools and Tech Stack Hadoop Ecosystem Data Architecture and Engineering: Preferred: Cloudera Data Platform (CDP) or Data Fabric. Tools: HDFS, Hive, Spark, HBase, Oozie. Data Warehousing Cloud-based: Azure Synapse, Amazon Redshift, Google BigQuery, Snowflake and Azure Databricks. On-premises: Teradata, Vertica. Data Integration and ETL Tools Apache NiFi, Talend, Informatica, Azure Data Factory, Glue. Cloud Platforms Azure (preferred for its Data Services and Synapse integration), AWS, or GCP. Cloud-native Components Data Lakes: Azure Data Lake Storage, AWS S3, or Google Cloud Storage. Data Streaming: Apache Kafka, Azure Event Hubs, AWS Kinesis. HPE Platforms Data Fabric, AI Essentials or Unified Analytics, HPE MLDM and HPE MLDE. AI and Gen AI Technologies AI Lifecycle Management: MLOps: MLflow, Kubeflow, Azure ML, SageMaker, or Ray. Inference tools: TensorFlow Serving, KServe, Seldon. Generative AI Frameworks: Hugging Face Transformers, LangChain. Tools: OpenAI API (e.g., GPT-4). Kubernetes Orchestration and Deployment: Platforms: Azure Kubernetes Service (AKS), Amazon EKS, Google Kubernetes Engine (GKE), or open-source Kubernetes. Tools: Helm. CI/CD for Data Pipelines and Applications Jenkins, GitHub Actions, GitLab CI, or Azure DevOps
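Since the role calls for hands-on experience deploying and containerizing applications on Kubernetes, a minimal sketch of programmatically building a Deployment manifest may be useful context. The app name, image registry, and port below are hypothetical, not taken from the posting.

```python
# Build a Kubernetes Deployment manifest as a plain dict; serialize with
# yaml.safe_dump or json.dumps before applying with kubectl/Helm.
# All names and the image reference are illustrative assumptions.

def deployment_manifest(name, image, replicas=2, port=8080):
    """Return an apps/v1 Deployment manifest for a single-container app."""
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            # selector labels must match the pod template labels
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {
                    "containers": [{
                        "name": name,
                        "image": image,
                        "ports": [{"containerPort": port}],
                    }]
                },
            },
        },
    }

manifest = deployment_manifest("inference-api", "registry.example.com/inference-api:1.0")
```

In practice such a manifest would usually be templated through Helm rather than generated by hand; the dict form just makes the required structure explicit.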
Posted 1 month ago
8 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY-Strategy and Transactions (SaT)– DnA Assistant Director EY’s Data n’ Analytics team is a multi-disciplinary technology team delivering client projects and solutions across Data Management, Visualization, Business Analytics and Automation. The assignments cover a wide range of countries and industry sectors. The opportunity We’re looking for Assistant Director - Data Engineering. The main objective of the role is to support cloud and on-prem platform analytics and data engineering projects initiated across engagement teams. The role will primarily involve conceptualizing, designing, developing, deploying and maintaining complex technology solutions which help EY solve business problems for the clients. This role will work closely with technical architects, product and business subject matter experts (SMEs), back-end developers and other solution architects and is also on-shore facing. This role will be instrumental in designing, developing, and evolving the modern data warehousing solutions and data integration build-outs using cutting edge tools and platforms for both on-prem and cloud architectures. In this role you will be coming up with design specifications, documentation, and development of data migration mappings and transformations for a modern Data Warehouse set up/data mart creation and define robust ETL processing to collect and scrub both structured and unstructured data providing self-serve capabilities (OLAP) in order to create impactful decision analytics reporting. 
Your Key Responsibilities Evaluating and selecting data warehousing tools for business intelligence, data population, data management, metadata management and warehouse administration for both on-prem and cloud-based engagements Strong working knowledge across the technology stack including ETL, ELT, data analysis, metadata, data quality, audit and design Design, develop, and test in ETL tool environment (GUI/canvas driven tools to create workflows) Experience in design documentation (data mapping, technical specifications, production support, data dictionaries, test cases, etc.) Provides technical leadership to a team of data warehouse and business intelligence developers Coordinate with other technology users to design and implement matters of data governance, data harvesting, cloud implementation strategy, privacy, and security Adhere to ETL/Data Warehouse development Best Practices Responsible for Data orchestration, ingestion, ETL and reporting architecture for both on-prem and cloud (MS Azure/AWS/GCP) Assisting the team with performance tuning for ETL and database processes Skills And Attributes For Success 12-14 years of total experience with 8+ years in Data warehousing/ Business Intelligence field Solid hands-on 8+ years of professional experience with creation and implementation of data warehouses on client engagements and helping create enhancements to a data warehouse Strong knowledge of data architecture for staging and reporting schemas, data models and cutover strategies using industry standard tools and technologies Architecture design and implementation experience with medium to complex on-prem to cloud migrations with any of the major cloud platforms (preferably AWS/Azure/GCP) 5+ years’ experience in Azure database offerings [ Relational, NoSQL, Datawarehouse] 5+ years hands-on experience in various Azure services preferred – Azure Data Factory, Kafka, Azure Data Explorer, Storage, Azure Data Lake, Azure Synapse Analytics, Azure Analysis Services 
& Databricks Minimum of 8 years of hands-on database design, modeling and integration experience with relational data sources, such as SQL Server databases, Oracle/MySQL, Azure SQL and Azure Synapse Knowledge and direct experience using business intelligence reporting tools (Power BI, Alteryx, OBIEE, Business Objects, Cognos, Tableau, MicroStrategy, SSAS Cubes etc.) Strong creative instincts related to data analysis and visualization. Aggressive curiosity to learn the business methodology, data model and user personas. Strong understanding of BI and DWH best practices, analysis, visualization, and latest trends. Experience with the software development lifecycle (SDLC) and principles of product development such as installation, upgrade and namespace management Willingness to mentor team members Solid analytical, technical and problem-solving skills Excellent written and verbal communication skills Strong project and people management skills with experience in serving global clients To qualify for the role, you must have Master’s Degree in Computer Science, Business Administration or equivalent work experience. Fact driven and analytically minded with excellent attention to details Hands-on experience with data engineering tasks such as building analytical data records and experience manipulating and analysing large volumes of data Relevant work experience of minimum 12 to 14 years in a big 4 or technology/ consulting set up Help incubate new finance analytic products by executing Pilot, Proof of Concept projects to establish capabilities and credibility with users and clients. 
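The data-warehouse build-out described above typically involves loading slowly changing dimensions; a minimal Type 2 merge can be sketched in plain Python (column names, dates, and the change-detection rule are illustrative assumptions, not this team's actual design):

```python
from datetime import date

def scd2_merge(dimension, incoming, today):
    """Apply an SCD Type 2 merge: expire changed rows, append new versions.

    dimension: list of dicts with keys id, attribute columns, valid_from,
               valid_to (None marks the current version)
    incoming:  list of dicts with keys id plus attribute columns
    """
    current = {row["id"]: row for row in dimension if row["valid_to"] is None}
    for rec in incoming:
        cur = current.get(rec["id"])
        changed = cur is None or any(cur[k] != v for k, v in rec.items() if k != "id")
        if changed:
            if cur is not None:
                cur["valid_to"] = today          # expire the old version
            dimension.append({**rec, "valid_from": today, "valid_to": None})
    return dimension

dim = [{"id": 1, "city": "Pune", "valid_from": date(2024, 1, 1), "valid_to": None}]
out = scd2_merge(dim, [{"id": 1, "city": "Mumbai"}, {"id": 2, "city": "Chennai"}],
                 date(2025, 1, 1))
```

In a real engagement this logic would live in the ETL tool or a SQL MERGE statement; the sketch only shows the expire-and-append pattern itself.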
This may entail working either as an independent SME or as part of a larger team Ideally, you’ll also have Ability to think strategically/end-to-end with a result-oriented mindset Ability to build rapport within the firm and win the trust of the clients Willingness to travel extensively and to work on client sites / practice office locations Strong experience in SQL Server and MS Excel plus at least one other SQL dialect, e.g. MS Access/PostgreSQL/Oracle PL/SQL/MySQL. Strong in Data Structures & Algorithms. Experience of interfacing with databases such as Azure databases, SQL Server, Oracle, Teradata etc. Preferred exposure to JSON, Cloud Foundry, Pivotal, MatLab, Spark, Greenplum, Cassandra, Amazon Web Services, Microsoft Azure, Google Cloud, Informatica, Angular JS, Python, etc. Experience with Snowflake What We Look For A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment An opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY SaT practices globally with leading businesses across a range of industries What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. 
Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Greetings from TCS! TCS is hiring for Informatica Developer Desired Experience Range: 4 - 6 Years Job Location: Pune/Mumbai/Chennai Must have skills* 1. Problem assessment and resolution in existing solutions. 2. Monitoring and performance tuning, and participation in all phases of the development lifecycle and post-implementation support. 3. Collaboration and knowledge sharing. 4. Communicate and leverage Teradata best practices. 5. Provide Teradata architecture duties for a very large Teradata environment. 6. Teradata Explain analysis of query impact, and provide on-call support duties for Application Production Support, which could include off hours and weekends. 7. Design, develop and support Enterprise Data Warehouse solutions leveraging Teradata and Teradata Tools and Utilities. Key responsibilities* 1. 5+ years of experience with Teradata application development. 2. Strong knowledge of Teradata load utilities – FastLoad, MultiLoad, FastExport, TPump, and BTEQ. 3. Very strong in SQL and data analysis, performance tuning and code reviews. 4. Strong knowledge of Teradata stored procedures, macros and functions. 5. Business knowledge in the manufacturing domain. Regards, Monisha
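One SQL pattern that comes up constantly in the Teradata work described above is keeping only the latest version of each row; in Teradata this is idiomatically written with `ROW_NUMBER() ... QUALIFY rn = 1`. The sketch below uses sqlite3 purely as a stand-in engine (it has no QUALIFY clause, so the filter moves to an outer query); the table and data are illustrative.

```python
import sqlite3

# sqlite3 stands in for Teradata here; the windowed-dedup pattern is the same.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INT, customer TEXT, amount REAL, updated_at TEXT);
    INSERT INTO orders VALUES
        (1, 'acme',   100.0, '2025-01-01'),
        (1, 'acme',   120.0, '2025-02-01'),
        (2, 'globex',  80.0, '2025-01-15');
""")
# Latest version of each order. In Teradata: add QUALIFY rn = 1 instead of
# the outer SELECT; the ROW_NUMBER window itself is identical.
rows = conn.execute("""
    SELECT order_id, amount FROM (
        SELECT order_id, amount,
               ROW_NUMBER() OVER (PARTITION BY order_id
                                  ORDER BY updated_at DESC) AS rn
        FROM orders
    ) WHERE rn = 1
    ORDER BY order_id
""").fetchall()
```

The same shape is what a reviewer looks for in Teradata code reviews: a single windowed pass rather than a self-join on MAX(updated_at).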
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About This Role Wells Fargo is seeking a Systems Operations Engineer. We believe in the power of working together because great ideas can come from anyone. Through collaboration, any employee can have an impact and make a difference for the entire company. Explore opportunities with us for a career in a supportive environment where you can learn and grow. In This Role, You Will Participate in complex technical issues and initiatives related to large scale applications, systems, databases, or other technical products and services Identify opportunities for process improvements within technical support strategies and plans Review and analyze technical queries to extract data, create standard databases, or perform limited programming to fine tune systems supporting low to medium risk technical deliverables Present recommendations for resolving complex technical queries Exercise some independent judgment to analyze performance trends and recommend process improvements while developing understanding of technical process controls or standards Work as an internal consultant regarding use of tools and processes Provide information related to supported system area to functional colleagues, internal partners and stakeholders, including internal customers Required Qualifications: 2+ years of Systems Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education Desired Qualifications: Bachelor's/Master's degree with experience in an Application Production Support background. Job Expectations: Incident Management - Identifying potential incidents proactively, triage, document and remediate incidents to recover application and batches to meet agreed SLAs with business partners. Problem Management - Identify, document, co-ordinate with relevant stakeholders to remediate recurring incidents. Oversee and ensure nightly batch completes within agreed business SLAs. 
Fixing the production issues to have zero business impact, and ensuring that the SLAs are being met. Following documented escalation procedures. Performing application health checks and ensuring that all applications are green before the start of business. Documenting known production incidents into knowledge base. Release co-ordination with stakeholders to ensure smooth deployment into production. Participate in BCP/DR activity to ensure BCP readiness of production environment. Ensuring all the applications and systems are up and running fine for customers. Handling all production support escalations and MIM (Major Incident Management) calls during India day hours and resolving the issue with minimum TTR. Guiding your own and partner teams during MIM call. Doing trend analysis and taking proactive measures for frequently occurring issues. Performing planned and unplanned BCP/DR activities for the application and coordinating during the event. Setting up monitoring for application, server, batch job etc. using different monitoring tools like AppDynamics, Splunk, SiteScope, APM. Troubleshooting Application and Autosys Batch job issues. Game Plan review and Coordination of CR Implementation/Execution. Build efficient and high performing team who can work independently on all the support activities, and plan/manage the support coverage provided. Process streamlining for Incident management, Change management, Problem management. Motivate team members and promote a strong sense of support, ownership, and urgency Facilitate communications within the team and facilitate knowledge sharing amongst the team and other personnel. Drive automation to reduce the manual efforts performed by the team. Excellent verbal and written communication skills and ability to communicate with US partners directly. Ability to build relationships, trust, and credibility from the outset, with a strong focus on effective customer service. 
Initiate, facilitate and coordinate bridge line activities as needed during critical production issues Good problem-solving and analytical skills Participating in applications/systems upgrade testing and updating the relevant documentation The flexibility to work in ad-hoc shifts when required. The ability to understand team dynamics and use interpersonal skills and personal judgment to achieve goals Proven knowledge of diagnostic and support tools used in the application support environment Experience in Systems Engineering, or equivalent demonstrated through one or a combination of the following: work experience, training, education Bachelor's/Master's degree with experience in an Application Production Support background. Must have good knowledge of UNIX commands, job schedulers (preferably Autosys), DB technologies like Oracle, SSIS, SQL Server. Knowledge of ETL tools like Teradata/Informatica and other ETL tools will be an added advantage. Good understanding of ITIL concepts like incident, problem and change management. Ability to communicate clearly and concisely to stakeholders. Monitoring tools like ITRS Geneos, Grafana, ThousandEyes, scripting knowledge - Shell/Perl/PowerShell and/or any ETL experience is desired. Good understanding and experience on cloud support - PCF & Azure Good understanding of different monitoring tools like AppDynamics, Splunk, SiteScope, APM Shift Time: 06:30 AM - 3:30 PM IST and 1:30 PM - 10:30 PM IST (We work in two shifts and there is no night shift). Posting End Date: 22 May 2025 Job posting may come down early due to volume of applicants. We Value Equal Opportunity Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. 
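The pre-business-hours "application health checks ensuring all applications are green" duty described above can be sketched as a tiny check runner. The application names and the lambda probes are stand-ins; real probes would be HTTP pings, Autosys job status queries, or DB connectivity tests wired to tools like AppDynamics or Splunk.

```python
# Minimal health-check runner: run named probes, report GREEN/RED per
# application, and an overall all-green flag. Probes here are simulated.

def run_health_checks(checks):
    """Run named checks; return (all_green, per-check status dict)."""
    status = {}
    for name, probe in checks.items():
        try:
            status[name] = "GREEN" if probe() else "RED"
        except Exception:
            status[name] = "RED"   # a crashing probe counts as a failed check
    return all(s == "GREEN" for s in status.values()), status

checks = {
    "payments-db":   lambda: True,
    "nightly-batch": lambda: True,
    "web-frontend":  lambda: False,   # simulated failure
}
all_green, report = run_health_checks(checks)
```

A RED result before start of business is what would trigger the triage, escalation, and MIM procedures the posting describes.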
Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements. Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process. Applicants With Disabilities To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo. Drug and Alcohol Policy Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more. Wells Fargo Recruitment And Hiring Requirements Third-Party recordings are prohibited unless authorized by Wells Fargo. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process. Reference Number R-458708
Posted 1 month ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role Overview We are seeking a highly skilled and motivated Data Modeler with 3+ years of experience (Hyderabad/Gurgaon). If you're passionate about coding, problem-solving, and innovation, we'd love to hear from you! About Us CodeVyasa is a mid-sized product engineering company that works with top-tier product/solutions companies such as McKinsey, Walmart, RazorPay, Swiggy, and others. We are about 550+ people strong and we cater to Product & Data Engineering use-cases around Agentic AI, RPA, Full-stack and various other GenAI areas. Independently complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, Spark, Databricks Delta Lakehouse or other cloud data warehousing technologies. Govern data design/modelling – documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. Develop a deep understanding of business domains like Customer, Sales, Finance, Supplier, and the enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Drive collaborative reviews of data model design, code, data, and security features to drive data product development. Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; SAP Data Model. Develop reusable data models based on cloud-centric, code-first approaches to data management and data mapping. Partner with the data stewards team for data discovery and action by business customers and stakeholders. Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. Assist with data planning, sourcing, collection, profiling, and transformation. Support data lineage and mapping of source system data to canonical data stores. 
Create Source to Target Mappings (STTM) for ETL and BI developers. Skills needed: Expertise in data modelling tools (ER/Studio, Erwin, IDM/ARDM models; CPG/Manufacturing/Sales/Finance/Supplier/Customer domains). Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience of metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Working knowledge of SAP data models, particularly in the context of HANA and S/4HANA, and retail data like IRI and Nielsen. Why Join CodeVyasa? Work on innovative, high-impact projects with a team of top-tier professionals. Continuous learning opportunities and professional growth. Flexible work environment with a supportive company culture. Competitive salary and comprehensive benefits package. Free healthcare coverage. Here's a glimpse of what life at CodeVyasa looks like: Life at CodeVyasa.
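A source-to-target mapping (STTM), mentioned above as a key deliverable for ETL and BI developers, is essentially a table of target column, source column, and transformation rule. A minimal sketch of one applied in code (all column names and rules are hypothetical):

```python
# Illustrative STTM applied as a row transform. The mapping keys are
# target columns; each value is (source column, transform function).

STTM = {
    "customer_id":   ("CUST_NO",   int),
    "customer_name": ("CUST_NM",   str.strip),
    "net_sales":     ("SALES_AMT", lambda v: round(float(v), 2)),
}

def apply_sttm(source_row, mapping=STTM):
    """Produce a target-schema row from a source row using the mapping."""
    return {tgt: fn(source_row[src]) for tgt, (src, fn) in mapping.items()}

row = apply_sttm({"CUST_NO": "42", "CUST_NM": "  Acme  ", "SALES_AMT": "199.999"})
```

In practice the STTM lives in a spreadsheet or modeling tool and is handed to ETL developers; expressing it as data, as here, is what makes it testable and reusable.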
Posted 1 month ago
0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Your IT Future, Delivered DevOps Engineer (Snowflake) With a global team of 5800 IT professionals, DHL IT Services connects people and keeps the global economy running by continuously innovating and creating sustainable digital solutions. We work beyond global borders and push boundaries across all dimensions of logistics. You can leave your mark shaping the technology backbone of the world's biggest logistics company. Our offices in Cyberjaya, Prague, and Chennai have earned #GreatPlaceToWork certification, reflecting our commitment to exceptional employee experiences. Digitalization. Simply delivered. At IT Services, we are passionate about technology. Our Express BI Complex Data Solutions team is continuously expanding. No matter your level of DevOps Engineering proficiency, you can always grow within our diverse environment. #DHL #DHLITServices #GreatPlace Grow together Timely delivery of DHL packages around the globe, in a way that ensures customer data are secure, is at the core of what we do. You will provide second/third-level day-to-day operations support, and help investigate and resolve incidents, performing root cause analysis for problems that were not resolved at lower support levels. Sometimes, issues might get tricky, and this is where cooperation on troubleshooting with other IT support teams and specialists will come into play. When it comes to firmware bugs, vulnerabilities and other issues related to our technologies, communicating with our vendors is key. 
Your responsibilities involve: Deploy and maintain critical applications on cloud-native microservices architecture Implement automation, effective monitoring, and infrastructure-as-code Deploy and maintain CI/CD pipelines across multiple environments Support and work alongside a cross-functional engineering team on the latest technologies Iterate on best practices to increase the quality & velocity of deployments Sustain and improve the process of knowledge sharing throughout the engineering team Have on-call responsibilities in rotation with the engineering team You may need to work long hours in the run up to a product launch or during product-update periods. You may also need to be on call at these times, or to handle unexpected incidents. Last but not least, security technologies associated with Firewalls, Load Balancers, VPNs, Proxies, Azure and Google Cloud are all in your support league. Ready to embark on the journey? Here’s what we are looking for: As a DevOps Engineer, having Snowflake DB with Matillion knowledge is a huge plus. Very good knowledge of Teradata will also be an integral part of this role. You are a DevOps aficionado, therefore you have a good understanding of SQL and DevOps tools such as Jira, Jenkins, GitHub or similar CI/CD tools. You are able to work independently, and to prioritize and organize your tasks under time and workload pressure. Working in a multinational environment, you can expect cross-region collaboration with teams around the globe, thus being advanced in spoken and written English will certainly be useful. An array of benefits for you: Hybrid work arrangements to balance in-office collaboration and home flexibility. Annual Leave: 42 days off apart from Public / National Holidays. Medical Insurance: Self + Spouse + 2 children. An option to opt for Voluntary Parental Insurance (parents / parents-in-law) at a nominal premium, covering pre-existing diseases. In-house training programs: professional and technical training certifications. 
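The CI/CD-pipeline duties above, applied to a database platform like Snowflake, usually boil down to an idempotent schema-migration runner: apply each versioned change once, record it, and make re-runs no-ops. A minimal sketch (sqlite3 stands in for Snowflake here, and the migration names and DDL are illustrative; real pipelines often use tools like Flyway or Matillion jobs instead):

```python
import sqlite3

# Ordered, versioned migrations; each applied at most once.
MIGRATIONS = [
    ("001_create_shipments", "CREATE TABLE shipments (id INTEGER, status TEXT)"),
    ("002_add_index",        "CREATE INDEX idx_status ON shipments (status)"),
]

def migrate(conn, migrations=MIGRATIONS):
    """Apply unapplied migrations in order; record each in a tracking table."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_history (name TEXT PRIMARY KEY)")
    applied = {r[0] for r in conn.execute("SELECT name FROM schema_history")}
    for name, sql in migrations:
        if name not in applied:
            conn.execute(sql)
            conn.execute("INSERT INTO schema_history VALUES (?)", (name,))
    conn.commit()

conn = sqlite3.connect(":memory:")
migrate(conn)
migrate(conn)  # second run is a no-op: everything is already recorded
```

Idempotency is what makes such a runner safe to wire into a Jenkins or GitHub Actions pipeline that executes on every deploy.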
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra
Work from Office
Join us as a Solution Designer (Cloud Data Integration) at Barclays, where you will be responsible for supporting the successful delivery of location strategy projects to plan, budget, agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. To be successful as a Solution Designer (Cloud Data Integration) you should have experience with: Hands-on experience working with large-scale data platforms and developing cloud solutions on the AWS data platform, with a proven track record of driving business success. Strong understanding of AWS and distributed computing paradigms; ability to design and develop data ingestion programs to process large data sets in batch mode using Glue, Lambda, S3, Redshift, Snowflake and Databricks. Ability to develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies. Hands-on programming experience in Python and PySpark. Understanding of DevOps pipelines using Jenkins and GitLab; should be strong in data modelling and data architecture concepts, and well versed with project management tools and Agile methodology. Sound knowledge of data governance principles and tools (Alation, Glue Data Quality, mesh); capable of suggesting solution architecture for diverse technology applications. Proficiency in SQL and familiarity with database management systems (e.g., Oracle, SQL Server, MySQL, Kafka, AWS etc.) Strong database fundamentals (Teradata, DynamoDB, Oracle etc.) 
Experience designing and developing detailed data models, schemas, and database designs. Conceptual understanding of data warehousing, elastic platforms and cloud is a must. Some other highly valued skills may include: Multi-cloud solution design preferred. Proven experience in data modelling, database design, and data governance frameworks. Proficient in agile methodologies and project management tools (e.g., JIRA, Trello). Experience in business analysis or product ownership, preferably in a data analytics context. Strong knowledge of data analytics tools and methodologies Basic understanding of banking domain Excellent analytical, communication, and interpersonal skills. Excellent communication skills to interact with both technical and non-technical stakeholders. You may be assessed on key critical skills relevant for success in role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based out of Pune. Purpose of the role To design, develop, and implement solutions to complex business problems, collaborating with stakeholders to understand their needs and requirements, and design and implement solutions that meet those needs and create solutions that balance technology risks against business delivery, driving consistency. Accountabilities Design and development of solutions as products that can evolve, meeting business requirements that align with modern software engineering practices and automated delivery tooling. This includes identification and implementation of the technologies and platforms. Targeted design activities that apply an appropriate workload placement strategy and maximise the benefit of cloud capabilities such as elasticity, serverless, containerisation etc. Best practice designs incorporating security principles (such as defence in depth and reduction of blast radius) that meet the Bank’s resiliency expectations. 
Solutions that appropriately balance risks and controls to deliver the agreed business and technology value. Adoption of standardised solutions where they fit. If no standard solutions fit, feed into their ongoing evolution where appropriate. Fault finding and performance issues support to operational support teams, leveraging available tooling. Solution design impact assessment in terms of risk, capacity and cost impact, inc. estimation of project change and ongoing run costs. Development of the requisite architecture inputs required to comply with the banks governance processes, including design artefacts required for architecture, privacy, security and records management governance processes. Assistant Vice President Expectations To advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions/ business divisions. Lead a team performing complex tasks, using well developed professional knowledge and skills to deliver on work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraisal of performance relative to objectives and determination of reward outcomes If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identify the need for the inclusion of other areas of specialisation to complete assignments. They will identify new directions for assignments and/ or projects, identifying a combination of cross functional methodologies or practices to meet required outcomes. 
Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external, such as procedures and practices (in other areas, teams, companies, etc.) to solve problems creatively and effectively. Communicate complex information. 'Complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
Posted 1 month ago
0 - 4 years
0 Lacs
Chennai, Tamil Nadu
Work from Office
Join our “Business Support Team” at DHL Global Forwarding, Freight (DGFF) GSC – Global Service Centre! Job Title: Expert – Business Support (BS) Job Location: Chennai Are you dynamic and results-oriented with a passion for logistics? Join our high-performing Global Shared Services Team (GSC) at DHL Global Forwarding, Freight (DGFF), a Great Place to Work certified organization and one of the “Top 20 most admired Shared Services Organizations in 2022” by the independent global Shared Services & Outsourcing Network (SSON). We are the captive Shared Service Provider for DHL Global Forwarding and DHL Freight (DGFF). We are an organization of more than 4,600 colleagues complemented by approximately 500 virtual FTEs (i.e., bots applied in process automation). Our colleagues are based across six service delivery centers in Mumbai, Chennai, Chengdu, Manila, Bogota & Budapest. You will interact with people from all over the world and get the chance to work in a truly international organization. This role is responsible for reviewing solution design documents, preparing and executing detailed test cases, and coordinating with internal and external teams to ensure data integrity in ETL processes. The role involves producing test reports, supporting business teams during UAT, and collaborating with stakeholders to achieve project objectives. Requires a strong foundation in ETL concepts, SQL proficiency, and effective communication skills. Key Responsibilities: Grasp the data warehousing process and framework to ensure ETL processes align with business requirements. Demonstrate expertise in SQL and Teradata, with strong scripting capabilities to manage and manipulate data effectively. Apply analytical skills to interpret data patterns, identify discrepancies, and ensure data integrity throughout the ETL process. Leverage understanding of the logistics industry to tailor testing strategies that reflect real-world scenarios and challenges.
Employ a meticulous approach to testing, with a focus on identifying and resolving potential issues before they impact operations. Quickly adapt to new technologies and applications, embracing change and innovation within the ETL landscape. Lead discussions with stakeholders, facilitating effective communication between technical teams and business partners. Address and resolve issues proactively, contributing to continuous improvement and maintaining high service quality. Evaluate technical and business risks, communicating potential impacts to the management team effectively. Possess specialist-level knowledge of the SDLC, ensuring testing phases are integrated and efficient. Experience with BI tools like QlikSense and Power BI is advantageous, enriching data visualization and reporting capabilities. Assess Solution Design Documents for testing requirements, ensuring clarity and completeness before test execution. Design comprehensive test cases and execute them rigorously to validate ETL processes and data integrity. Work with external teams and vendors to align testing objectives and ensure seamless execution of test plans. Produce detailed test reports and support business teams during User Acceptance Testing to ensure successful implementation. Collaborate with business partners to enhance processes and identify best practices. Assist staff in resolving complex issues, maintain thorough process documentation, and ensure quality control. Educational qualifications: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field is essential. Work Experience: Candidates should possess a minimum of 2 to 4 years of hands-on ETL testing experience, ideally within a data warehousing environment. This experience should demonstrate a comprehensive understanding of database structures, theories, principles, and ETL practices.
Proficiency in SQL scripting is mandatory, with a strong track record of crafting complex queries to validate data transformations and load processes and to ensure data integrity. Prior work experience in Agile and Scrum methodologies is required, highlighting adaptability, collaboration, and the ability to thrive in fast-paced development settings. Expert-level familiarity with test management tools like HP QC/ALM and Jira is expected, enabling efficient management, documentation, and tracking of all testing activities. Preferred Qualifications: Additional certifications related to SQL, data warehousing, ETL tools, and methodologies will be considered an advantage, indicating a dedication to ongoing professional development in the field. Experience within the logistics or a closely related industry is beneficial, providing valuable context that can enhance the relevance and effectiveness of testing strategies.
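The ETL validation work this role describes usually comes down to reconciling a source table against its warehouse target with SQL. A minimal sketch in Python using sqlite3 as a stand-in database; the table names, columns, and reconciliation checks are illustrative, not taken from the posting:

```python
import sqlite3

# In-memory stand-ins for a source system table and its warehouse target.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.5), (3, 30.0);
""")

def reconcile(conn, src, tgt):
    """Compare row counts and a simple aggregate between source and target."""
    (src_count,) = conn.execute(f"SELECT COUNT(*) FROM {src}").fetchone()
    (tgt_count,) = conn.execute(f"SELECT COUNT(*) FROM {tgt}").fetchone()
    (src_sum,) = conn.execute(f"SELECT SUM(amount) FROM {src}").fetchone()
    (tgt_sum,) = conn.execute(f"SELECT SUM(amount) FROM {tgt}").fetchone()
    return src_count == tgt_count and src_sum == tgt_sum

print(reconcile(conn, "src_orders", "tgt_orders"))  # True when the load is intact
```

In practice the same pattern runs against Teradata with richer checks (per-column checksums, minus queries), but the count-and-aggregate comparison above is the usual first gate in a test case.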
Posted 1 month ago
0 - 1 years
0 Lacs
Hyderabad, Telangana
Work from Office
Welcome to Warner Bros. Discovery… the stuff dreams are made of. Who We Are… When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands, are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what’s next… From brilliant creatives, to technology trailblazers, across the globe, WBD offers career defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive. Your New Role: The WBD Integration team is seeking an Integration Administrator who will be responsible for providing technical expertise for supporting and enhancing the Integration suite (Informatica PowerCenter, IICS). We are specifically looking for a candidate with solid technical skills and experience in integrating ERP applications and SaaS and PaaS platforms such as SAP, Salesforce, Workday, etc., and data warehouses such as Teradata, Snowflake, and Redshift. Experience with the Informatica cloud platform would be ideal for this position. The candidate's primary job functions include but are not limited to the day-to-day configuration/development of the Informatica platform. The candidate must possess strong communication and analytical skills to effectively work with peers within the Enterprise Technology Group, various external partners/vendors, and business users to determine requirements and translate them into technical solutions. The candidate must have the ability to independently complete individual tasks in a dynamic environment to achieve departmental and company goals.
Your Role Accountabilities: 4 to 5 years of experience as a DI Architect/Integration Developer/Administrator 3 years of experience working with Informatica Repository, Designer, Workflow, and Monitor. 2 years of experience in modern DevOps/SRE practices such as CI/CD, performance monitoring & incident management 1 year of programming experience writing Linux and shell scripts 1 year of experience setting up, monitoring, and troubleshooting PowerCenter for SAP NetWeaver, SAP IDocs, and Salesforce Experience in data modelling, preferably with Erwin or ER Studio. Should have experience in API development, including best practices, testing methods and deployment strategies. Experience in Informatica Intelligent Cloud Services like Cloud Data Integration, Cloud Application Integration, API Management, Cloud Integration Hub Experience in Cloud API Gateway configuration Experience in integration of on-premises applications with SaaS applications Experience in integrating ERP applications such as SAP, Salesforce, etc. Expertise in translating business requirements to project design, development, and execution. Resolving platform environment issues and applying EBFs, patches and hotfixes. Automation of backups, purges, build scripts, and startup and shutdown scripts. Should be well versed in setting up Disaster Recovery (DR) environments across multiple regions, with experience in datacenter migration and application upgrades. Providing 24x7 application/platform support, preparing RCAs, obtaining client approval, and implementing all PBE tasks. Team player, multitasker, excellent communication skills (convey highly technical information into business terms, clear email communications), ability to mentor team members. How We Get Things Done… This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done.
You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview. Championing Inclusion at WBD Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Who We Are Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive. What You'll Do As a part of the BCG X A&A team, you will work closely with consulting teams on a diverse range of advanced analytics topics. You will have the opportunity to leverage analytical methodologies to deliver value to BCG's Consulting (case) teams and Practice Areas (domain) by providing analytics subject matter expertise and accelerated execution support. You will collaborate with case teams to gather requirements, specify, design, develop, deliver and support analytic solutions serving client needs. You will provide technical support through deeper understanding of relevant data analytics solutions and processes to build high quality and efficient analytic solutions.
YOU'RE GOOD AT Working with case (and proposal) teams Acquiring deep expertise in at least one analytics topic & understanding of all analytics capabilities Defining and explaining expected analytics outcomes; defining approach selection Delivering original analysis and insights to BCG teams, typically owning all or part of an analytics module and integrating with case teams Establishing credibility by thought partnering with case teams on analytics topics; drawing conclusions on a range of external and internal issues related to their module Communicating analytical insights through sophisticated synthesis and packaging of results (including PowerPoint presentations, documents, dashboards and charts) with consultants; collecting, synthesizing and analyzing case team learnings as inputs into new best practices and methodologies Building collateral documents for enhancing core capabilities and supporting reference for internal documents; sanitizing confidential documents and maintaining a repository Able to lead workstreams and modules independently or with minimal supervision Ability to support business development activities (proposals, vignettes, etc.) and build sales collateral to generate leads Team requirements: Guides juniors on analytical methodologies and platforms, and helps in quality checks Contributes to team's content & IP development Imparts technical trainings to team members and consulting cohort Technical Skills: Strong proficiency in statistics (concepts & methodologies like hypothesis testing, sampling, etc.) and its application & interpretation Hands-on data mining and predictive modeling experience (Linear Regression, Clustering (K-means, DBSCAN, etc.), Classification (Logistic Regression, Decision Trees/Random Forest/Boosted Trees), Time Series (SARIMAX/Prophet), etc.) Strong experience in at least one of the prominent cloud providers (Azure, AWS, GCP) and working knowledge of auto ML solutions (SageMaker, Azure ML, etc.)
At least one tool in each category: Programming language – Python (must have), (R or SAS or PySpark), SQL (must have); Data Visualization (Tableau, QlikView, Power BI, Streamlit); Data management (Alteryx, MS Access, or any RDBMS); ML deployment tools (Airflow, MLflow, Luigi, Docker, etc.); Big data technologies (Hadoop ecosystem, Spark); Data warehouse solutions (Teradata, Azure SQL DW/Synapse, Redshift, BigQuery, etc.); Version control (Git/GitHub/GitLab); MS Office (Excel, PowerPoint, Word); Coding IDE (VS Code/PyCharm); GenAI tools (OpenAI, Google PaLM/BERT, Hugging Face, etc.) Functional Skills: Expertise in building analytical solutions and delivering tangible business value for clients (similar to the use cases below) Price optimization, promotion effectiveness, product assortment optimization and sales force effectiveness, personalization/loyalty programs, labor optimization CLM and revenue enhancement (segmentation, cross-sell/up-sell, next product to buy, offer recommendation, loyalty, LTV maximization and churn prevention) Communicating with confidence and ease: You will be a clear and confident communicator, able to deliver messages in a concise manner with strong and effective written and verbal communication. What You'll Bring Bachelor's/Master's degree in a field linked to business analytics, statistics or economics, operations research, applied mathematics, computer science, engineering, or a related field required; advanced degree preferred At least 2-4 years of relevant industry work experience providing analytics solutions in a commercial setting Prior work experience in a global organization, preferably in a professional services organization in a data analytics role Demonstrated depth in one or more industries including but not limited to Retail, CPG, Healthcare, Telco, etc.
#BCGXjob Who You'll Work With Our data analytics and artificial intelligence professionals mix deep domain expertise with advanced analytical methods and techniques to develop innovative solutions that help our clients tackle their most pressing issues. We design algorithms and build complex models out of large amounts of data. Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer. Click here for more information on E-Verify.
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Join us as a Senior Test Automation Engineer at Barclays, responsible for supporting the successful delivery of location strategy projects to plan, budget, agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. To be successful as a Senior Test Automation Engineer you should have experience with: Hands-on test automation with a deep understanding of software/QA methodologies Understand requirements and user stories, and be able to prepare test scope and test cases and execute them Execute non-functional requirements tests including performance, load, stress, scalability, and reliability Testing/automation tools and frameworks like Python, Pytest, BDD, TDD, Karate, Rest Assured, Performance Center, LoadRunner, etc. Good understanding of tech stack such as AWS, Kafka (messaging queues), MongoDB, SQL, ETL and APIs CI/CD integration tools like Jenkins, TeamCity, GitLab, etc. Collaborate closely with Dev/DevOps/BA teams. Unix commands, ETL architecture & data warehouse concepts Python language (for test automation) – in-depth understanding of data structures: lists, tuples, sets, dictionaries; OOPS concepts, DataFrames, lambda functions, Boto3, file handling, DB handling, debugging techniques, etc. Perform complex SQL queries/joins to validate data transformations, migration and integrity across source and target systems.
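The source-to-target validation described above is often automated as Pytest-style checks that re-apply the documented transformation rule to source data and compare against the target. A hypothetical sketch; the records and the "upper-case names, drop inactive customers" rule are invented for illustration:

```python
# Hypothetical source rows and the target rows produced by an ETL job
# whose documented rule is: upper-case names and drop inactive customers.
SOURCE = [
    {"id": 1, "name": "alice", "active": True},
    {"id": 2, "name": "bob", "active": False},
    {"id": 3, "name": "carol", "active": True},
]
TARGET = [
    {"id": 1, "name": "ALICE"},
    {"id": 3, "name": "CAROL"},
]

def expected_target(source):
    """Re-apply the documented transformation rule to the source rows."""
    return [{"id": r["id"], "name": r["name"].upper()}
            for r in source if r["active"]]

def test_row_counts_match():
    assert len(TARGET) == len(expected_target(SOURCE))

def test_transformed_values_match():
    assert sorted(TARGET, key=lambda r: r["id"]) == expected_target(SOURCE)
```

In a real suite the SOURCE and TARGET lists would be fetched with SQL from the actual systems, but the shape of the assertions is the same.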
Test and defect management - Document test results and defects, and track issues to resolution using tools like Jira/X-Ray Experience with at least one relational database – Oracle (Golden Gate services), MySQL, SQL Server or Teradata Experience with at least one CI/CD tool for integrating test automation suites – Jenkins or TeamCity Some Other Highly Valued Skills May Include Functional corporate banking knowledge Good understanding of Agile methodologies Hands-on experience with Gen AI models Good understanding of Snowflake, DBT & PySpark Experience with BI tools like Tableau/Power BI for visual data validations You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based out of Pune. Purpose of the role To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improve testing processes and methodologies, to ensure software quality and reliability. Accountabilities Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards. Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues. Collaboration with cross-functional teams to analyse requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested. Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution.
Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing. Stay informed of industry technology trends and innovations, and actively contribute to the organization's technology communities to foster a culture of technical excellence and growth. Assistant Vice President Expectations To advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well developed professional knowledge and skills to deliver on work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraisal of performance relative to objectives and determination of reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identify the need for the inclusion of other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done.
Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external, such as procedures and practices (in other areas, teams, companies, etc.) to solve problems creatively and effectively. Communicate complex information. 'Complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
Posted 1 month ago
8 - 10 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Manager Job Description & Summary A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers. Senior Data Engineer - Google Cloud 7+ years direct experience working in Enterprise Data Warehouse technologies. 7+ years in a customer facing role working with enterprise clients. Experience with architecting, implementing and/or maintaining technical solutions in virtualized environments. Experience in design, architecture and implementation of data warehouses, data pipelines and flows. Experience with developing software code in one or more languages such as Java, Python and SQL. Experience designing and deploying large scale distributed data processing systems with technologies such as Oracle, MS SQL Server, MySQL, PostgreSQL, MongoDB, Cassandra, Redis, Hadoop, Spark, HBase, Vertica, Netezza, Teradata, Tableau, Qlik or MicroStrategy.
Customer facing migration experience, including service discovery, assessment, planning, execution, and operations. Demonstrated excellent communication, presentation, and problem-solving skills. Experience in project governance and enterprise. Mandatory Certifications Required: Google Cloud Professional Cloud Architect; Google Cloud Professional Data Engineer Mandatory skill sets: GCP Architecture/Data Engineering, SQL, Python Preferred skill sets: GCP Architecture/Data Engineering, SQL, Python Years of experience required: 8-10 years Qualifications: B.E/B.Tech/MBA/MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study Required: Degrees/Field of Study Preferred: Certifications (if blank, certifications not specified) Required Skills: Python (Programming Language) Optional Skills: Desired Languages (if blank, desired languages not specified) Travel Requirements: Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
Remote
Entity: Technology Job Family Group: IT&S Group Job Description: Responsible for delivering business analysis and consulting activities for the defined specialism using sophisticated technical capabilities, building and maintaining effective working relationships with a range of customers, ensuring relevant standards are defined and maintained, and implementing process and system improvements to deliver business value. Specialisms: Business Analysis; Data Management and Data Science; Digital Innovation. The Senior Data Engineer will work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. Duties will include attending daily scrums, sprint reviews, retrospectives, backlog prioritisation and improvements. Will coach, mentor and support the data engineering squad on the full range of data engineering and solutions development activities covering requirements gathering and analysis, solutions design, coding and development, testing, implementation and operational support. Will work closely with the Product Owner to understand requirements/user stories and have the ability to plan and estimate the time taken to deliver the user stories. Proactively collaborate with the Product Owner, Data Architects, Data Scientists, Business Analysts, and Visualisation developers to meet the acceptance criteria. Will be very highly skilled and experienced in the use of tools and techniques such as AWS Data Lake technologies, Redshift, Glue, Spark SQL, Athena Years of Experience: 13-15 Essential domain expertise: Experience in Big Data technologies – AWS, Redshift, Glue, PySpark Experience of MPP (Massive Parallel Processing) databases helpful – e.g. Teradata, Netezza Challenges involved in Big Data – large table sizes (e.g.
depth/width), even distribution of data Experience of programming – SQL, Python Data modelling experience/awareness – Third Normal Form, Dimensional Modelling Data pipelining skills – data blending, etc. Visualisation experience – Tableau, PBI, etc. Data management experience – e.g. data quality, security, etc. Experience of working in a cloud environment – AWS Development/delivery methodologies – Agile, SDLC Experience working in a geographically disparate team Travel Requirement: Up to 10% travel should be expected with this role Relocation Assistance: This role is eligible for relocation within country Remote Type: This position is a hybrid of office/remote working Skills: Commercial Acumen, Communication, Data Analysis, Data cleansing and transformation, Data domain knowledge, Data Integration, Data Management, Data Manipulation, Data Sourcing, Data strategy and governance, Data Structures and Algorithms, Data visualization and interpretation, Digital Security, Extract, transform and load, Group Problem Solving Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
Posted 1 month ago
4 years
0 Lacs
Hyderabad, Telangana, India
Remote
Overview Enterprise Data Operations Sr Analyst - L08 Job Overview: As Senior Analyst, Data Modeling, your focus would be to partner with D&A Data Foundation team members to create data models for Global projects. This would include independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will be performing all aspects of data modeling, working closely with the Data Governance, Data Engineering and Data Architects teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse. Responsibilities: Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies. Governs data design/modeling - documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment funded projects, as assigned.
Provides and/or supports data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. Supports assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools. Contributes to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy by design (PII management), all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security feature implementations performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create source-to-target mappings for ETL and BI developers. Demonstrate expertise with data at all levels: low-latency, relational, and unstructured data stores; analytical stores and data lakes; and data streaming (consumption/production) and data in transit. Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing. Partner with the Data Governance team to standardize the classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders. Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization. 
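The source-to-target mapping responsibility above can be sketched as a minimal example of expressing an STM as data that an ETL developer consumes. All source and target field names here are invented for illustration:

```python
# Minimal sketch of a source-to-target mapping (STM) expressed as data.
# Field names and transforms are hypothetical, for illustration only.

RAW_ROW = {"cust_nm": "  Acme Corp ", "rev_usd": "1250.50", "mkt_cd": "na"}

# Each mapping entry: target column -> (source column, transform function)
STM = {
    "customer_name": ("cust_nm", str.strip),
    "revenue":       ("rev_usd", float),
    "market_code":   ("mkt_cd",  str.upper),
}

def apply_mapping(row: dict, stm: dict) -> dict:
    """Produce a target record from a source row using the mapping."""
    return {tgt: fn(row[src]) for tgt, (src, fn) in stm.items()}

target = apply_mapping(RAW_ROW, STM)
print(target)
# {'customer_name': 'Acme Corp', 'revenue': 1250.5, 'market_code': 'NA'}
```

Keeping the mapping as data (rather than hard-coded logic) is one way to make the STM document itself the single source of truth for both developers and reviewers.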
Qualifications: 8+ years of overall technology experience, including at least 4+ years of data modeling and systems architecture. 3+ years of experience with data lake infrastructure, data warehousing, and data analytics tools. 4+ years of experience developing enterprise data models. Experience building solutions in the retail or supply chain space. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models). Experience with integration of multi-cloud services (Azure) with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI). Does the person hired for this job need to be based in a PepsiCo office, or can they be remote?: Employee must be based in a PepsiCo office. Primary Work Location: Hyderabad HUB-IND
0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Senior Data Engineer will be responsible for delivering business analysis and consulting activities for the defined specialism using sophisticated technical capabilities, building and maintaining effective working relationships with a range of customers, ensuring relevant standards are defined and maintained, and implementing process and system improvements to deliver business value. Specialisms: Business Analysis; Data Management and Data Science; Digital Innovation. The Senior Data Engineer will work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. Duties will include attending daily scrums, sprint reviews, retrospectives, backlog prioritisation, and improvements. Will coach, mentor, and support the data engineering squad on the full range of data engineering and solutions development activities, covering requirements gathering and analysis, solution design, coding and development, testing, implementation, and operational support. Will work closely with the Product Owner to understand requirements/user stories and be able to plan and estimate the time taken to deliver them. Will proactively collaborate with the Product Owner, Data Architects, Data Scientists, Business Analysts, and visualisation developers to meet the acceptance criteria. Will be highly skilled and experienced in tools and techniques such as AWS data lake technologies, Redshift, Glue, Spark SQL, and Athena. Years of Experience: 8-12 Essential domain expertise: Experience in big data technologies - AWS, Redshift, Glue, PySpark. Experience of MPP (Massively Parallel Processing) databases helpful - e.g. Teradata, Netezza. Challenges involved in big data - large table sizes (e.g. 
depth/width) and even distribution of data. Programming experience - SQL, Python. Data modelling experience/awareness - Third Normal Form, dimensional modelling. Data pipelining skills - data blending, etc. Visualisation experience - Tableau, Power BI, etc. Data management experience - e.g. data quality, security, etc. Experience of working in a cloud environment - AWS. Development/delivery methodologies - Agile, SDLC. Experience working in a geographically disparate team.
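The dimensional-modelling awareness this posting asks for can be illustrated with a toy star-schema example: a fact table keyed to a dimension table, joined for reporting. Tables, keys, and values below are invented for illustration:

```python
# Toy star-schema sketch: a sales fact table resolved through a product
# dimension, then aggregated for reporting. All data is illustrative.

dim_product = {
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "Gadget", "category": "Electronics"},
}

fact_sales = [
    {"product_id": 1, "qty": 3, "amount": 30.0},
    {"product_id": 2, "qty": 1, "amount": 99.0},
    {"product_id": 1, "qty": 2, "amount": 20.0},
]

def sales_by_category(facts, dim):
    """Aggregate fact rows by an attribute resolved through the dimension."""
    totals = {}
    for row in facts:
        cat = dim[row["product_id"]]["category"]
        totals[cat] = totals.get(cat, 0.0) + row["amount"]
    return totals

print(sales_by_category(fact_sales, dim_product))
# {'Hardware': 50.0, 'Electronics': 99.0}
```

In a warehouse this join-then-aggregate pattern runs in SQL over the fact and dimension tables; the point of the dimensional design is that new report dimensions slot in without restructuring the facts.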
3 - 10 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. 
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Roles & Responsibilities: Total Experience: 3 to 10 years. Languages: Scala/Python 3.x. File System: HDFS. Frameworks: Spark 2.x/3.x (Batch/SQL API), Hadoop, Oozie/Airflow. Databases: HBase, Hive, SQL Server, Teradata. Version Control System: GitHub. Other Tools: Zendesk, JIRA. Mandatory Skill Sets: Big Data, Python, Hadoop, Spark. Preferred Skill Sets: Big Data, Python, Hadoop, Spark. Years of Experience Required: 3-10 years. Education Qualification: BE, B.Tech, MCA, M.Tech. Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering. Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills: Big Data. Optional Skills: Python (Programming Language). Desired Languages (If blank, desired languages not specified) Travel Requirements: Not Specified. Available for Work Visa Sponsorship? No. Government Clearance Required? No. Job Posting End Date
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description: About Us At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us! Global Business Services Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services. Process Overview* Global Finance was set up in 2007 as a part of the CFO Global Delivery strategy to provide offshore delivery to Line of Business and Enterprise Finance functions. 
The capabilities hosted include Legal Entity Controllership, General Accounting & Reconciliations, Regulatory Reporting, Operational Risk and Control Oversight, Finance Systems Support, etc. Our group supports the Finance function for the Consumer Banking, Wealth and Investment Management business teams across Financial Planning and Analysis, period-end close, management reporting, and data analysis. Job Description* The candidate will be responsible for delivering complex and time-critical data mining and analytical requests for Consumer & Small Business Banking lending products such as Credit Cards, and in addition will be responsible for analysis of data for decision making by senior leadership. The candidate will be responsible for data management, data extraction and upload, data validation, scheduling & process automation, report preparation, etc. The individual will play a key role in the team responsible for FP&A data reporting, ad hoc reporting & data requirements, and data analytics & business analysis, and will manage multiple projects in parallel by ensuring adequate understanding of the requirements and delivering data-driven insights and solutions to complex business problems. These projects are time-critical, requiring the candidate to comprehend and evaluate the strategic business drivers to bring in efficiencies through automation of existing reporting packages or code. The work will be a mix of standard and ad hoc deliverables based on dynamic business requirements. Technical competency is essential to build processes which ensure data quality and completeness across all projects/requests related to the business. The core responsibility of this individual is process management to achieve sustainable, accurate, and well-controlled results. The candidate should have a clear understanding of the end-to-end process (including its purpose) and a discipline of managing and continuously improving those processes. 
Responsibilities* Preparation and maintenance of various KPI reporting (consumer lending such as Credit Cards), including performing data- or business-driven deep-dive analysis. Credit Cards rewards reporting, data mining & analytics. Understand business requirements and translate them into deliverables. Support the business on periodic and ad hoc projects related to consumer lending products. Develop and maintain code for data extraction, manipulation, and summarization in tools such as SQL, SAS, and emerging technologies like Tableau and Alteryx. Design solutions, generate actionable insights, optimize existing processes, build tool-based automations, and ensure overall program governance. Managing and improving the work: develop a full understanding of the work processes, maintain a continuous focus on process improvement through simplification, innovation, and use of emerging technology tools, and understand data sourcing and transformation. Managing risk: manage and reduce risk, proactively identify risks, issues, and concerns, and manage controls to help drive responsible growth (e.g., compliance, procedures, data management, etc.); establish a risk culture to encourage early escalation and self-identification of issues. Effective communication: deliver transparent, concise, and consistent messaging while influencing change in the teams. Must be extremely good with numbers, with the ability to present various business/finance metrics, detailed analysis, and key observations to senior business leaders. Requirements* Education* - Master's/Bachelor's degree in Information Technology/Computer Science/MCA, or MBA Finance, with 7-10 years of relevant work experience. Experience Range* 7-10 years of relevant work experience in data analytics, business analysis & financial reporting in the banking or credit card industry. Exposure to consumer banking businesses would be an added advantage. 
Foundational skills* Strong abilities in data extraction, data manipulation, and business analysis, and strong financial acumen. Strong computer skills, including MS Excel, Teradata SQL, SAS, and emerging technologies like Alteryx and Tableau. Prior banking and financial services industry experience, preferably retail banking and Credit Cards. Strong business problem-solving skills, and ability to deliver on analytics projects independently, from initial structuring to final presentation. Strong communication skills (both verbal and written), interpersonal skills, and relationship management skills to navigate the complexities of aligning stakeholders, building consensus, and resolving conflicts. Proficiency in Base SAS, Macros, and SAS Enterprise Guide. Experience in data extraction, transformation & loading using SQL/SAS. Proven ability to manage multiple and often competing priorities in a global environment. Manages operational risk by building strong processes and quality control routines. SQL: querying data from multiple sources. Data quality and governance: ability to clean, validate, and ensure data accuracy and integrity. Troubleshooting: expertise in debugging and optimizing SAS and SQL code. Desired Skills Ability to effectively manage multiple priorities under pressure, deliver, and adapt to changes. Able to work in a fast-paced, deadline-oriented environment. Stakeholder management. Attention to detail: strong focus on data accuracy and documentation. Work Timings* 11:30 am to 8:30 pm (will require stretching 7-8 days in a month to meet critical deadlines) Job Location* Mumbai/Gurugram
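The "querying data from multiple sources" skill above amounts to joining and aggregating across tables with SQL. A minimal, self-contained sketch using Python's built-in sqlite3 (the schema, tables, and values are invented for illustration, not taken from any bank system):

```python
import sqlite3

# Illustrative multi-source query: join an accounts table to a spend table
# and aggregate by segment. All schema and data are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE accounts (acct_id INTEGER, segment TEXT);
    CREATE TABLE spend    (acct_id INTEGER, amount REAL);
    INSERT INTO accounts VALUES (1, 'Retail'), (2, 'Card');
    INSERT INTO spend    VALUES (1, 100.0), (2, 250.0), (2, 50.0);
""")
rows = con.execute("""
    SELECT a.segment, SUM(s.amount) AS total
    FROM accounts a JOIN spend s ON s.acct_id = a.acct_id
    GROUP BY a.segment ORDER BY a.segment
""").fetchall()
print(rows)  # [('Card', 300.0), ('Retail', 100.0)]
```

The same join-and-aggregate shape carries over to Teradata SQL or SAS PROC SQL, which the posting names as the production tools.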
4 - 9 years
0 - 3 Lacs
Chennai, Mumbai (All Areas)
Hybrid
Direct Responsibilities Analyze and interpret requirement and issue specifications received from business analysts or production teams. Participate with analysts to ensure correct understanding and implementation of specifications. Work in collaboration with the project analyst to meet the client's expectations. Take charge of development work according to the priorities defined by the product owner. Propose technical solutions adapted to the business needs and develop technical requirements. Design and develop IT solutions based on the specifications received. Handle packaging and deployment on non-production environments and monitor production releases & deployments. Participate in testing support (system, user acceptance, regression). Bring a high level of quality to developments, in terms of maintainability, testability, and performance. Participate in code reviews set up within the program. Contributing Responsibilities Participate in transversal, capability-building efforts for the bank. Implementation of best practices and coding & development cultures. Work closely as “one team” with all stakeholders to jointly provide high-quality delivery. Work on data-driven issues, innovative technologies (Java 17 or 21 with the Quarkus framework, Kubernetes, Kogito), and in the Finance & Risk functional areas. Technical & Behavioral Competencies Technical (Mandatory Skills): Teradata development (intermediate to expert), knowledge of Linux shell, SQL queries, unit testing. (Optional Skills): Management of temporality in Teradata, knowledge of the Linux environment. Behavioral Skills Ability to work independently and collaborate as part of a team. Rigorous and disciplined, with deep attention to quality of work (a software craftsmanship approach is welcome). Result-oriented, with the ability to meet and respect deadlines. Curious, with the ability to learn and adapt to technological change. Good communication skills. Excellent analytical and problem-solving skills.
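"Management of temporality in Teradata" refers to querying rows by their validity period. Teradata supports this natively with temporal tables; as a hedged sketch of the underlying "as-of" pattern, here is the same idea expressed with sqlite3 and an invented rate-history schema:

```python
import sqlite3

# As-of query over rows carrying validity periods: the core idea behind
# temporal-table queries. Schema, dates, and rates are invented.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE rate_history (
        rate_id INTEGER, rate REAL, valid_from TEXT, valid_to TEXT
    );
    INSERT INTO rate_history VALUES
        (1, 2.5, '2023-01-01', '2023-06-30'),
        (1, 3.0, '2023-07-01', '9999-12-31');
""")
as_of = "2023-08-15"
(rate,) = con.execute(
    "SELECT rate FROM rate_history WHERE rate_id = 1 "
    "AND ? BETWEEN valid_from AND valid_to", (as_of,)
).fetchone()
print(rate)  # 3.0
```

In Teradata the validity columns and the as-of filter can be handled declaratively by the temporal-table machinery rather than by hand-written BETWEEN predicates.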
11 - 13 years
13 - 15 Lacs
Hyderabad
Work from Office
Position Overview Developer with 11-13 years of strong design and development experience to build robust APIs and services using Java and Spring Boot, coupled with hands-on experience in data processing. Has the knowledge and experience to design and implement scalable on-prem/cloud solutions that efficiently manage and leverage large datasets. Proficient in Java/Spring Boot, with demonstrated ability to integrate with different databases and other APIs and services while ensuring security and best practices are followed throughout the development lifecycle. Responsible for the overall design and development of data integration code for the engineering team/asset. Responsible for providing technical knowledge on integration and ETL processes to build and establish coding standards. Create a working agreement within the team and with other stakeholders that are involved. Partner and align with the Enterprise Architect, Product Owner, Production Support, Analyst, PVS testing team, and data engineers to build solutions that conform to defined standards. Perform code reviews and implement suggested improvements. Responsibilities Design, develop, and maintain APIs using Java and Spring Boot, and ensure efficient data exchange between applications. Implement API security measures including authentication, authorization, and rate limiting. Document API specifications and maintain API documentation for internal and external users. Develop integrations with different data sources and other APIs/web services. Develop integrations with IBM MQ and Kafka. Develop/maintain CI/CD pipelines. Perform performance evaluation and application tuning. Monitor and troubleshoot applications for stability and performance. Well versed in the design, development, and unit testing of ETL jobs that read and write data from database tables, flat files, datasets, IBM MQs, Kafka topics, S3 files, etc. Carry out data profiling of source data and generate logical data models (as required or applicable). 
Define, document, and complete System Requirements Specifications, including Functional Requirements, Context Diagrams, Non-Functional Requirements, and Business Rules (as applicable for Sprints to be completed). Create source-to-target mapping documents as required and applicable. Support definition of business requirements, including assisting the Product Owner in writing user stories and acceptance criteria. Support other scrum team members during the following activities (as required or applicable): design of test scenarios and test cases; developing and identifying data requirements for unit, system, and integration tests. Qualifications Required Skills Programming Languages: Proficiency in Java. Web Development: Experience with SOAP and RESTful services. Database Management: Strong knowledge of SQL (Oracle). Version Control: Expertise in using version control systems like Git. CI/CD: Familiarity with CI/CD tools such as GitLab CI and Jenkins. Containerization & Orchestration: Experience with Docker and OpenShift. Messaging Queues: Knowledge of IBM MQ and Apache Kafka. Cloud Services: Familiarity with cloud platforms such as AWS, Azure, or Google Cloud. Adept working experience in the design and development of performance-efficient ETL flows dealing with millions of rows in volume. Must have experience working in a SAFe Agile Scrum project delivery model. Good at writing complex SQL queries to pull data out of RDBMS databases like Oracle, SQL Server, DB2, Teradata, etc. Good working knowledge of Unix scripts. Batch job scheduling software such as CA ESP. Experienced in using CI/CD methodologies. Required Experience & Education Must have 11-13 years of hands-on development of ETL jobs using IBM DataStage version 11 or higher. Experience managing and/or leading a team of developers. Working knowledge of data modelling, solution architecture, normalization, data profiling, etc. 
Adherence to good coding practices and technical documentation, and must be a good team player. Desired Skills Analytical Thinking: Ability to break down complex problems and devise efficient solutions. Debugging: Skilled in identifying and fixing bugs in code and systems. Algorithm Design: Proficiency in designing and optimizing algorithms. Leadership: Proven leadership skills with experience mentoring junior engineers. Communication: Strong verbal and written communication skills. Teamwork: Ability to collaborate effectively with cross-functional teams. Time Management: Competence in managing time and meeting project deadlines. Education Bachelor's degree in Computer Science, Software Engineering, or a related field. A Master's degree is a plus. Certifications Relevant certifications in AWS are a plus. Location & Hours of Work Full-time position, working 40 hours per week. Expected overlap with US hours as appropriate. Primarily based in the Innovation Hub in Hyderabad, India, in a hybrid working model (3 days WFO and 2 days WAH). About Evernorth Health Services
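Among the API security responsibilities this posting lists is rate limiting. A common way to implement it is a token bucket; the minimal sketch below uses illustrative parameters and is not tied to any Spring Boot API (in a real service this logic would sit in a filter or gateway):

```python
import time

# Minimal token-bucket rate limiter, sketching the "rate limiting"
# API security measure. Rate and capacity values are illustrative.
class TokenBucket:
    def __init__(self, rate: float, capacity: int):
        self.rate = rate            # tokens refilled per second
        self.capacity = capacity    # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Consume one token if available; refill based on elapsed time."""
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=1.0, capacity=2)
results = [bucket.allow() for _ in range(3)]  # burst of 3 immediate calls
print(results)  # [True, True, False]
```

The capacity bounds how large a burst a client can send, while the rate bounds sustained throughput; the same two-parameter model underlies most gateway rate-limit configurations.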
11 - 13 years
13 - 15 Lacs
Hyderabad
Work from Office
Position Overview Developer with 11-13 years of strong design and development experience to build robust APIs and services using Java and Spring Boot, coupled with hands-on experience in data processing. Has the knowledge and experience to design and implement scalable on-prem/cloud solutions that efficiently manage and leverage large datasets. Proficient in Java/Spring Boot, with demonstrated ability to integrate with different databases and other APIs and services while ensuring security and best practices are followed throughout the development lifecycle. Create a working agreement within the team and with other stakeholders that are involved. Partner and align with the Enterprise Architect, Product Owner, Production Support, Analyst, PVS testing team, and data engineers to build solutions that conform to defined standards. Perform code reviews and implement suggested improvements. Responsibilities Design, develop, and maintain APIs using Java and Spring Boot, and ensure efficient data exchange between applications. Implement API security measures including authentication, authorization, and rate limiting. Document API specifications and maintain API documentation for internal and external users. Develop integrations with different data sources and other APIs/web services. Develop integrations with IBM MQ and Kafka. Develop/maintain CI/CD pipelines. Perform performance evaluation and application tuning. Monitor and troubleshoot applications for stability and performance. Carry out data profiling of source data and generate logical data models (as required or applicable). Define, document, and complete System Requirements Specifications, including Functional Requirements, Context Diagrams, Non-Functional Requirements, and Business Rules (as applicable for Sprints to be completed). Create source-to-target mapping documents as required and applicable. 
Support definition of business requirements, including assisting the Product Owner in writing user stories and acceptance criteria. Support other scrum team members during the following activities (as required or applicable): design of test scenarios and test cases; developing and identifying data requirements for Unit, System, and Integration tests.
Qualifications
Required Skills
Programming Languages: Proficiency in Java.
Web Development: Experience with SOAP and RESTful services.
Database Management: Strong knowledge of SQL (Oracle).
Version Control: Expertise in using version control systems like Git.
CI/CD: Familiarity with CI/CD tools such as GitLab CI and Jenkins.
Containerization & Orchestration: Experience with Docker and OpenShift.
Messaging Queues: Knowledge of IBM MQ and Apache Kafka.
Cloud Services: Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
Adept working experience in the design and development of performance-efficient ETL flows dealing with millions of rows in volume.
Must have experience working in the SAFe Agile Scrum project delivery model.
Good at writing complex SQL queries to pull data out of RDBMS databases like Oracle, SQL Server, DB2, Teradata, etc.
Good working knowledge of Unix scripts.
Experience with batch job scheduling software such as CA ESP.
Experienced in using CI/CD methodologies.
Required Experience & Education
Must have 11-13 years of hands-on development experience.
Extensive experience developing and maintaining APIs.
Experience managing and/or leading a team of developers.
Working knowledge of data modelling, solution architecture, normalization, data profiling, etc.
Adherence to good coding practices and technical documentation, and must be a good team player.
Desired Skills
Analytical Thinking: Ability to break down complex problems and devise efficient solutions.
Debugging: Skilled in identifying and fixing bugs in code and systems.
Algorithm Design: Proficiency in designing and optimizing algorithms.
Leadership: Proven leadership skills with experience mentoring junior engineers.
Communication: Strong verbal and written communication skills.
Teamwork: Ability to collaborate effectively with cross-functional teams.
Time Management: Competence in managing time and meeting project deadlines.
Education
Bachelor's degree in Computer Science, Software Engineering, or a related field. A Master's degree is a plus.
Certifications
Relevant certifications in AWS are a plus.
Location & Hours of Work
Full-time position, working 40 hours per week. Expected overlap with US hours as appropriate.
Primarily based in the Innovation Hub in Hyderabad, India, in a hybrid working model (3 days WFO and 2 days WAH).
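One of the responsibilities above is implementing API rate limiting. As a language-neutral illustration of the idea (the role itself targets Java/Spring Boot, where libraries typically provide this), here is a minimal token-bucket sketch in Python; the class name and parameters are illustrative only, not part of the posting:

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: allows a burst of `capacity`
    requests, then refills at `rate` tokens per second."""

    def __init__(self, capacity: int, rate: float, clock=time.monotonic):
        self.capacity = capacity
        self.rate = rate
        self.tokens = float(capacity)  # start full
        self.clock = clock             # injectable for testing
        self.last = clock()

    def allow(self) -> bool:
        """Return True if a request may proceed, consuming one token."""
        now = self.clock()
        # Refill tokens for the elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

A bucket with capacity 2 and a refill rate of 1 token per second admits a burst of two requests, then at most one additional request per second.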
Posted 1 month ago
1 - 2 years
3 - 4 Lacs
Bengaluru
Work from Office
About Lowe's
Lowe's Companies, Inc. (NYSE: LOW) is a FORTUNE 50 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe's operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe's supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts, and providing disaster relief to communities in need. For more information, visit Lowes.com.
About the Team
This team is responsible for performing the quantitative analysis and dashboard building needed to help guide key business decisions. This includes applying knowledge of Lowe's data concepts to the creation of relevant analytic designs and making sound, data-driven business recommendations.
Job Summary
This role leverages multiple resources, advanced analytic methodologies, and data streams to support recommendations for business decisions and reporting solutions. With a focus specifically on Pro & Services, this role provides data capture capabilities to support analytics needs for all Pro & Services business areas. This role translates business needs into effective analytics specifications that provide metrics for analytic solutions across various initiatives. This individual executes analytic, reporting, and automation projects with minimal support, taking direction from the manager and senior-level staff, to provide expertise in problem analysis, solution implementation, and ongoing opportunities in the assigned business area. To be successful, the individual in this role must have a solid understanding of analytical techniques, disparate internal and external data sources, and reporting tools and techniques.
Roles & Responsibilities:
Core Responsibilities:
Responsible for providing area-specific business data analytics and for the development and deployment of necessary dashboards and reporting.
Helps gather business requirements and translates them into reporting solutions, analytic tools, and dashboards to deliver actionable data to end users.
Synthesizes findings, prepares reports and presentations, and presents findings to management.
Communicates data-driven insights to leaders by preparing analyses using multiple data sources, translating findings into clear, understandable themes, and identifying complete, consistent, and actionable insights and recommendations.
Develops, configures, and modifies database components within various computing environments, using tools such as SQL and/or Power BI to access, manipulate, and present data.
Years of Experience:
1-2 years of experience using analytic tools (e.g., SQL, Alteryx, Knime, SAS).
1-2 years of experience using data visualization tools (e.g., Power BI, MicroStrategy, Tableau).
1-2 years of experience working with enterprise-level databases (e.g., Hadoop, Teradata, GCP, Oracle, DB2).
Education Qualification & Certifications (optional)
Required Minimum Qualifications:
Bachelor's degree in Business Administration, Finance, Mathematics, or a related field and 1 year of related experience, OR a Master's degree in Business Administration, Finance, Mathematics, or a related field.
Skill Set Required
Primary Skills (must have):
Hands-on experience with analytical tools (e.g., SQL, Alteryx, Knime, SAS).
Experience using data visualization tools (e.g., Power BI, MicroStrategy, Tableau).
Experience working with enterprise-level databases (e.g., Hadoop, Teradata, GCP, Oracle, DB2).
Secondary Skills (desired):
Basic understanding of the retail/home improvement industry.
Posted 1 month ago
4 - 9 years
40 - 45 Lacs
Hyderabad
Work from Office
Description
As a Cloud Data Platform Engineer, you will be responsible for leading all aspects of a database platform. This includes database design, database security, DR strategy, developing standard processes, evaluating new features, and analyzing workloads to identify optimization opportunities at the system and application level. You will drive automation efforts to effectively manage the database platform and build self-service solutions for users. You will also partner with development teams, product managers, and business users to review the solution designs being deployed and provide recommendations to optimize and tune them. This role will also address platform-wide performance and stability issues. We're looking for an individual who loves to take on challenges, tackles problems with imaginative solutions, and works well in collaborative teams to build and support a large Enterprise Data Warehouse.
4+ years of experience with database technologies like Snowflake (preferred), Teradata, BigQuery, or Redshift.
Demonstrated ability working with advanced SQL.
Experience handling DBA functions, DR strategy, data security, governance, and the associated automation and tooling for a database platform.
Experience with object-oriented programming in Python or Java.
Ability to analyze production workloads and develop strategies to run the Snowflake database with scale and efficiency.
Experience in performance tuning, capacity planning, and managing cloud spend and utilization.
Experience with SaaS/PaaS enterprise services on GCP, AWS, or Azure is a plus.
Familiarity with in-memory database platforms like SingleStore is a plus.
Experience with business intelligence (BI) platforms like Tableau, ThoughtSpot, and Business Objects is a plus.
Good communication and interpersonal skills: the ability to interact and work well with members of other functional groups in a project team, and a strong sense of project ownership.
Education & Experience
Bachelor's degree in Computer Science Engineering or IT from a reputed school.
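The role above calls for analyzing production workloads to find optimization opportunities. A common first step is ranking query patterns by total elapsed time. Here is a minimal sketch in Python, assuming the query log is available as (query_text, elapsed_ms) pairs; the data shape is an assumption for illustration (real platforms expose similar data through views such as Snowflake's QUERY_HISTORY):

```python
from collections import defaultdict

def top_query_patterns(query_log, n=3):
    """Rank query texts by total elapsed time to surface the heaviest
    workloads. query_log is a list of (query_text, elapsed_ms) pairs;
    returns up to n (text, total_ms, run_count) tuples, heaviest first."""
    totals = defaultdict(lambda: [0, 0])  # text -> [total_ms, run_count]
    for text, elapsed in query_log:
        totals[text][0] += elapsed
        totals[text][1] += 1
    ranked = sorted(totals.items(), key=lambda kv: kv[1][0], reverse=True)
    return [(text, total, count) for text, (total, count) in ranked[:n]]
```

A query that runs many times with moderate cost can outrank a single expensive one, which is why tuning efforts usually start from aggregate elapsed time rather than single-run duration.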
Posted 1 month ago
8 - 11 years
14 - 19 Lacs
Thiruvananthapuram
Work from Office
We are looking for a skilled professional with 8 to 11 years of industry experience to lead our migration of the data analytics environment from Teradata to Snowflake, focusing on performance and reliability. The ideal candidate will have strong technical expertise in big data engineering and hands-on experience with Snowflake.
Roles and Responsibilities
Lead the migration of data analytics environments from Teradata to Snowflake, emphasizing performance and reliability.
Design and deploy big data pipelines in a cloud environment using Snowflake Cloud DW.
Develop and migrate existing on-prem ETL routines to cloud services.
Collaborate with senior leaders to understand business goals and contribute to workstream delivery.
Design and optimize model code for faster execution.
Work with cross-functional teams to ensure seamless integration of data analytics solutions.
Job Requirements
Minimum 8 years of experience as an architect on analytics solutions.
Strong technical experience with Snowflake, including modeling, schema, and database design.
Experience integrating with third-party tools, ETL, and DBT tools.
Proficiency in programming languages such as Java, Scala, or Python.
Excellent communication skills, both written and verbal, with the ability to communicate complex technical concepts effectively.
Flexible and proactive working style with strong personal ownership of problem resolution.
A computer science graduate or equivalent is required.
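Migrations like the one described above typically include translating Teradata-specific SQL into the ANSI forms Snowflake expects. As a deliberately tiny sketch of that idea, the Python function below expands Teradata keyword shorthand (SEL, DEL, INS); a real migration covers far more (data types, BTEQ scripts, stored procedures), and this function is purely illustrative, not part of any posting:

```python
import re

# Teradata shorthand keywords and their ANSI equivalents.
TERADATA_SHORTHAND = {"SEL": "SELECT", "DEL": "DELETE", "INS": "INSERT"}

def expand_shorthand(sql: str) -> str:
    """Expand Teradata keyword shorthand to ANSI SQL keywords,
    matching whole words only, case-insensitively."""
    def repl(match):
        word = match.group(0)
        return TERADATA_SHORTHAND.get(word.upper(), word)
    return re.sub(r"\b[A-Za-z]+\b", repl, sql)
```

Because the match is on whole words, identifiers that merely contain a shorthand (such as a column named sel_flag) are left untouched.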
Posted 1 month ago
3 - 7 years
10 - 14 Lacs
Hyderabad
Hybrid
Job Title: Data Expert (Teradata & SQL)
Location: Hyderabad
Job Summary:
We are seeking an experienced Data Expert with deep expertise in Teradata and SQL to join our data team. In this role, you will be responsible for designing, developing, optimizing, and maintaining scalable data solutions within our Teradata ecosystem. You will work closely with data engineers, analysts, and business stakeholders to enable high-quality, performant data pipelines and queries that support analytics and decision-making across the organization.
Key Responsibilities:
Develop, optimize, and troubleshoot complex Teradata SQL queries to support business intelligence and analytics initiatives.
Design and implement efficient data models, including star and snowflake schemas, in Teradata.
Monitor, analyze, and tune query performance using Teradata EXPLAIN, DBQL, and statistics collection.
Collaborate with cross-functional teams to understand data requirements and translate them into scalable data solutions.
Develop and maintain ETL/ELT processes using BTEQ, FastLoad, MultiLoad, or Teradata Parallel Transporter (TPT).
Implement data quality checks and validation logic, and ensure consistency across environments.
Document data pipelines, definitions, and data lineage for governance and compliance.
Support data migration and integration projects involving Teradata and other databases (e.g., Oracle, Snowflake, SQL Server).
Automate repetitive tasks using scripting (Shell, Python) and workflow orchestration tools (e.g., Airflow, Control-M).
Required Skills and Qualifications:
3+ years of hands-on experience with Teradata SQL in a production environment.
Strong knowledge of Teradata architecture, indexing strategies (PI, PPI), and query tuning.
Experience with Teradata utilities: BTEQ, FastLoad, MultiLoad, and TPT.
Contact: Ajith D (ajith.d@cgi.com)
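The responsibilities above include implementing data quality checks and validation logic. Here is a minimal sketch of such a check in Python, using lists of dicts as a stand-in for rows fetched from Teradata; the function name and default threshold are illustrative assumptions:

```python
def run_quality_checks(rows, required_columns, max_null_rate=0.1):
    """Basic post-load data quality checks: the table is non-empty and
    the null rate of each required column stays under a threshold.
    rows is a list of dicts (column name -> value); returns a list of
    failure messages, empty when all checks pass."""
    if not rows:
        return ["table is empty"]
    failures = []
    for col in required_columns:
        nulls = sum(1 for row in rows if row.get(col) is None)
        rate = nulls / len(rows)
        if rate > max_null_rate:
            failures.append(
                f"{col}: null rate {rate:.0%} exceeds {max_null_rate:.0%}")
    return failures
```

In practice the same counts would come from an aggregate SQL query against the loaded table; the pure-Python version keeps the validation logic visible.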
Posted 1 month ago
Teradata is a popular data warehousing platform that is widely used by businesses in India. As a result, there is a growing demand for skilled professionals who can work with Teradata effectively. Job seekers in India who have expertise in Teradata have a wide range of opportunities available to them across different industries.
Tech hubs such as Bengaluru, Hyderabad, and Thiruvananthapuram are known for their thriving tech industries and have a high demand for Teradata professionals.
The average salary range for Teradata professionals in India varies based on experience levels. Entry-level roles can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.
In the field of Teradata, a typical career path may involve progressing from roles such as Junior Developer to Senior Developer, and eventually to a Tech Lead position. With experience and skill development, professionals can take on more challenging and higher-paying roles in the industry.
In addition to Teradata expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL tools, and data warehousing concepts. Strong analytical and problem-solving skills are also essential for success in Teradata roles.
As you prepare for interviews and explore job opportunities in Teradata, remember to showcase your skills and experience confidently. With the right preparation and determination, you can land a rewarding role in the dynamic field of Teradata in India. Good luck!