3.0 - 5.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Role Purpose: Liaise between the customer and the Wipro delivery team, bridging the gap by comprehending and analyzing customer requirements and articulating them aptly to delivery teams, thereby ensuring the right solution is proposed to the customer.
Responsibilities:
1. Customer requirements gathering and engagement
- Interface and coordinate with client engagement partners to understand the RFP/RFI requirements
- Detail scope documents, functional and non-functional requirements, features, etc., ensuring all stated and unstated customer needs are captured
- Construct workflow charts and diagrams, study system capabilities, and write specifications after thorough research and analysis of customer requirements
- Engage with internal teams (project managers, pre-sales, tech leads, architects) to design and formulate accurate and timely responses to RFPs/RFIs
- Understand and communicate the financial and operational impact of any changes
- Hold periodic cadence with customers to seek clarifications and feedback with respect to the solution proposed for a particular RFP/RFI, and accordingly instruct the delivery team to make changes to the design
- Empower customers through demonstrations and presentations of the proposed solution/prototype
- Maintain relationships with customers to optimize business integration and lead generation
- Ensure ongoing reviews and feedback from customers to improve and deliver better value (services/products)
2. Engage with the delivery team to ensure the right solution is proposed to the customer
a. Hold periodic cadence with the delivery team to:
- Provide customer feedback/inputs on the proposed solution
- Review test cases to check 100% coverage of customer requirements
- Conduct root cause analysis to understand the proposed solution/demo/prototype before sharing it with the customer
- Deploy and facilitate new change requests to cater to customer needs and requirements
- Support the QA team with periodic testing, giving timely inputs/feedback to ensure solutions meet business needs
- Conduct integration testing and user acceptance demos to validate implemented solutions and ensure a 100% success rate
- Use data modelling practices to analyze findings and design and develop improvements and changes
- Ensure 100% utilization by studying system capabilities and understanding business specifications
- Stitch together the entire response/solution proposed to the RFP/RFI before it is presented to the customer
b. Support the Project Manager/delivery team in delivering the solution to the customer
- Define and plan project milestones, phases, and the different elements involved in the project along with the principal consultant
- Drive and challenge the delivery teams' presumptions about how they will successfully execute their plans
- Ensure customer satisfaction through quality deliverables, on time
3. Build domain expertise and contribute to the knowledge repository
- Engage with other BAs to share expertise and increase domain knowledge across the vertical
- Write whitepapers, research papers, and points of view, and share them with the consulting community at large
- Identify and create use cases from one project/account that can be brought to Wipro level for business enhancement
- Conduct market research for content and development to bring the latest inputs into projects, thereby ensuring customer delight
Performance Parameters and Measures:
1. Customer Engagement and Delivery Management: PCSAT, utilization % achievement, number of leads generated from business interactions, number of errors/gaps in documenting customer requirements, feedback from the project manager, process flow diagrams (quality and timeliness), % of deal solutioning completed within timeline, velocity generated.
2. Knowledge Management: number of whitepapers/research papers written, number of user stories created, % of proposal documentation completed and uploaded into the knowledge repository, number of reusable components developed for proposals during the quarter.
Mandatory Skills: ServiceNow - Platform Core.
Posted 1 month ago
1.0 - 3.0 years
3 - 6 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Locations: Pune/Bangalore/Hyderabad/Indore. Contract duration: 6 months.
Responsibilities:
- Develop conceptual, logical, and physical data models, and implement RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational and dimensional required; NoSQL optional) and data tools (reporting, visualization, analytics, and machine learning)
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs
- Work proactively and independently to address project requirements, and articulate issues/challenges to reduce project delivery risks
- Must have a payments background
Skills:
- Hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols)
- Experience with data warehouses, data lakes, and enterprise big data platforms in multi-data-center contexts required
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, or others) required
- Experience in team management, communication, and presentation
- Experience with Erwin, Visio, or any other relevant tool
Posted 1 month ago
1.0 - 3.0 years
3 - 6 Lacs
Chennai
Work from Office
Skill Set Required: GCP, data modelling (OLTP, OLAP), indexing, DBSchema, CloudSQL, BigQuery.
Data Modeller:
- Hands-on data modelling for OLTP and OLAP systems
- In-depth knowledge of conceptual, logical, and physical data modelling
- Strong understanding of indexing, partitioning, and data sharding, with practical experience applying them
- Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction
- Working experience with at least one data modelling tool, preferably DBSchema
- Functional knowledge of the mutual fund industry is a plus
- Good understanding of GCP databases such as AlloyDB, CloudSQL, and BigQuery
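For illustration, a minimal sketch of the partitioning and clustering this posting asks about, expressed as BigQuery DDL submitted through the google-cloud-bigquery Python client; the dataset, table, and column names are hypothetical.

```python
# Hypothetical sketch: a partitioned, clustered BigQuery table for
# near-real-time reporting. Dataset/table/column names are illustrative.
from google.cloud import bigquery

client = bigquery.Client()  # assumes default project credentials

ddl = """
CREATE TABLE IF NOT EXISTS reporting.fund_transactions (
  txn_id      STRING NOT NULL,
  folio_id    STRING NOT NULL,
  scheme_code STRING,
  amount      NUMERIC,
  txn_ts      TIMESTAMP NOT NULL
)
PARTITION BY DATE(txn_ts)          -- prunes scans to the queried date range
CLUSTER BY folio_id, scheme_code   -- co-locates rows for common filter keys
"""
client.query(ddl).result()  # blocks until the DDL job completes
```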
Posted 1 month ago
7.0 - 9.0 years
5 - 9 Lacs
Chennai
Remote
Key Responsibilities : Design and develop data models to support business intelligence and analytics solutions. Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models. Implement Dimensional Modelling techniques for data warehousing and reporting. Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements. Ensure data integrity, consistency, and compliance with Banking domain standards. Work with Snowflake to develop and optimize cloud-based data models. Write and execute complex SQL queries for data analysis and validation. Identify and resolve data quality issues and inconsistencies. Required Skills & Qualifications : 7+ years of experience in Data Modelling and Data Analysis. Strong expertise in Erwin or Erwin Studio for data modeling. Experience with Dimensional Modelling and Data Warehousing (DWH) concepts. Proficiency in Snowflake and SQL for data management and querying. Previous experience in the Banking domain is mandatory. Strong analytical and problem-solving skills. Ability to work independently in a remote environment. Excellent verbal and written communication skills.
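As a point of reference for the dimensional-modelling and Snowflake skills listed above, a minimal sketch of a star-schema rollup run through the Snowflake Python connector; the connection parameters and fact/dimension names are illustrative assumptions, not part of the posting.

```python
# Hypothetical sketch: validating a star-schema join in Snowflake.
# Connection parameters and table names are illustrative only.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="analyst", password="***",
    warehouse="ANALYTICS_WH", database="EDW", schema="BANKING",
)
try:
    cur = conn.cursor()
    # Fact-to-dimension join typical of dimensional models:
    # daily balances rolled up by branch and product.
    cur.execute("""
        SELECT d.branch_name, p.product_type, SUM(f.balance) AS total_balance
        FROM fact_account_balance f
        JOIN dim_branch  d ON f.branch_key  = d.branch_key
        JOIN dim_product p ON f.product_key = p.product_key
        WHERE f.as_of_date = CURRENT_DATE
        GROUP BY d.branch_name, p.product_type
    """)
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```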
Posted 1 month ago
7.0 - 9.0 years
5 - 9 Lacs
Agra
Remote
Key Responsibilities : Design and develop data models to support business intelligence and analytics solutions. Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models. Implement Dimensional Modelling techniques for data warehousing and reporting. Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements. Ensure data integrity, consistency, and compliance with Banking domain standards. Work with Snowflake to develop and optimize cloud-based data models. Write and execute complex SQL queries for data analysis and validation. Identify and resolve data quality issues and inconsistencies. Required Skills & Qualifications : 7+ years of experience in Data Modelling and Data Analysis. Strong expertise in Erwin or Erwin Studio for data modeling. Experience with Dimensional Modelling and Data Warehousing (DWH) concepts. Proficiency in Snowflake and SQL for data management and querying. Previous experience in the Banking domain is mandatory. Strong analytical and problem-solving skills. Ability to work independently in a remote environment. Excellent verbal and written communication skills.
Posted 1 month ago
7.0 - 9.0 years
5 - 9 Lacs
Kanpur
Remote
Key Responsibilities : Design and develop data models to support business intelligence and analytics solutions. Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models. Implement Dimensional Modelling techniques for data warehousing and reporting. Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements. Ensure data integrity, consistency, and compliance with Banking domain standards. Work with Snowflake to develop and optimize cloud-based data models. Write and execute complex SQL queries for data analysis and validation. Identify and resolve data quality issues and inconsistencies. Required Skills & Qualifications : 7+ years of experience in Data Modelling and Data Analysis. Strong expertise in Erwin or Erwin Studio for data modeling. Experience with Dimensional Modelling and Data Warehousing (DWH) concepts. Proficiency in Snowflake and SQL for data management and querying. Previous experience in the Banking domain is mandatory. Strong analytical and problem-solving skills. Ability to work independently in a remote environment. Excellent verbal and written communication skills.
Posted 1 month ago
12.0 - 18.0 years
50 - 55 Lacs
Pune
Work from Office
Extensive, hands-on knowledge of Data Modelling, Data Architecture, and Data Lineage. Broad knowledge of banking products and financial products (e.g., international trade, credit). Required candidate profile: physical data modelling experience, preferably with cloud (GCP).
Posted 1 month ago
8.0 - 10.0 years
3 - 7 Lacs
Kolkata
Work from Office
Experience: 8-10 years.
Responsibilities:
- Data gathering, data analysis, data modelling, data cleansing, and data formatting
- AS-IS and TO-BE business process analysis and process modelling, including end-to-end data flows
- Authoring the data migration plan/cutover plan
- Supporting change management activities
- Supporting the solution development team with data insights as required
Mandatory Skills/Experience:
- HCM project experience, including upgrade/improvement projects
- People/HR data and process knowledge and experience
- Oracle HCM Cloud skills and experience
- BI dashboard skills and experience (ability to create/update)
- Demonstrated experience developing cutover plans/data migration plans
- Ability to present complex data in an easily consumable format at executive level
Desirable/Preferred Skills:
- RStudio skills and experience, down to code level
- Knowledge of coding languages: SQL, Python, R
Ideal Candidate:
- Strong customer-facing skills; high standard of verbal/written communication
- Self-starter able to work independently and lead data-related work within a large HCM project
- Ability to work across multiple initiatives simultaneously
- Flexibility in work hours with a global project team and business
- Ability to work efficiently and effectively via remote work (preferably on US/Canada time zones)
Location: Remote - Delhi/NCR, Bangalore/Bengaluru, Hyderabad/Secunderabad, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
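A minimal sketch of the kind of data cleansing and exception handling a migration like this involves, using pandas; the file paths and column names are hypothetical.

```python
# Hypothetical sketch: cleansing extracted HR data before migration.
# File paths and column names are illustrative only.
import pandas as pd

df = pd.read_csv("hcm_extract.csv", dtype=str)

df["email"] = df["email"].str.strip().str.lower()
df["hire_date"] = pd.to_datetime(df["hire_date"], errors="coerce")  # bad dates -> NaT

# Flag rows that would fail load validation rather than silently dropping them.
issues = df[df["hire_date"].isna() | df["employee_id"].duplicated(keep=False)]
issues.to_csv("migration_exceptions.csv", index=False)

clean = df.drop(issues.index)
clean.to_csv("hcm_load_ready.csv", index=False)
```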
Posted 1 month ago
2.0 - 6.0 years
12 - 22 Lacs
Hyderabad, Bengaluru
Hybrid
What we ask
Experience: 2-6 years in Data Engineering roles.
Technical Skills:
- Proficiency in SQL, Python, and big data technologies (PySpark, Hive, Hadoop); see the sketch after this posting
- Strong understanding of data pipelines
- Familiarity with data visualization tools
- Good understanding of ETL pipelines
- Good experience in data modelling
Communication Skills:
- Ability to communicate complex technical concepts
- Strong collaborative and team-oriented mindset
We would be excited if you have:
- Excellent communication and interpersonal skills
- Ability to meet deadlines and manage project delivery
- Excellent report-writing and presentation skills
- Critical thinking and problem-solving capabilities
What's in it for you?
A Happy Workplace! We create an environment where everyone feels welcome; we are more than just co-workers, sharing an informal and fun workplace. Our teams are highly adaptive, and our dynamic culture pushes everyone to create success in all dimensions.
Lucrative Packages and Perks: At Indium we recognize your talent and offer competitive salaries, better than market standards. In addition to appraisals, rewards, and recognition programs conducted regularly, we have performance bonuses, sign-offs, and joining bonuses to value your contributions and success for yourself and Indium.
Your Health is a Priority for Us! A healthy and happy workforce is important to us, so we ensure that you and your dependents are covered under our medical insurance policy. From 1:1 counselling sessions for your mental well-being to fun-filled fitness initiatives, we ensure you stay healthy and happy!
Skill Up to Scale Up: We believe in continuous learning as part of our core values, so we provide excellent training initiatives along with access to our mainspring learning platform, Indium Academy, to keep you equipped with the necessary technical skills for greater success.
Work-Life Balance: With a flexi-hybrid working culture, a 5-day work week, and lots of fun de-stressing initiatives, we create a positive and relaxed environment to work in!
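The sketch referenced above: a minimal PySpark batch ETL step writing to a partitioned Hive table, of the kind the skills list describes; paths, table, and column names are assumptions for illustration.

```python
# Hypothetical sketch: a small PySpark ETL step with a basic quality gate.
# Paths, table, and column names are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").enableHiveSupport().getOrCreate()

raw = spark.read.option("header", True).csv("s3a://raw-zone/orders/")

cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount").cast("double") > 0)   # basic quality gate
)

# Write to a partitioned Hive table for downstream reporting.
cleaned.write.mode("overwrite").partitionBy("order_date") \
       .saveAsTable("analytics.orders_clean")
```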
Posted 1 month ago
3.0 - 5.0 years
15 - 20 Lacs
Bengaluru
Remote
Workdays: Mon to Fri.
Job Description: The Data Scientist contributes to the development of the strategy for analytics, reporting, and metrics building, in line with the overall Business Information Services strategy, leveraging visualization tools/business intelligence platforms. Your role will include the following:
- Coordinate with different functional teams to implement models and monitor outcomes
- Develop processes and tools to monitor and analyze model performance and data accuracy (see the sketch after this description)
- Leverage business intelligence tools (e.g., Spotfire) for analysis and review
- Develop custom data models and algorithms to apply to data sets
- Provide specialist support for the development, interpretation, and application of machine learning models
- Apply analytical and statistical methods to solve identified use cases in an agile manner, using various data sources and analytical tools
- Work with key stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions
- Liaise, advise, and negotiate with key customer groups on requirements, proposing innovative solutions to meet their data modelling requirements
- Develop and implement capabilities to make the provisioning of routine requests more structured and efficient
- Contribute to initiatives that enable faster and more effective data modelling/analysis and provisioning
- Ensure systems, code, and documentation are inspection-ready
What's required: We're looking for someone with 3-5 years of experience manipulating data sets and building statistical models, with a Master's or PhD in Statistics, Mathematics, or Computer Science.
- Experience using statistical computing languages, Python and SQL (R considered), to manipulate data and draw insights from large data sets
- Experience working with and creating data architectures
- Experience with clinical data and domains (clinical trials or the wider pharmaceutical industry) preferable but not necessary
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and their proper usage, etc.) and experience applying them
- Excellent written and verbal communication skills for coordinating across teams
- A drive to learn and master new technologies and techniques
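The sketch referenced above: fitting and sanity-checking a simple model with scikit-learn, of the kind the monitoring duties describe. The data here is synthetic; in practice the features, model, and metric would come from the actual use case.

```python
# Hypothetical sketch: fit a classifier and track a holdout metric,
# as a model-performance monitor might. Data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)

# Track discrimination on held-out data; a drop over time signals drift.
print("holdout AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```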
Posted 1 month ago
12.0 - 18.0 years
50 - 55 Lacs
Hyderabad
Work from Office
Extensive, hands-on knowledge of Data Modelling, Data Architecture, and Data Lineage. Broad knowledge of banking products and financial products (e.g., international trade, credit). Physical data modelling experience, preferably with cloud (GCP).
Posted 1 month ago
7.0 - 9.0 years
5 - 9 Lacs
Pune
Work from Office
Key Responsibilities : Design and develop data models to support business intelligence and analytics solutions. Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models. Implement Dimensional Modelling techniques for data warehousing and reporting. Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements. Ensure data integrity, consistency, and compliance with Banking domain standards. Work with Snowflake to develop and optimize cloud-based data models. Write and execute complex SQL queries for data analysis and validation. Identify and resolve data quality issues and inconsistencies. Required Skills & Qualifications : 7+ years of experience in Data Modelling and Data Analysis. Strong expertise in Erwin or Erwin Studio for data modeling. Experience with Dimensional Modelling and Data Warehousing (DWH) concepts. Proficiency in Snowflake and SQL for data management and querying. Previous experience in the Banking domain is mandatory. Strong analytical and problem-solving skills. Ability to work independently in a remote environment. Excellent verbal and written communication skills.
Posted 1 month ago
4.0 - 6.0 years
6 - 10 Lacs
Hyderabad
Work from Office
What you will do
Role Description: The Scrum Master is a leader and coach who facilitates Scrum events and processes, delivering value for the Global Quality Data and Analytics product team. The role involves facilitating communication and collaboration among teams, ensuring alignment with the program vision, managing risks and dependencies, and driving relentless improvement. The Scrum Master helps adapt SAFe to the organization's needs, standardizing and documenting practices. The role requires a solid background in the end-to-end software development lifecycle and a Scaled Agile practitioner, coupled with leadership and transformation experience.
Roles & Responsibilities:
- Lead and manage product delivery using agile frameworks and techniques
- Align with Agile values, such as prioritizing individuals and interactions over processes and tools
- Ensure day-to-day operations by automating tasks, monitoring system health, and minimizing downtime through incident response
- Capture the voice of the customer to define business processes and product needs
- Collaborate with Global Quality business stakeholders, architects, and engineering teams to prioritize release scopes and refine the product backlog
- Lead and facilitate the breakdown of Epics into Features and sprint-sized user stories, and participate in backlog reviews with the development team
- Clearly express features in user stories/requirements so all team members and stakeholders understand how they fit into the product backlog
- Ensure Acceptance Criteria and the Definition of Done are well defined
- Advise on SAFe events, including PI Planning, Scrum of Scrums, and Inspect & Adapt workshops
- Stay focused on software development to ensure it meets requirements, providing proactive feedback to stakeholders
- Help develop and maintain a product roadmap that clearly outlines planned features and enhancements, timelines, and achievements
- Identify and manage risks associated with the systems, requirement validation, and user acceptance
- Develop and maintain documentation of configurations, processes, changes, communication plans, and training plans for end users
- Collaborate with geographically dispersed teams, including those in the US and other international locations
- Foster a culture of collaboration, innovation, and continuous improvement
- Leverage agile tools such as Jira/Jira Align, Smartsheet, and Confluence
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
- Master's degree with 4-6 years of experience in Computer Science/Information Systems with Agile software development methodologies, OR
- Bachelor's degree with 6-8 years of experience in Computer Science/Information Systems with Agile software development methodologies, OR
- Diploma with 10-12 years of experience in Computer Science/Information Systems with Agile software development methodologies
Must-Have Skills:
- Strong understanding of Agile methodologies, particularly the Scaled Agile Framework (SAFe)
- Prior experience with Agile project management tools such as Jira, Confluence, and Jira Align
- Experience guiding teams through Agile events and ensuring adherence to SAFe practices and behaviors
Preferred Qualifications:
- Experience managing product features for PI planning and developing product roadmaps and user journeys
- Experience maintaining SaaS (software as a service) solutions and COTS (commercial off-the-shelf) solutions
- Technical thought leadership; able to communicate technical or complex subject matter in business terms
- Familiarity with GxP computer system validation
- Experience with project planning/data modelling tools such as Smartsheet, Lucid, Miro, etc.
Professional Certifications:
- Certified SAFe Scrum Master or similar (preferred)
- ITIL (preferred)
Soft Skills:
- Able to work under minimal supervision
- Skilled in providing oversight and mentoring team members; demonstrated ability to delegate work effectively
- Excellent analytical and gap/fit assessment skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills
Posted 1 month ago
5.0 - 10.0 years
9 - 13 Lacs
Pune
Work from Office
Key Responsibilities:
- Lead the design, development, and deployment of Oracle Fusion SaaS solutions, particularly in Supply Chain and Finance
- Build and maintain integrations using Oracle Integration Cloud (OIC), REST/SOAP web services, and middleware tools
- Customize and extend Fusion applications using BI Publisher, OTBI, FBDI, HDL, and ADF
- Translate business requirements into technical specifications and detailed solution designs
- Support the full development lifecycle, including change management, documentation, testing, and deployment
- Participate in formal design/code reviews and ensure adherence to coding standards
- Collaborate with IT service providers to ensure quality, performance, and scalability of outsourced work
- Provide Level 3 support for critical technical issues
- Stay current with emerging Oracle technologies and contribute to continuous improvement initiatives
Experience:
- 5+ years of hands-on experience in Oracle Fusion SaaS development and technical implementation
- Proven experience with Oracle Fusion Supply Chain and Finance modules
- Intermediate level of relevant work experience (3-5 years minimum)
Skills & Technical Expertise:
- Strong knowledge of Oracle SaaS architecture, data models, and PaaS extensions
- Proficiency in Oracle Integration Cloud (OIC) and REST/SOAP APIs
- Experience with Oracle tools: BI Publisher, OTBI, FBDI, HDL, ADF
- Ability to analyze and revise existing systems for improvements
- Familiarity with SDLC, version control, and automation tools
Qualifications: Bachelor's degree in Computer Science, Information Technology, Business, or a related field (or equivalent experience). Relevant certifications in Oracle Fusion or related technologies are a plus. Compliance with export controls or sanctions regulations may be required.
Core Competencies: Customer Focus - builds strong relationships and delivers customer-centric solutions. Global Perspective - applies a broad, global lens to problem-solving. Manages Complexity - navigates complex information to solve problems effectively. Manages Conflict - resolves conflicts constructively and efficiently. Optimizes Work Processes - continuously improves processes for efficiency and effectiveness. Values Differences - embraces diverse perspectives and cultures.
Technical Competencies: Solution Design & Configuration - designs scalable, secure, and maintainable solutions. Solution Functional Fit Analysis - evaluates how well components interact to meet business needs. Solution Modeling & Validation Testing - creates models and tests solutions to ensure they meet requirements. Performance Tuning & Data Modeling - optimizes application and database performance.
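For illustration only: a hedged sketch of calling a Fusion SaaS REST resource from Python with the standard requests library. The host, resource path, query parameters, and field names below are assumptions; the instance's REST API catalog is the authority on real endpoints.

```python
# Hypothetical sketch: querying a Fusion SaaS REST resource.
# Host, path, query parameters, and field names are assumptions.
import requests

BASE = "https://my-instance.oraclecloud.com"
resp = requests.get(
    f"{BASE}/fscmRestApi/resources/latest/invoices",   # path is illustrative
    params={"limit": 25},
    auth=("integration_user", "***"),                   # basic auth for the sketch
    timeout=30,
)
resp.raise_for_status()
for item in resp.json().get("items", []):
    # Field names are placeholders; check the resource's actual payload.
    print(item.get("InvoiceNumber"), item.get("InvoiceAmount"))
```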
Posted 1 month ago
6.0 - 11.0 years
8 - 13 Lacs
Gurugram
Work from Office
Primary Skills: Oracle Database, Postgres, database design.
Secondary Skills: Data modelling, performance tuning, ETL processes, automating backup and purging processes.
Skill Justification:
- Database designing, data modelling, and core component implementation: these are fundamental skills for a DBA. Database designing involves creating the structure of the database, data modelling is about defining how data is stored, accessed, and related, and core component implementation ensures that the database is set up correctly and efficiently.
- Data integration and relational data modelling: data integration is crucial for combining data from different sources into a unified view, which is essential for accurate reporting and analysis. Relational data modelling helps in organizing data into tables and defining relationships, which is a core aspect of managing relational databases.
- Optimization and performance tuning: these are critical for ensuring that the database runs efficiently. This involves analyzing and improving query performance, indexing strategies, and resource allocation to prevent bottlenecks and ensure smooth operation.
- Automating backup and purging processes: this is vital for data integrity and storage management. Regular backups protect against data loss, while purging old or unnecessary data helps maintain database performance and manage storage costs.
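A minimal sketch of the backup-and-purge automation described above, assuming PostgreSQL's pg_dump is on PATH and authentication is handled via .pgpass or environment; the paths, database name, and retention window are illustrative.

```python
# Hypothetical sketch: nightly Postgres backup with retention-based purging.
# Paths, database name, and retention window are illustrative.
import subprocess
import time
from pathlib import Path

BACKUP_DIR = Path("/var/backups/pg")
RETENTION_DAYS = 14

stamp = time.strftime("%Y%m%d_%H%M%S")
target = BACKUP_DIR / f"appdb_{stamp}.dump"

# -Fc = custom format: compressed, restorable with pg_restore.
# Assumes credentials come from .pgpass or the environment.
subprocess.run(["pg_dump", "-Fc", "-d", "appdb", "-f", str(target)], check=True)

# Purge dumps past the retention window to control storage costs.
cutoff = time.time() - RETENTION_DAYS * 86400
for old in BACKUP_DIR.glob("appdb_*.dump"):
    if old.stat().st_mtime < cutoff:
        old.unlink()
```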
Posted 1 month ago
9.0 - 13.0 years
40 - 45 Lacs
Kolkata
Work from Office
Knowledge of advanced Excel and programs like Power BI (with DAX), SQL, Tableau, KNIME, and Python; AI/ML and data modelling/ETL; data preparation and deriving insights. Background in business, finance, and merchandise planning. Presentation skills and an aptitude for descriptive analysis.
Posted 1 month ago
12.0 - 15.0 years
8 - 14 Lacs
Pune
Work from Office
Experience in GCP, Big Data, scripting (Python, Java, Ruby), data modelling, and DevOps. Experience: 12+ years.
Posted 1 month ago
7.0 - 9.0 years
14 - 15 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
7+ years of Azure data engineering experience, with proficiency in SQL and at least one programming language (e.g., Python) for data manipulation and scripting.
- Strong experience with PySpark, ADF, Databricks, Data Lake, and SQL; experience with MS Fabric preferable
- Proficiency in data warehousing concepts, methodologies, and implementation
- Strong knowledge of Azure Synapse and Azure Databricks
- Hands-on experience with data warehouse platforms and ETL tools (e.g., Apache Spark)
- Deep understanding of data modelling principles, data integration techniques, and data governance best practices
- Preferable: experience with Power BI; domain knowledge of Finance, Procurement, or Human Capital
Locations: Mumbai, Delhi/NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote.
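For context on the PySpark/ADF/Databricks stack named above, a minimal sketch of a Databricks-style transformation over ADF-landed parquet in ADLS Gen2, written out as a Delta table; the storage paths and column names are assumptions for illustration.

```python
# Hypothetical sketch: curating ADF-landed parquet from ADLS Gen2 into a
# Delta table. Paths and column names are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("finance_mart").getOrCreate()

landed = spark.read.parquet("abfss://landing@mydatalake.dfs.core.windows.net/gl/")

monthly = (
    landed.withColumn("period", F.date_format("posting_date", "yyyy-MM"))
          .groupBy("period", "cost_center")
          .agg(F.sum("amount").alias("total_amount"))
)

# Delta output keeps the curated zone transactional and queryable.
monthly.write.format("delta").mode("overwrite") \
       .save("abfss://curated@mydatalake.dfs.core.windows.net/finance/gl_monthly")
```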
Posted 1 month ago
6.0 - 8.0 years
8 - 10 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.
Key Responsibilities:
- Build scalable ETL pipelines and implement robust data solutions in Azure
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults (see the sketch after this posting)
- Design and maintain a secure and efficient data lake architecture
- Work with stakeholders to gather data requirements and translate them into technical specs
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps
- Monitor data quality, performance bottlenecks, and scalability issues
- Write clean, organized, reusable PySpark code in an Agile environment
- Document pipelines, architectures, and best practices for reuse
Must-Have Skills:
- Experience: 6+ years in data engineering
- Tech stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults
- Core expertise: data warehousing, ETL, data pipelines, data modelling, data governance
- Agile, SDLC, containerization (Docker), clean coding practices
Good-to-Have Skills:
- Event Hubs, Logic Apps
- Power BI
- Strong logic building and a competitive programming background
Contract Details: Role: Senior Data Engineer. Locations: Mumbai, Delhi/NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, India. Duration: 6 months. Email to apply: navaneeta@suzva.com. Contact: 9032956160.
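The sketch referenced above: pulling a storage credential from a Key Vault-backed secret scope in a Databricks notebook before reading from ADLS Gen2. `spark` and `dbutils` are provided by the Databricks runtime; the scope, key, account, and container names are hypothetical.

```python
# Hypothetical Databricks-notebook sketch: Key Vault-backed secret scope
# feeding an ADLS Gen2 account key. Names are illustrative; `spark` and
# `dbutils` come from the Databricks runtime.
sas_key = dbutils.secrets.get(scope="kv-data-platform", key="adls-account-key")

spark.conf.set("fs.azure.account.key.mydatalake.dfs.core.windows.net", sas_key)

df = spark.read.format("delta").load(
    "abfss://curated@mydatalake.dfs.core.windows.net/sales/orders"
)
df.show(5)
```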
Posted 1 month ago
3.0 - 5.0 years
5 - 8 Lacs
Hyderabad
Work from Office
Understanding of Spark core concepts: RDDs, DataFrames, Datasets, Spark SQL, and Spark Streaming. Experience with Spark optimization techniques. Deep knowledge of Delta Lake features such as time travel, schema evolution, and data partitioning. Ability to design and implement data pipelines using Spark, with Delta Lake as the data storage layer. Proficiency in Python/Scala/Java for Spark development and integration with ETL processes. Knowledge of data ingestion techniques from various sources (flat files, CSV, API, database). Understanding of data quality best practices and data validation techniques. Other skills: understanding of data warehouse concepts and data modelling techniques; expertise in Git for code management; familiarity with CI/CD pipelines and containerization technologies. Nice to have: experience using data integration tools like DataStage, Prophecy, Informatica, or Ab Initio.
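A minimal sketch of the two Delta Lake features named above, time travel and schema evolution; it assumes the Delta Lake package is available to the Spark session, and the table path is illustrative.

```python
# Hypothetical sketch: Delta Lake time travel and schema evolution.
# Assumes the delta-spark package is on the classpath; path is illustrative.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder.appName("delta_demo")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

path = "/mnt/delta/events"

# Time travel: read the table as of an earlier version for audit/repro.
v0 = spark.read.format("delta").option("versionAsOf", 0).load(path)

# Schema evolution: append a batch carrying a new column and let Delta
# merge it into the table schema.
new_batch = v0.withColumn("ingest_source", F.lit("api"))
new_batch.write.format("delta").mode("append") \
         .option("mergeSchema", "true").save(path)
```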
Posted 1 month ago
6.0 - 11.0 years
7 - 11 Lacs
Hyderabad
Hybrid
DBA with a strong focus on the following.
Primary Skills: Oracle Database, Postgres, database design.
Secondary Skills: Data modelling, performance tuning, ETL processes, automating backup and purging processes.
Skill Justification:
- Database designing, data modelling, and core component implementation: these are fundamental skills for a DBA. Database designing involves creating the structure of the database, data modelling is about defining how data is stored, accessed, and related, and core component implementation ensures that the database is set up correctly and efficiently.
- Data integration and relational data modelling: data integration is crucial for combining data from different sources into a unified view, which is essential for accurate reporting and analysis. Relational data modelling helps in organizing data into tables and defining relationships, which is a core aspect of managing relational databases.
- Optimization and performance tuning: these are critical for ensuring that the database runs efficiently. This involves analyzing and improving query performance, indexing strategies, and resource allocation to prevent bottlenecks and ensure smooth operation.
- Automating backup and purging processes: this is vital for data integrity and storage management. Regular backups protect against data loss, while purging old or unnecessary data helps maintain database performance and manage storage costs.
Posted 1 month ago
8.0 - 13.0 years
6 - 9 Lacs
Hyderabad
Work from Office
Data Engineering Team
As a Lead Data Engineer for India, you will be accountable for leading the technical aspects of product engineering by being hands-on: working on the enhancement, maintenance, and support of the product your team is working on, within your technology area. You will be responsible for your own hands-on coding, provide design thinking and design solutions, ensure the quality of your team's output, represent your team in product-level technical forums, and ensure your team provides technical input to, and aligns with, the overall product roadmap.
How will you make an impact?
You will work with Engineers in other technology areas to define the overall technical direction for the product in alignment with the Group's technology roadmap, standards, and frameworks; with product owners and business stakeholders to shape the product's delivery roadmap; and with support teams to ensure its smooth operation. You will be accountable for the overall technical quality of the work produced by India, in line with the expectations of stakeholders, clients, and the Group. You will also be responsible for line management of your team of Engineers, ensuring that they perform to the expected levels and that their career development is fully supported.
Key responsibilities:
- Produce quality code: code follows team standards, is structured to ensure readability and maintainability, and goes through review smoothly, even for complex changes; designs respect best practices and are favourably reviewed by peers; critical paths through code are covered by appropriate tests
- Produce quality technical designs: high-level designs/architectures align to the wider technical strategy, presenting reusable APIs where possible and minimizing system dependencies; technical designs follow team and Group standards and frameworks, and are structured to ensure reusability, extensibility, and maintainability; data updates are monitored and complete within SLA
- Operate at a high level of productivity: estimates are consistently challenging but realistic; most tasks are delivered within estimate; complex or larger tasks are delivered autonomously
- Squad collaboration: sprint goals are consistently achieved; demonstrate commitment to continuous improvement of squad activities; the product backlog is consistently well groomed, with a responsible balance of new features and technical debt mitigation
- People management: other Engineers in the squad feel supported in their development; direct reports have meaningful objectives recorded in Quantium's Performance Portal and understand how those objectives relate to business strategy; direct reports' career aspirations are understood and documented, with action plans in place to move towards those goals; direct reports have regular catch-ups to discuss performance, career development, and their ongoing happiness/engagement in their role; any performance issues are identified, documented, and agreed, with realistic remedial plans in place
Key activities:
- Build technical product/application engineering capability in the team, in line with the Group's technical roadmap, standards, and frameworks
- Write polished code, aligned to team standards, including appropriate unit/integration tests
- Review code and test cases produced by others, to ensure changes satisfy the associated business requirement, follow best practices, and integrate with the existing code-base
- Provide constructive feedback to other team members on the quality of code and test cases
- Collaborate with other Lead/Senior Engineers to produce high-level designs for larger pieces of work
- Validate technical designs and estimates produced by other team members
- Merge reviewed code into release branches, resolving any conflicts that arise, and periodically deploy updates to production and non-production environments
- Troubleshoot production problems and raise/prioritize bug tickets to resolve any issues
- Proactively monitor system health and act to report/resolve any issues
- Provide out-of-hours support for periodic ETL processes, ensuring SLAs are met
- Work with business stakeholders and other leads to define and estimate new epics
- Contribute to backlog refinement sessions, helping to break down each epic into a collection of smaller user stories that will deliver the overall feature
- Work closely with Product Owners to ensure the product backlog is prioritized to maximize business value and manage technical debt
- Lead work breakdown sessions to define the technical tasks required to implement each user story
- Contribute to sprint planning sessions, ensuring the team takes a realistic but challenging amount of work into each sprint and each team member will be productively occupied
- Contribute to the team's daily stand-up, highlighting any delays or impediments to progress and proposing mitigations for those issues
- Contribute to sprint review and sprint retro sessions, to maintain a culture of continuous improvement within the team
- Coach/mentor more junior Engineers to support their continuing development
- Set and periodically review delivery and development objectives for direct reports
- Identify each direct report's longer-term career objectives and, as far as possible, factor these into work assignments
- Hold fortnightly catch-ups with direct reports to review progress against objectives, assess engagement, and give them the opportunity to raise concerns about the product or team
- Work through the annual performance review process for all team members
- Conduct technical interviews as necessary to recruit new Engineers
The superpowers you'll be bringing to the team:
1. 8+ years of experience designing, developing, and implementing end-to-end data solutions (storage, integration, processing, access) in Google Cloud Platform (GCP) or similar cloud platforms
2. Strong experience with SQL
3. Values delivering high-quality, peer-reviewed, well-tested code
4. Ability to create ETL/ELT pipelines that transform and process terabytes of structured and unstructured data in real time (see the orchestration sketch after this posting)
5. Knowledge of DevOps functions and ability to contribute to CI/CD pipelines
6. Strong knowledge of data warehousing and data modelling, and techniques like dimensional modelling
7. Strong hands-on experience with BigQuery/Snowflake, Airflow/Argo, Dataflow, Data Catalog, Vertex AI, Pub/Sub, etc., or equivalent products in other cloud platforms
8. Solid grip on programming languages like Python or Scala
9. Hands-on experience manipulating Spark at scale, with true in-depth knowledge of the Spark API
10. Experience working with stakeholders; mentoring experience for juniors in the team is good to have
11. Recognized as a go-to person for high-level designs and estimations
12. Experience working with source control tools (Git preferred), with a good understanding of branching/merging strategies
13. Experience in Kubernetes and Azure will be an advantage
14. Understanding of GNU/Linux systems and Bash/scripting
15. Bachelor's degree in Computer Science, Information Technology, or a related discipline
16. Comfortable working in a fast-moving, agile development environment
17. Excellent problem-solving/analytical skills
18. Good written/verbal communication skills
19. Commercially aware, with the ability to work with a diverse range of stakeholders
20. Enthusiasm for coaching and mentoring junior engineers
21. Experience in leading teams, including line management responsibilities
What could your Quantium Experience look like?
Working at Quantium will allow you to challenge your imagination. You will get to solve complex problems using rigour and precision and by asking great questions, but it also means you can think big, outside the box, and push your problem-solving skills to the max. By joining the Quantium team, you'll get to:
- Forge your path: so many of our team have moved around different teams or offices. You'll be in the driver's seat, and we empower you to make your career your own.
- Find your kind: embrace diversity and connect with your tribe (think foodies, dog lovers, readers, or runners).
- Make an impact: leave your mark. Your contributions resonate, regardless of your role or rank.
On top of the Quantium Experience, you will enjoy a range of great benefits that go beyond the ordinary. Some of these include:
- Flexible work arrangements: achieve work-life balance at your own pace with hybrid and flexible work arrangements.
- Continuous learning: our vision is empowering analytics talent to thrive. The Analytics Community fosters the development of individuals, thought leadership, and technical excellence at Quantium through building strong connections, fostering collaboration, and co-creating best practice.
- Remote working: embrace the opportunity to work outside of your assigned home location for up to 2 months every year.
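The orchestration sketch referenced above: a minimal Airflow 2.x DAG scheduling a nightly ELT step. The DAG id, schedule, and callable are placeholders; the real load logic (e.g., GCS to BigQuery) would go inside the task.

```python
# Hypothetical sketch: a minimal Airflow 2.x DAG for a nightly ELT step.
# DAG id, schedule, and the callable are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def load_to_bigquery():
    # Placeholder for the actual extract/load logic (e.g., GCS -> BigQuery).
    print("loading batch...")

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",   # nightly at 02:00 (Airflow 2.4+ parameter name)
    catchup=False,
) as dag:
    PythonOperator(task_id="load_to_bq", python_callable=load_to_bigquery)
```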
Posted 1 month ago
5.0 - 7.0 years
18 - 20 Lacs
Hyderabad, Bengaluru
Hybrid
Type: Contract-to-Hire (C2H)
Job Summary: We are looking for a skilled PySpark Developer with at least 4 years of hands-on experience (a must) in building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and modern data engineering tools in cloud environments such as AWS.
Key Skills & Responsibilities:
- Strong expertise in PySpark and Apache Spark for batch and real-time data processing
- Experience designing and implementing ETL pipelines, including data ingestion, transformation, and validation
- Proficiency in Python for scripting, automation, and building reusable components
- Hands-on experience with scheduling tools like Airflow or Control-M to orchestrate workflows
- Familiarity with the AWS ecosystem, especially S3 and related file system operations
- Strong understanding of Unix/Linux environments and shell scripting
- Experience with Hadoop, Hive, and platforms like Cloudera or Hortonworks
- Ability to handle CDC (Change Data Capture) operations on large datasets (see the sketch after this posting)
- Experience in performance tuning, optimizing Spark jobs, and troubleshooting
- Strong knowledge of data modeling, data validation, and writing unit test cases
- Exposure to real-time and batch integration with downstream/upstream systems
- Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging
- Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git)
Preferred Skills:
- Experience building or integrating APIs for data provisioning
- Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView
- Familiarity with AI/ML model development using PySpark in cloud environments
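The CDC sketch referenced above: collapsing a change feed to the latest record per business key with a PySpark window, one common way to apply CDC batches. The paths, columns, and the `op` delete-flag convention are assumptions for illustration.

```python
# Hypothetical sketch: applying a CDC feed by keeping the latest change per
# business key. Paths, columns, and the "op" flag convention are illustrative.
from pyspark.sql import SparkSession, functions as F, Window

spark = SparkSession.builder.appName("cdc_apply").getOrCreate()

changes = spark.read.parquet("s3a://cdc-landing/customers/")

latest = Window.partitionBy("customer_id").orderBy(F.col("change_ts").desc())

current = (
    changes.withColumn("rn", F.row_number().over(latest))
           .filter("rn = 1")                  # newest change per key wins
           .filter(F.col("op") != "D")        # drop keys whose last op is delete
           .drop("rn", "op")
)

current.write.mode("overwrite").parquet("s3a://curated/customers/")
```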
Posted 1 month ago
6.0 - 10.0 years
15 - 25 Lacs
Bengaluru, Mumbai (All Areas)
Work from Office
Lead Development: Design, develop, and maintain .NET applications using C# and .NET Core/.NET Framework, with a focus on scalability and performance.
Posted 2 months ago
3.0 - 5.0 years
5 - 7 Lacs
Hyderabad
Work from Office
Role Purpose: Liaise between the customer and the Wipro delivery team, bridging the gap by comprehending and analyzing customer requirements and articulating them aptly to delivery teams, thereby ensuring the right solution is proposed to the customer.
Responsibilities:
1. Customer requirements gathering and engagement
- Interface and coordinate with client engagement partners to understand the RFP/RFI requirements
- Detail scope documents, functional and non-functional requirements, features, etc., ensuring all stated and unstated customer needs are captured
- Construct workflow charts and diagrams, study system capabilities, and write specifications after thorough research and analysis of customer requirements
- Engage with internal teams (project managers, pre-sales, tech leads, architects) to design and formulate accurate and timely responses to RFPs/RFIs
- Understand and communicate the financial and operational impact of any changes
- Hold periodic cadence with customers to seek clarifications and feedback with respect to the solution proposed for a particular RFP/RFI, and accordingly instruct the delivery team to make changes to the design
- Empower customers through demonstrations and presentations of the proposed solution/prototype
- Maintain relationships with customers to optimize business integration and lead generation
- Ensure ongoing reviews and feedback from customers to improve and deliver better value (services/products)
2. Engage with the delivery team to ensure the right solution is proposed to the customer
a. Hold periodic cadence with the delivery team to:
- Provide customer feedback/inputs on the proposed solution
- Review test cases to check 100% coverage of customer requirements
- Conduct root cause analysis to understand the proposed solution/demo/prototype before sharing it with the customer
- Deploy and facilitate new change requests to cater to customer needs and requirements
- Support the QA team with periodic testing, giving timely inputs/feedback to ensure solutions meet business needs
- Conduct integration testing and user acceptance demos to validate implemented solutions and ensure a 100% success rate
- Use data modelling practices to analyze findings and design and develop improvements and changes
- Ensure 100% utilization by studying system capabilities and understanding business specifications
- Stitch together the entire response/solution proposed to the RFP/RFI before it is presented to the customer
b. Support the Project Manager/delivery team in delivering the solution to the customer
- Define and plan project milestones, phases, and the different elements involved in the project along with the principal consultant
- Drive and challenge the delivery teams' presumptions about how they will successfully execute their plans
- Ensure customer satisfaction through quality deliverables, on time
3. Build domain expertise and contribute to the knowledge repository
- Engage with other BAs to share expertise and increase domain knowledge across the vertical
- Write whitepapers, research papers, and points of view, and share them with the consulting community at large
- Identify and create use cases from one project/account that can be brought to Wipro level for business enhancement
- Conduct market research for content and development to bring the latest inputs into projects, thereby ensuring customer delight
Performance Parameters and Measures:
1. Customer Engagement and Delivery Management: PCSAT, utilization % achievement, number of leads generated from business interactions, number of errors/gaps in documenting customer requirements, feedback from the project manager, process flow diagrams (quality and timeliness), % of deal solutioning completed within timeline, velocity generated.
2. Knowledge Management: number of whitepapers/research papers written, number of user stories created, % of proposal documentation completed and uploaded into the knowledge repository, number of reusable components developed for proposals during the quarter.
Mandatory Skills: HC - Payor. Experience: 3-5 Years.
Posted 2 months ago