
1753 Aggregation Jobs - Page 28

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

3.0 years

20 - 23 Lacs

Chennai, Tamil Nadu, India

On-site

We are hiring a detail-oriented and technically strong ETL Test Engineer to validate, verify, and maintain the quality of complex ETL pipelines and Data Warehouse systems. The ideal candidate will have a solid understanding of SQL, data validation techniques, regression testing, and Azure-based data platforms including Databricks.

Key Responsibilities: Perform comprehensive testing of ETL pipelines, ensuring data accuracy and completeness across systems. Validate Data Warehouse (DWH) objects including fact and dimension tables. Design and execute test cases and test plans for data extraction, transformation, and loading processes. Conduct regression testing to validate enhancements and ensure no breakage of existing data flows. Work with SQL to write complex queries for data verification and backend testing. Test data processing workflows in Azure Data Factory and Databricks environments. Collaborate with developers, data engineers, and business analysts to understand requirements and raise defects proactively. Perform root cause analysis for data-related issues and suggest improvements. Create clear and concise test documentation, logs, and reports.

Required Technical Skills: Strong knowledge of ETL testing methodologies and tools. Excellent skills in SQL (joins, aggregation, subqueries, performance tuning). Hands-on experience with Data Warehousing and data models (Star/Snowflake). Experience in test case creation, execution, defect logging, and closure. Proficiency in regression testing, data validation, and data reconciliation. Working knowledge of Azure Data Factory (ADF), Azure Synapse, and Databricks. Experience with test management tools such as JIRA, TestRail, or HP ALM.

Nice to Have: Exposure to automation testing for data pipelines. Scripting knowledge in Python or PySpark. Understanding of CI/CD in data testing. Experience with data masking, data governance, and privacy rules.

Qualifications: Bachelor’s degree in Computer Science, Information Systems, or a related field. 3+ years of hands-on experience in ETL/Data Warehouse testing. Excellent analytical and problem-solving skills. Strong attention to detail and communication skills.

Skills: regression, azure, data reconciliation, test management tools, data validation, azure databricks, etl testing, data warehousing, dwh, etl pipeline, test case creation, azure data factory, test cases, etl tester, regression testing, sql, databricks
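For context on the kind of SQL-driven verification this role describes, here is a minimal, hypothetical sketch of a source-to-target reconciliation check. It uses Python's built-in sqlite3 purely as a stand-in for the actual warehouse; the table and column names are invented.

```python
# Minimal sketch: source-to-target reconciliation with a row count and an
# aggregate checksum, using sqlite3 as a stand-in for the real warehouse.
# Table/column names (stg_orders, dw_orders, amount) are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO dw_orders  VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

checks = {
    "row_count": "SELECT (SELECT COUNT(*) FROM stg_orders) = (SELECT COUNT(*) FROM dw_orders)",
    "amount_sum": "SELECT (SELECT ROUND(SUM(amount), 2) FROM stg_orders) = (SELECT ROUND(SUM(amount), 2) FROM dw_orders)",
}
for name, sql in checks.items():
    passed = bool(conn.execute(sql).fetchone()[0])
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
```

In practice each check would be one entry in a larger suite and logged to the team's test management tool rather than printed.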

Posted 1 month ago

Apply

3.0 years

20 - 23 Lacs

Gurugram, Haryana, India

On-site

We are hiring a detail-oriented and technically strong ETL Test Engineer to validate, verify, and maintain the quality of complex ETL pipelines and Data Warehouse systems. The ideal candidate will have a solid understanding of SQL, data validation techniques, regression testing, and Azure-based data platforms including Databricks.

Key Responsibilities: Perform comprehensive testing of ETL pipelines, ensuring data accuracy and completeness across systems. Validate Data Warehouse (DWH) objects including fact and dimension tables. Design and execute test cases and test plans for data extraction, transformation, and loading processes. Conduct regression testing to validate enhancements and ensure no breakage of existing data flows. Work with SQL to write complex queries for data verification and backend testing. Test data processing workflows in Azure Data Factory and Databricks environments. Collaborate with developers, data engineers, and business analysts to understand requirements and raise defects proactively. Perform root cause analysis for data-related issues and suggest improvements. Create clear and concise test documentation, logs, and reports.

Required Technical Skills: Strong knowledge of ETL testing methodologies and tools. Excellent skills in SQL (joins, aggregation, subqueries, performance tuning). Hands-on experience with Data Warehousing and data models (Star/Snowflake). Experience in test case creation, execution, defect logging, and closure. Proficiency in regression testing, data validation, and data reconciliation. Working knowledge of Azure Data Factory (ADF), Azure Synapse, and Databricks. Experience with test management tools such as JIRA, TestRail, or HP ALM.

Nice to Have: Exposure to automation testing for data pipelines. Scripting knowledge in Python or PySpark. Understanding of CI/CD in data testing. Experience with data masking, data governance, and privacy rules.

Qualifications: Bachelor’s degree in Computer Science, Information Systems, or a related field. 3+ years of hands-on experience in ETL/Data Warehouse testing. Excellent analytical and problem-solving skills. Strong attention to detail and communication skills.

Skills: regression, azure, data reconciliation, test management tools, data validation, azure databricks, etl testing, data warehousing, dwh, etl pipeline, test case creation, azure data factory, test cases, etl tester, regression testing, sql, databricks
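As another illustration of the data reconciliation skill listed above, the hedged sketch below compares a source extract with the loaded target using pandas; the column names and values are invented.

```python
# Illustrative data reconciliation between a source extract and the loaded
# target using pandas; column names and figures are hypothetical.
import pandas as pd

source = pd.DataFrame({"customer_id": [1, 2, 3], "balance": [500.0, 120.0, 75.0]})
target = pd.DataFrame({"customer_id": [1, 2, 3], "balance": [500.0, 125.0, 75.0]})

# Outer merge keeps rows missing on either side; the indicator column flags them.
merged = source.merge(target, on="customer_id", how="outer",
                      suffixes=("_src", "_tgt"), indicator=True)
mismatches = merged[(merged["_merge"] != "both") |
                    (merged["balance_src"] != merged["balance_tgt"])]
print(mismatches if not mismatches.empty else "All records reconciled")
```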

Posted 1 month ago

Apply

3.0 years

20 - 23 Lacs

Pune, Maharashtra, India

On-site

We are hiring a detail-oriented and technically strong ETL Test Engineer to validate, verify, and maintain the quality of complex ETL pipelines and Data Warehouse systems. The ideal candidate will have a solid understanding of SQL, data validation techniques, regression testing, and Azure-based data platforms including Databricks.

Key Responsibilities: Perform comprehensive testing of ETL pipelines, ensuring data accuracy and completeness across systems. Validate Data Warehouse (DWH) objects including fact and dimension tables. Design and execute test cases and test plans for data extraction, transformation, and loading processes. Conduct regression testing to validate enhancements and ensure no breakage of existing data flows. Work with SQL to write complex queries for data verification and backend testing. Test data processing workflows in Azure Data Factory and Databricks environments. Collaborate with developers, data engineers, and business analysts to understand requirements and raise defects proactively. Perform root cause analysis for data-related issues and suggest improvements. Create clear and concise test documentation, logs, and reports.

Required Technical Skills: Strong knowledge of ETL testing methodologies and tools. Excellent skills in SQL (joins, aggregation, subqueries, performance tuning). Hands-on experience with Data Warehousing and data models (Star/Snowflake). Experience in test case creation, execution, defect logging, and closure. Proficiency in regression testing, data validation, and data reconciliation. Working knowledge of Azure Data Factory (ADF), Azure Synapse, and Databricks. Experience with test management tools such as JIRA, TestRail, or HP ALM.

Nice to Have: Exposure to automation testing for data pipelines. Scripting knowledge in Python or PySpark. Understanding of CI/CD in data testing. Experience with data masking, data governance, and privacy rules.

Qualifications: Bachelor’s degree in Computer Science, Information Systems, or a related field. 3+ years of hands-on experience in ETL/Data Warehouse testing. Excellent analytical and problem-solving skills. Strong attention to detail and communication skills.

Skills: regression, azure, data reconciliation, test management tools, data validation, azure databricks, etl testing, data warehousing, dwh, etl pipeline, test case creation, azure data factory, test cases, etl tester, regression testing, sql, databricks
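To illustrate the regression-testing angle of this role, here is a hypothetical pytest sketch that re-runs data-quality checks against a warehouse table; sqlite3 stands in for the real database, and the table and checks are invented.

```python
# Sketch of regression-style data quality tests with pytest; the table,
# queries, and expected values are hypothetical.
import sqlite3
import pytest

@pytest.fixture()
def conn():
    c = sqlite3.connect(":memory:")
    c.executescript("""
        CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, country TEXT NOT NULL);
        INSERT INTO dim_customer VALUES (1, 'IN'), (2, 'US');
    """)
    yield c
    c.close()

@pytest.mark.parametrize("check_sql, description", [
    ("SELECT COUNT(*) FROM dim_customer WHERE country IS NULL", "no NULL countries"),
    ("SELECT COUNT(*) - COUNT(DISTINCT customer_id) FROM dim_customer", "customer_id is unique"),
])
def test_dim_customer_quality(conn, check_sql, description):
    # Each check must return 0 violations; a failure indicates a regression.
    assert conn.execute(check_sql).fetchone()[0] == 0, description
```

Running the same parametrized suite after every pipeline change is one lightweight way to catch breakage of existing data flows.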

Posted 1 month ago

Apply

6.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

It's a fintech company that provides an account aggregation service. This means that it helps connect and gather financial data from different sources like your bank accounts, credit cards, investments, and loans into one unified platform. Think of it as a smart tool that collects all your financial information from various institutions and organizes it in one place. Role Overview: We are seeking a hands-on Test Engineer with 6-7 years of experience, capable of developing and implementing robust testing strategies. The ideal candidate should demonstrate expertise in automating API, web, and mobile tests, ensuring the delivery of high-quality software products. Key Responsibilities: • Design, develop, and execute detailed test plans and strategies. • Automate API tests to ensure comprehensive backend validation. • Develop and maintain automated UI tests for both web and mobile applications. • Identify, document, and track defects, performing root cause analysis to support resolutions. • Collaborate effectively with software development teams to verify feature accuracy and functionality. • Continuously enhance test automation frameworks, scripts, and coverage. Required Skills and Experience: • 6-7 years of hands-on experience in software testing and automation. • Proficiency in API testing tools such as Postman, RestAssured, Karate, or similar frameworks. • Solid expertise in web UI testing frameworks such as Selenium, Cypress, or Playwright. • Strong experience in mobile automation testing tools such as Appium, Espresso (Android), or XCUITest (iOS). • Proficient in Behavior-Driven Development (BDD) tools like Cucumber and familiarity with Gherkin language for test scenario writing. • Experience in developing test plans, execution strategies, and comprehensive reporting mechanisms. • Familiarity with agile methodologies, CI/CD pipelines, and related technologies.
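The listing names Java-centric API test tools (Postman, RestAssured, Karate); purely for illustration, here is a minimal Python sketch of the same idea using the requests library against a public echo endpoint (httpbin.org), which is an assumption rather than the company's actual API.

```python
# Minimal API test sketch using Python's requests library. The endpoint is
# https://httpbin.org, a public echo service used here only for illustration;
# a real suite would target the product's own APIs and run under a test runner.
import requests

def test_get_returns_expected_payload():
    resp = requests.get("https://httpbin.org/get",
                        params={"account_id": "42"}, timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    # The echo service reflects query parameters back under "args".
    assert body["args"]["account_id"] == "42"

if __name__ == "__main__":
    test_get_returns_expected_payload()
    print("API smoke test passed")
```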

Posted 1 month ago

Apply

8.0 - 15.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

About the Role: We are looking for a seasoned legal professional to join a leading FinTech company’s Legal & Governance team. This is a non-litigation role, focusing on contract management and legal advisory work across digital payments and financial services.

Key Responsibilities: Drafting, review, negotiation and standardization of various new, bespoke and existing templates to meet ongoing business and product requirements. High diligence and professional acumen to foresee and manage contractual, governance, regulatory and reputational risks for the company and its group companies. Training relevant employees on important legal developments on a periodic basis.

Deliverables: Negotiates, drafts and advises on legal agreements, business SLAs and end-to-end contract management for the Business and Product teams. Should be able to work independently, guide and manage the legal team to achieve desired results, and counsel cross-functional teams in a fast-paced and diverse environment.

Experience Required: 8 to 15 years of post-qualification experience in Contract Management, Product Advisory and Governance, either as in-house counsel or external/law firm counsel or both. Good background in FinTech business, the Companies Act, 2013, RBI (PAPG and KYC) & FEMA regulations, Data Privacy Laws (including the DPDP Act, 2023 and GDPR), and Contract Management work in the Digital Payments, BBPS and financial services space. Experience in developing and implementing processes for efficient and streamlined Legal operations, especially for Contract Management, including optimizing the contracting process, TAT expectations, stamping, e-sign, repository access, etc. Strong drafting, communication and influence-building skills with a track record of leading teams. Exposure to a variety of contracts: complex IT/software development, e-commerce marketplace, payments and real estate, as well as a variety of vendor, supplier, employment, service and partner contracts.

Skills Required: Well versed with legislative and regulatory changes and able to improve contract versions/templates, particularly with IT and Data Protection laws, the Contract Act, and RBI PAPG and BBPS guidelines. Drive high-stakes and complex contracts, including negotiations and contract finalization. Committed and ready to work under constraints/pressure. Ability to work in a highly dynamic environment and collaborate with various internal and external stakeholders. Drafting, review and advisory experience in documentation and policy-making work related to Digital Payments (such as payment aggregation, gateways, TPAPs, e-commerce marketplaces). Develop commercial understanding of the businesses and product lines for risk assessment of mandatory legal & compliance requirements. Evaluate the impact of any new changes in laws and regulations, anticipate issues and proactively solve for open gaps, and eliminate or mitigate risks. Good analytical, research and multi-tasking abilities to provide time-sensitive support on legal and regulatory issues in connection with business operations. High degree of professional ethics, integrity, and analytical, interpersonal and judgement skills.

Qualification Required: Law graduate (3/5-year program) from a reputed institution. Company Secretary qualification (preferred but not mandatory). Strong academic background (1st class in SSC and HSC).

Relevant Industry: Banking / Online Payment Gateway (PSO) / Fintech registered with RBI or NPCI / Financial services (NBFC or Insurance).

Location: Andheri (W), Mumbai.

Kindly Note: This is a strictly non-litigation profile.
Only candidates currently based in Mumbai should apply (No relocation or remote option).

Posted 1 month ago

Apply

5.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Role: Network Admin
Years of Experience: 5 to 7 years
Location: Pune & Chennai

Aruba Wireless Controller; Cisco, Aruba, HP, Juniper LAN Switches; Aruba Specialist. Hands-on network/infrastructure experience with Aruba and Juniper LAN switches. Experience with design, implementation and troubleshooting of Aruba wireless controllers, Aruba ClearPass, Aruba AirWave and Aruba/HP switches. Troubleshooting experience in Juniper switches. Experience in Infoblox (DHCP/DNS/IPAM). Ability to design, deploy and troubleshoot IP and wireless networks, including enterprise IP networking, IP network security, authentication, certificates, remote access, network access controls, and IP network management. Experience and understanding of LAN/WAN architectures, designs and troubleshooting. Working knowledge of the operation of Ethernet LAN switching protocols and standards including VLANs, VLAN aggregation, EtherChannel, PVLANs, Spanning Tree & Rapid Spanning Tree, and 802.1Q. Experience performing configuration and troubleshooting of Aruba & Juniper switching platforms. Deploying wired and wireless Authentication, Authorization and Accounting. Exposure to network management tools. Experience in maintaining network performance. Exposure to network security. Knowledge of ITIL standards: Incident, Change & Problem Management. Exposure to problem solving, handling problem tickets, etc. Should be certified in relevant technologies (CCNP R&S). Good communication & interpersonal skills.
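Not part of the posting, but as a small illustration of the IP design work mentioned above, this sketch uses Python's standard ipaddress module to carve per-VLAN subnets out of a hypothetical site block.

```python
# Small illustration of IP planning with the standard-library ipaddress module:
# carving per-VLAN subnets out of a site block. VLAN names and the block are invented.
import ipaddress

site_block = ipaddress.ip_network("10.20.0.0/22")
vlans = ["corp-wifi", "voice", "printers", "management"]

# Split the /22 into four /24s, one per VLAN.
for name, subnet in zip(vlans, site_block.subnets(new_prefix=24)):
    usable_hosts = subnet.num_addresses - 2
    print(f"VLAN {name}: {subnet} ({usable_hosts} usable hosts, "
          f"gateway {subnet.network_address + 1})")
```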

Posted 1 month ago

Apply

5.0 years

0 Lacs

India

On-site

Company Description Irisidea is a software products and solutions company that focuses on digital transformation through Data Engineering, AI/ML, IoT, Mobile Applications, and B2C/B2B eCommerce. The company was founded in 2010 and is known for its innovative work in real-time data streaming, processing, and analytics using open-source technologies. Irisidea operates delivery centers in Ontario (Canada) and Bangalore (India), with partners in the USA and Russia. Role Description : MEAN Stack Developer (with Firebase & GCP) This is a full-time role for an Angular & Firebase Developer at Irisidea. The developer will be responsible for front-end and back-end web development, software development, utilizing technologies such as JavaScript and Angular to create innovative solutions. The role involves collaborating with the team on various projects to drive digital transformation for organizations. Job Title: MEAN Stack Developer Experience: 5+ Years Location: Bangalore (On-site/Hybrid as per company policy) Notice Period: Immediate to 15 Days Key Responsibilities: Develop and maintain scalable web applications using the MEAN stack. Design RESTful APIs and integrate with front-end components. Work with Firebase & Firestore for real-time data operations. Implement Google login authentication and related security flows. Collaborate with product, design, and QA teams to deliver high-quality software. Optimize applications for performance and scalability. Participate in code reviews, unit testing, and documentation. Must-Have Skills: Angular (latest versions) – component-based architecture, RxJS. NodeJS – asynchronous programming, event-driven architecture. ExpressJS – middleware integration, REST API development. RESTful API – design and integration. Firebase & Firestore – real-time databases, cloud functions. Google Login Integration – OAuth2 implementation for secure login. Good to Have: Google Cloud Platform (GCP) – App Engine, Cloud Functions, Firebase hosting. Single Sign-On (SSO) – Integration with third-party identity providers. MongoDB – schema design, aggregation pipelines, indexing.

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

As the global leader in high-speed connectivity, Ciena is committed to a people-first approach. Our teams enjoy a culture focused on prioritizing a flexible work environment that empowers individual growth, well-being, and belonging. We’re a technology company that leads with our humanity—driving our business priorities alongside meaningful social, community, and societal impact.

Job Description

The communications networking industry is undergoing transformative change, and Ciena is leading the charge with a fresh wave of innovation uniting optical and packet technologies. As a trusted partner to over 1,000 customers in 60+ countries, Ciena builds networks that fundamentally change the way our customers compete. Ciena’s Routing and Switching business plays an instrumental role in our strategy and is bringing products and solutions to market that win with the pillars of speed – service delivery, agility, and assurance.

What We’re Looking For

We believe in the power of people. We are a network strategy and technology company that is motivated by making a difference in people’s lives – their productivity, their creativity and comfort. That’s why our engineers design and implement innovative technologies that allow people to do great things. With a large customer base and proven capabilities, Ciena’s Routing and Switching software team is focusing on the design and development of next-generation Routing/Aggregation Platforms with enhanced Routing and MPLS capabilities, to take advantage of worldwide market opportunities as Service Providers, Enterprises and Datacentres virtualize their networks, unify their legacy networks and prepare for 5G. We are looking for motivated Software Engineers with a penchant for learning to join our team.

What will you do at Ciena?

The candidate will be involved in the design and development of software for our next generation Networking Platforms in Routing and Switching. This position will focus on software design and implementation. As part of a software development team, you will be introduced to new technologies, design practices, development and debugging techniques that will help you grow both professionally and personally. It is a unique opportunity to apply and enhance your engineering knowledge and skills in an environment dedicated to maintaining industry superiority. The candidate will be expected to provide a basic assessment of their own work, including effort estimation, status reporting, etc. The candidate will get ample exposure for continuous learning and development of skills in the related domain.

Required Skills

Good understanding of Software Design principles and “C” programming. Theoretical/practical exposure to networking protocols (Ethernet switching, Layer-3 routing, MPLS) is desired. Good design, testing, debugging and documentation skills. Excellent verbal and written communication skills. Self-starter with a strong “can do” belief, who can work in a dynamic environment with little supervision.

Education And Experience

Bachelor’s/Master’s degree in Computer Science or Electronics and Communication.

Not ready to apply? Join our Talent Community to get relevant job alerts straight to your inbox. At Ciena, we are committed to building and fostering an environment in which our employees feel respected, valued, and heard. Ciena values the diversity of its workforce and respects its employees as individuals. We do not tolerate any form of discrimination. Ciena is an Equal Opportunity Employer, including disability and protected veteran status.
If contacted in relation to a job opportunity, please advise Ciena of any accommodation measures you may require.
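As a toy illustration of the Layer-3 routing concepts this role touches on, the sketch below performs a longest-prefix-match lookup with Python's standard library; the routes and next hops are invented, and real platforms implement this in hardware.

```python
# Toy longest-prefix-match lookup, the core of a Layer-3 forwarding decision,
# using only the standard library. Routes and next-hop names are made up.
import ipaddress

routes = {
    ipaddress.ip_network("10.0.0.0/8"): "core-1",
    ipaddress.ip_network("10.1.0.0/16"): "agg-2",
    ipaddress.ip_network("0.0.0.0/0"): "upstream",
}

def lookup(dst: str) -> str:
    addr = ipaddress.ip_address(dst)
    matches = [net for net in routes if addr in net]
    best = max(matches, key=lambda net: net.prefixlen)  # longest prefix wins
    return routes[best]

print(lookup("10.1.2.3"))   # agg-2
print(lookup("10.9.9.9"))   # core-1
print(lookup("8.8.8.8"))    # upstream (default route)
```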

Posted 1 month ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Responsibilities Description Build & maintain data aggregation processes and dashboards, creating interactive and compelling visual, numeric, and written summaries. Ensures data governance policies are followed. Train & coach business users on BI tools. Promote data literacy and a self-serve culture across the enterprise. Analyze business intelligence data to inform business and product decisions Blend historical data from available industry reports, public information, field reports or purchased sources as input to analyses. Identifying and analyzing industry/geographic trends and competitor market strategies and monitoring current/potential customer trends. Partners with other areas of the business (e.g., Marketing, Logistics, Customer Service, etc.) to model the outcome of implementing potential business strategies Qualifications Bachelor's Degree preferred; Associate's degree required. Bachelor's Degree + 4-6 years analytics, consulting, project management, or equivalent experience preferred Applies advanced knowledge of job area typically obtained through advanced education and work experience. Managing projects / processes, working independently with limited supervision. Coaching and reviewing the work of lower level professionals. Problems faced are difficult and sometimes complex. Typically uses dedicated business intelligence applications and/or cloud services (e.g., Domo, Looker, PowerBI, Qlik, Tableau, SiSense), but may also leverage the reporting capabilities of existing ERP, CRM and/or database software vendors (e.g., Oracle BI, Salesforce, SAP Business Objects, Pentaho, SSRS) Additional Information Power BI, SQL, Python, SAS Programming(Good to have)
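For a flavor of the aggregation behind a typical dashboard tile, here is a small, hypothetical pandas example (the posting lists SQL and Python among the tools); the data is synthetic.

```python
# Sketch of the kind of aggregation behind a BI dashboard tile: monthly revenue
# by region, built with pandas. The figures are synthetic.
import pandas as pd

orders = pd.DataFrame({
    "month":   ["2024-01", "2024-01", "2024-02", "2024-02"],
    "region":  ["North", "South", "North", "South"],
    "revenue": [1200.0, 800.0, 1500.0, 950.0],
})

# Pivot to a month-by-region matrix with row/column totals (margins).
summary = orders.pivot_table(index="month", columns="region",
                             values="revenue", aggfunc="sum", margins=True)
print(summary)
```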

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Gurgaon, Haryana, India AXA XL is the Property & Casualty risk division of AXA, created from the acquisition of XL Catlin in 2018. We partner with mid-sized to multinational clients to take their business further, with more than 30 lines of business to offer solutions for existing and emerging risks. With 9,000+ colleagues based in approximately 30 countries; we are able to serve clients in more than 200 countries. The Natural Catastrophe Data Analyst will be a part of the AXA XL team, fully dedicated to supporting AXA Group Risk Management in the effective assessment and management of natural catastrophe risks. This role involves collaborating with various stakeholders to enhance our understanding of natural hazards and their impact on AXA’s portfolio. You will play a key role in supporting the continuous improvement of the CAT modeling process, leveraging data analysis and programming skills to contribute to the development of methodologies that assess natural catastrophe exposure. Your work will directly support decision-making processes related to risk management and Group Reinsurance strategies, ensuring that AXA remains resilient in the face of evolving risks. What You’ll Be DOING What will your essential responsibilities include? Acquire a comprehensive understanding of AXA's natural catastrophe risk landscape across all entities, including exposure data, geographical, and financial information. Assist in the continuous improvement of the CAT modeling process, including evaluating and gaining insights into CAT modeling software and their releases (e.g., RMS, Verisk, ICM) while contributing to the development of methodologies for risk assessment. Participate in the review of ICM codes developed by the GRM team and perform sensitivity analyses to evaluate and quantify model uncertainties. Support the development of sophisticated natural hazard perils (floods, hailstorms, wildfires, windstorms, tropical cyclones, earthquakes, landslides…) from hazard to vulnerability modules. Collaborate with all entities involved in natural catastrophe risk to gather relevant underwriting data and effectively communicate modeling developments and outcomes. Contribute to the creation of data products that support exposure management, including building exposure databases and implementing disaggregation methodologies, as well as engaging in ‘Live CAT’ (live assessment of CAT events) activities. Participate in the development of internal IT applications for exposure management (Clymene), CAT modeling (NOTUS), aggregation (RHEA), and GIS visualization (GAIA), ensuring seamless integration and functionality across platforms. Contribute to the optimization of Group Reinsurance strategies to ensure adequate coverage of natural catastrophe risks. Assist in conducting technical analyses and portfolios to support risk management initiatives related to natural catastrophes. Contribute to technical documentation and adopt best knowledge management practices to comply with regulatory environments (Solvency II, GDPR). You will report to Vice President, Insurance Operations. What You Will BRING We’re looking for someone who has these abilities and skills: Required Skills And Abilities Bachelor’s degree or equivalent in a relevant field (e.g., Actuary, Data Science, Applied Mathematics). Relevant years of working experience, preferably in risk management or related fields. Robust Data and programming skills: collection, transformation and mining (R/Python, SQL). 
Excellent interest in Climate Change, Natural Hazard science and business fields related to (re)insurance. Expertise in natural catastrophe modeling, statistics, and risk theory. Business English - fluent (spoken and written). Desired Skills And Abilities Ability to work collaboratively in a team environment with effective interpersonal skills. Analytical skills / Ability to evolve in a diverse technical and operational environment. Effective organizational skills and attention to detail. Flexibility and adaptability to changing circumstances. Demonstrated rigor, motivation, autonomy, and proactivity. Curiosity and open-mindedness to explore new ideas and approaches. Who WE are AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don’t just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com What we OFFER Inclusion AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture and enables business growth and is critical to our success. That’s why we have made a strategic commitment to attract, develop, advance and retain the most inclusive workforce possible, and create a culture where everyone can bring their full selves to work and reach their highest potential. It’s about helping one another — and our business — to move forward and succeed. Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion with 20 Chapters around the globe. Robust support for Flexible Working Arrangements Enhanced family-friendly leave benefits Named to the Diversity Best Practices Index Signatory to the UK Women in Finance Charter Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer. Total Rewards AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides competitive compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence. Sustainability At AXA XL, Sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 Sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations. Our Pillars Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. 
We’re committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans. Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions. Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting. AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving. For more information, please see axaxl.com/sustainability.
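Outside the posting itself, a minimal example of the exposure-aggregation work described above might look like the following pandas sketch; the countries, perils, and insured values are synthetic.

```python
# Illustrative exposure aggregation for CAT analysis: total insured value by
# country and peril, with a share-of-portfolio column. All figures are synthetic.
import pandas as pd

exposures = pd.DataFrame({
    "country":  ["FR", "FR", "JP", "US"],
    "peril":    ["windstorm", "flood", "earthquake", "hurricane"],
    "tiv_musd": [420.0, 310.0, 980.0, 1250.0],   # total insured value, USD millions
})

by_peril = exposures.groupby(["country", "peril"], as_index=False)["tiv_musd"].sum()
by_peril["share_pct"] = 100 * by_peril["tiv_musd"] / by_peril["tiv_musd"].sum()
print(by_peril.sort_values("tiv_musd", ascending=False))
```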

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

When you join Verizon

You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What You Will Be Doing...

The Commercial Data & Analytics - Impact Analytics team is part of the Verizon Global Services (VGS) organization. The Impact Analytics team addresses high-impact, analytically driven projects focused within three core pillars: Customer Experience, Pricing & Monetization, Network & Sustainability. In this role, you will analyze large data sets to draw insights and solutions to help drive actionable business decisions. You will also apply advanced analytical techniques and algorithms to help us solve some of Verizon’s most pressing challenges. Use your analysis of large structured and unstructured datasets to draw meaningful and actionable insights. Envision and test for corner cases. Build analytical solutions and models by manipulating large data sets and integrating diverse data sources. Present the results and recommendations of statistical modeling and data analysis to management and other stakeholders. Identify data sources and apply your knowledge of data structures, organization, transformation, and aggregation techniques to prepare data for in-depth analysis. Deeply understand business requirements and translate them into well-defined analytical problems, identifying the most appropriate statistical techniques to deliver impactful solutions. Assist in building data views from disparate data sources that power insights and business cases. Apply statistical modeling techniques / ML to data and perform root cause analysis and forecasting. Develop and implement rigorous frameworks for effective base management. Collaborate with cross-functional teams to discover the most appropriate data sources and fields that cater to the business needs. Design modular, reusable Python scripts to automate data processing. Clearly and effectively communicate complex statistical concepts and model results to both technical and non-technical audiences, translating your findings into actionable insights for stakeholders.

What We’re Looking For...

You have strong analytical skills, and are eager to work in a collaborative environment with global teams to drive ML applications in business problems, develop end to end analytical solutions and communicate insights and findings to leadership. You work independently and are always willing to learn new technologies. You thrive in a dynamic environment and are able to interact with various partners and cross functional teams to implement data science driven business solutions.
You Will Need To Have Bachelor’s degree in computer science or another technical field or four or more years of work experience Four or more years of relevant work experience Proficiency in SQL, including writing queries for reporting, analysis and extraction of data from big data systems (Google Cloud Platform, Teradata, Spark, Splunk etc) Curiosity to dive deep into data inconsistencies and perform root cause analysis Programming experience in Python (Pandas, NumPy, Scipy and Scikit-Learn) Experience with Visualization tools matplotlib, seaborn, tableau, grafana etc A deep understanding of various machine learning algorithms and techniques, including supervised and unsupervised learning Understanding of time series modeling and forecasting techniques Even better if you have one or more of the following: Experience with cloud computing platforms (e.g., AWS, Azure, GCP) and deploying machine learning models at scale using platforms like Domino Data Lab or Vertex AI Experience in applying statistical ideas and methods to data sets to answer business problems. Ability to collaborate effectively across teams for data discovery and validation Experience in deep learning, recommendation systems, conversational systems, information retrieval, computer vision Expertise in advanced statistical modeling techniques, such as Bayesian inference or causal inference. Excellent interpersonal, verbal and written communication skills. Where you’ll be working In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager. Scheduled Weekly Hours 40 Equal Employment Opportunity Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
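As a hedged illustration of the forecasting skills listed above, this small scikit-learn sketch fits a linear trend on synthetic monthly volumes and projects the next period; it is not Verizon's methodology.

```python
# Minimal forecasting sketch: fit a linear trend on synthetic monthly volumes
# with scikit-learn and project the next period. Data and numbers are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

months = np.arange(1, 13).reshape(-1, 1)            # periods 1..12
volume = 1000 + 25 * months.ravel() + np.random.default_rng(0).normal(0, 20, 12)

model = LinearRegression().fit(months, volume)
next_month = np.array([[13]])
print(f"trend per month: {model.coef_[0]:.1f}")
print(f"forecast for period 13: {model.predict(next_month)[0]:.0f}")
```

Real base-management forecasting would use richer features and proper time-series validation; this only shows the mechanics.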

Posted 1 month ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Chennai, Tamil Nadu, India

On-site

Key Responsibilities Land Acquisition Legal Support: Provide end-to-end legal support for land acquisition, including title verification, due diligence, and documentation. Review and ensure clear titles, identify encumbrances, and advise on land regularization and conversion. Handle legal aspects of land aggregation, land pooling, and negotiation with landowners. Documentation & Contract Management Draft, vet, and finalize all land-related legal documents such as Sale Deeds, Agreement to Sell, Power of Attorney, Joint Development Agreements (JDAs), Lease Agreements, and MOUs. Ensure proper registration and stamping of documents in compliance with Tamil Nadu Registration Act and Indian Stamp Act. Regulatory & Local Compliance Ensure strict adherence to local land laws, RERA (TN RERA), Tamil Nadu Land Reforms Act, and applicable municipal and panchayat regulations. Liaise with revenue department, Sub-Registrar offices, and other government authorities for approvals and clearances. Litigation & Dispute Resolution Manage and represent the company in land-related disputes, including civil suits, land acquisition claims, encroachment cases, and tenancy issues. Coordinate with external legal counsel and law firms for representation and case management in Tamil Nadu courts. Due Diligence & Risk Management Conduct legal due diligence and land audits prior to acquisition or development. Assess legal risks in land deals and proactively mitigate potential liabilities. Work closely with the land acquisition, liaison, and project teams to align legal processes with business needs. Build relationships with local legal experts, notaries, revenue officials, and community leaders. Qualifications LLB / LLM from a recognized law school. 1-5 years of legal experience, with significant experience in land acquisition and property law in Tamil Nadu. In-depth knowledge of Tamil Nadu land laws, real estate rules, and local registration processes. Fluent in English and Tamil (both written and spoken). Strong experience handling revenue records, patta/chitta, EC, and other land records. Preferred Attributes Sound knowledge of land registration and mutation processes in Tamil Nadu. Experience In Both Urban And Rural Land Acquisitions. Strong negotiation skills with landowners and government authorities. High level of integrity, local awareness, and ability to manage on-ground legal complexities. Skills: rera compliance,negotiation,due diligence,risk management,land acquisition,contract management,legal due diligence,dispute resolution,litigation,documentation,title verification,real estate law,knowledge of tamil nadu land laws

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

When you join Verizon

You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What You Will Be Doing...

The Commercial Data & Analytics - Impact Analytics team is part of the Verizon Global Services (VGS) organization. The Impact Analytics team addresses high-impact, analytically driven projects focused within three core pillars: Customer Experience, Pricing & Monetization, Network & Sustainability. In this role, you will analyze large data sets to draw insights and solutions to help drive actionable business decisions. You will also apply advanced analytical techniques and algorithms to help us solve some of Verizon’s most pressing challenges. Use your analysis of large structured and unstructured datasets to draw meaningful and actionable insights. Envision and test for corner cases. Build analytical solutions and models by manipulating large data sets and integrating diverse data sources. Present the results and recommendations of statistical modeling and data analysis to management and other stakeholders. Identify data sources and apply your knowledge of data structures, organization, transformation, and aggregation techniques to prepare data for in-depth analysis. Deeply understand business requirements and translate them into well-defined analytical problems, identifying the most appropriate statistical techniques to deliver impactful solutions. Assist in building data views from disparate data sources that power insights and business cases. Apply statistical modeling techniques / ML to data and perform root cause analysis and forecasting. Develop and implement rigorous frameworks for effective base management. Collaborate with cross-functional teams to discover the most appropriate data sources and fields that cater to the business needs. Design modular, reusable Python scripts to automate data processing. Clearly and effectively communicate complex statistical concepts and model results to both technical and non-technical audiences, translating your findings into actionable insights for stakeholders.

What We’re Looking For...

You have strong analytical skills, and are eager to work in a collaborative environment with global teams to drive ML applications in business problems, develop end to end analytical solutions and communicate insights and findings to leadership. You work independently and are always willing to learn new technologies. You thrive in a dynamic environment and are able to interact with various partners and cross functional teams to implement data science driven business solutions.
You Will Need To Have Bachelor’s degree in computer science or another technical field or four or more years of work experience Four or more years of relevant work experience Proficiency in SQL, including writing queries for reporting, analysis and extraction of data from big data systems (Google Cloud Platform, Teradata, Spark, Splunk etc) Curiosity to dive deep into data inconsistencies and perform root cause analysis Programming experience in Python (Pandas, NumPy, Scipy and Scikit-Learn) Experience with Visualization tools matplotlib, seaborn, tableau, grafana etc A deep understanding of various machine learning algorithms and techniques, including supervised and unsupervised learning Understanding of time series modeling and forecasting techniques Even better if you have one or more of the following: Experience with cloud computing platforms (e.g., AWS, Azure, GCP) and deploying machine learning models at scale using platforms like Domino Data Lab or Vertex AI Experience in applying statistical ideas and methods to data sets to answer business problems. Ability to collaborate effectively across teams for data discovery and validation Experience in deep learning, recommendation systems, conversational systems, information retrieval, computer vision Expertise in advanced statistical modeling techniques, such as Bayesian inference or causal inference. Excellent interpersonal, verbal and written communication skills. Where you’ll be working In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager. Scheduled Weekly Hours 40 Equal Employment Opportunity Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
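As a companion illustration of the unsupervised-learning skills listed above, this short scikit-learn sketch segments synthetic usage data with K-means; the features and clusters are invented.

```python
# Small unsupervised-learning sketch: segmenting accounts by usage with K-means.
# Feature values are synthetic; real work would start from the data views above.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
usage = np.vstack([
    rng.normal([5, 1], 0.5, size=(50, 2)),    # light users: low minutes, low data
    rng.normal([40, 12], 3.0, size=(50, 2)),  # heavy users: high minutes, high data
])

scaled = StandardScaler().fit_transform(usage)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)
print("cluster sizes:", np.bincount(labels))
```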

Posted 1 month ago

Apply

5.0 years

3 - 9 Lacs

Gurgaon

On-site

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities: Apply advanced statistical techniques, data science methodologies, and AI practices, including generative AI using GPT models Design, develop, and implement cutting-edge generative AI models and systems Perform data mining, cleaning, and aggregation to prepare datasets for analysis Manage and query data while ensuring high standards of data quality and integrity Communicate complex insights through compelling data storytelling and visualizations using Power BI Leverage ML Ops practices to deploy and manage machine learning models in production environments Collaborate in an agile team setting, actively participating in sprint planning, reviews, and retrospectives Adhere to company policies, employment terms, and evolving business directives, including flexibility in work location, team assignments, and work arrangements Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). 
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications: 5+ years of experience in data science with a strong foundation in machine learning, statistics, and data science methodologies 1+ years of hands-on experience applying AI to real-world technology solutions Experience fine-tuning LLM hyperparameters and optimizing model configurations for performance and client impact Hands-on experience with ML Ops and deploying models in production Familiarity with embeddings and other advanced data science techniques Proficiency in Python, TensorFlow, NLP, and deep learning frameworks Demonstrated ability to collaborate with cross-functional teams to prototype, build, test, and scale GenAI/LLM-powered solutions Solid background in REST API development, NoSQL and RDBMS design, and performance optimization Proficiency in cloud platforms such as GCP, Microsoft Fabric, Azure Stack, and Azure ML Proven expertise in generative AI and GPT-based model development Proven advanced skills in data storytelling and visualization using Power BI Proven excellent communication and collaboration skills, especially with non-technical stakeholders Proven agile mindset with a proven ability to contribute effectively in agile teams Preferred Qualifications: Master’s degree in Data Science, Statistics, Computer Science, or a related field Experience working with healthcare data and a solid understanding of the healthcare industry At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone–of every race, gender, sexuality, age, location and income–deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission. #Nic
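For illustration of the embeddings techniques mentioned in the qualifications, the sketch below ranks documents by cosine similarity using plain NumPy; the vectors are made up, whereas a real system would obtain them from an embedding model.

```python
# Hedged sketch of an embeddings-based similarity check with plain NumPy.
# The vectors here are invented stand-ins for model-generated embeddings.
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

query = np.array([0.9, 0.1, 0.3])
documents = {
    "claims FAQ":    np.array([0.8, 0.2, 0.4]),
    "benefits memo": np.array([0.1, 0.9, 0.2]),
}

# Rank documents by similarity to the query, highest first.
ranked = sorted(documents.items(), key=lambda kv: cosine(query, kv[1]), reverse=True)
for name, vec in ranked:
    print(f"{name}: similarity {cosine(query, vec):.3f}")
```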

Posted 1 month ago

Apply

55.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

At Capgemini Engineering, the world leader in engineering services, we bring together a global team of engineers, scientists, and architects to help the world’s most innovative companies unleash their potential. From autonomous cars to life-saving robots, our digital and software technology experts think outside the box as they provide unique R&D and engineering services across all industries. Join us for a career full of opportunities. Where you can make a difference. Where no two days are the same. Your Role Design and manage MongoDB databases for scalability and high availability. Optimize queries and indexing for improved performance. Collaborate with backend and DevOps teams for seamless integration. Monitor, troubleshoot, and resolve database performance issues. Your Profile Proven experience with MongoDB and NoSQL database design. Strong understanding of data modeling, aggregation framework, and indexing. Proficient in writing complex queries and performance tuning. Familiarity with MongoDB Atlas, replication, and sharding. Experience with backend technologies (Node.js, Python, etc.) is a plus. What You’ll Love About Working With Us Flexible work options: remote and hybrid Competitive salary and benefits package Career growth with SAP and cloud certifications Inclusive and collaborative work environment Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
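A minimal PyMongo sketch of the aggregation-framework skills described above follows; it assumes a MongoDB instance reachable at the default localhost URI, and the database, collection, and field names are invented.

```python
# Illustrative aggregation pipeline and index with PyMongo. Assumes a MongoDB
# instance is reachable at localhost:27017; database/collection/fields are made up.
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017", serverSelectionTimeoutMS=2000)
orders = client["shop"]["orders"]

# A compound index supporting the $match stage below.
orders.create_index([("status", ASCENDING), ("created_at", ASCENDING)])

pipeline = [
    {"$match": {"status": "shipped"}},                              # filter early
    {"$group": {"_id": "$customer_id", "total": {"$sum": "$amount"}}},
    {"$sort": {"total": -1}},
    {"$limit": 5},                                                  # top 5 customers
]
for doc in orders.aggregate(pipeline):
    print(doc)
```

Ordering the stages so that $match runs before $group keeps the working set small, which is the usual first step in pipeline performance tuning.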

Posted 1 month ago

Apply

0 years

1 - 3 Lacs

Pune

On-site

Job Title: Land Aggregation Executive – Solar Projects
Location: Field locations across Maharashtra
Head Office: Pransh Business Solutions Pvt. Ltd., Nagpur, Maharashtra
Department: Land Acquisition & Liaison
Reporting To: Land Acquisition Manager / Project Head
Employment Type: Full-Time

Job Summary: Pransh Business Solutions Pvt. Ltd. is seeking proactive and field-ready Land Aggregation Executives to support our ongoing solar power projects across Maharashtra. The role involves identifying, aggregating, and securing land parcels in coordination with landowners and local authorities. This is a great opportunity for individuals who have a strong understanding of Maharashtra’s land records system and aspire to work in the fields of land revenue, rural development, or infrastructure projects.

Key Responsibilities: Identify and evaluate land parcels suitable for ground-mounted solar power projects. Carry out field surveys and collect relevant land documents such as 7/12, Ferfar, 8A, etc. Interact and negotiate with landowners, farmers, and local intermediaries. Coordinate with government officials (Talathi, Tehsildar, Circle Officers) for documentation, NOCs, and updates. Assist legal teams in title search reports (TSR), lease/sale deed execution, and registration. Monitor mutation, demarcation, and land conversion processes. Maintain documentation records, land maps, and acquisition trackers. Travel regularly to rural project sites across Maharashtra and act as a local liaison.

Preferred Candidate Profile: Graduate in any stream (preferred: Agriculture, Rural Development, Land Management, Law). Candidates preparing for Talathi, Gram Sevak, or Rural Development Department jobs are highly encouraged to apply. Good understanding of the Maharashtra land records system (7/12, 8A, mutation, NA conversion, etc.). Strong communication skills in Marathi (mandatory); Hindi and English are added advantages. Local candidates with rural field experience will be preferred. Comfortable with frequent travel and on-ground coordination work.

Additional Information: Compensation: ₹15,000–₹25,000/month + travel allowance & performance-based incentives. Travel Requirement: Extensive across Maharashtra (site visits, land meetings). Job Type: Full-time. Pay: ₹12,000.00 - ₹25,000.00 per month. Schedule: Day shift. Work Location: In person. Speak with the employer: +91 8788299907

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Meet the Team

The Enterprise Switching organization delivers top-tier products, with the Catalyst Cat9000 series leading in core, aggregation, and access switching for security, IoT, and Cloud. These switches are critical for SD-Access, Cisco's intent-based networking architecture, offering the bandwidth, speed, and scale needed for hybrid work environments. They improve security with PoE power and AI/ML for zero-trust security and provide outstanding network experiences both on-premises and in the cloud through Cisco DNA Center or the Meraki Dashboard. The new Catalyst 9000X models offer flexible operations and secure, high-speed performance. Our team is world-class for crafting industry-leading products that have dominated the enterprise market for decades through innovation.

Your Impact

Looking forward to getting an exciting start to your career? You will bring your outstanding talents to the group that works on Cisco's Switching technology, which large enterprises now consider to be the Networking Equipment of choice for critically important networks. In your role as a Software Engineer, you will develop and integrate products that are deployed by some of the leading Enterprises in the world. You will work with a BU-wide vibrant technical community, learning from experts and translating this learning into exciting opportunities for personal growth. You will work on networking equipment that forms a crucial backbone of many offices, hospitals, educational and financial institutions. You will learn about pioneering technologies and platforms while developing software for this equipment. Experience the exhilaration of taking a product through development and integration.

Minimum Qualifications

Good understanding of distributed/centralized hardware architectures of routers/switches. Experience in working with embedded platforms and various operating systems such as Linux and VxWorks, including RTOS internals. Experience with device drivers such as PCIe/I2C/eMMC/SDHC/MDIO/USB, and with storage devices (NAND/NOR), with exposure to file system internals. Exposure to system infrastructure: building and bringing up next-generation sophisticated switches. Experience: 4 to 8 years. Bachelor's degree or equivalent experience in CS/EE/EC or a technical equivalent.

Preferred Qualifications

Bachelor’s degree in Computer Science or a related field (MS or equivalent preferred).

We Are Cisco

#WeAreCisco where every individual brings their unique skills and perspectives together to pursue our purpose of powering an inclusive future for all. Our passion is connection—we celebrate our employees’ diverse set of backgrounds and focus on unlocking potential. Cisconians often experience one company, many careers where learning and development are encouraged and supported at every stage. Our technology, tools, and culture pioneered hybrid work trends, allowing all to not only give their best, but be their best. We understand our outstanding opportunity to bring communities together and at the heart of that is our people. One-third of Cisconians collaborate in our 30 employee resource organizations, called Inclusive Communities, to connect, foster belonging, learn to be informed allies, and make a difference. Dedicated paid time off to volunteer—80 hours each year—allows us to give back to causes we are passionate about, and nearly 86% do! Our purpose, driven by our people, is what makes us the worldwide leader in technology that powers the internet.
Helping our customers reinvent their applications, secure their enterprise, transform their infrastructure, and meet their sustainability goals is what we do best. We ensure that every step we take is a step towards a more inclusive future for all. Take your next step and be you, with us!
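Not from the posting, but as a toy illustration of the Ethernet switching concepts mentioned above, this Python sketch mimics transparent-bridge MAC learning; the ports and addresses are invented, and production switches implement this in hardware.

```python
# Toy illustration of transparent-bridge MAC learning: learn the source address
# per incoming frame, then forward, filter, or flood based on the table.
from typing import Dict

mac_table: Dict[str, int] = {}   # MAC address -> port

def handle_frame(src_mac: str, dst_mac: str, in_port: int) -> str:
    mac_table[src_mac] = in_port                 # learn/refresh source mapping
    out_port = mac_table.get(dst_mac)
    if out_port is None:
        return f"flood (unknown destination {dst_mac})"
    if out_port == in_port:
        return "filter (destination on same port)"
    return f"forward to port {out_port}"

print(handle_frame("aa:aa", "bb:bb", 1))   # flood: bb:bb not yet learned
print(handle_frame("bb:bb", "aa:aa", 2))   # forward to port 1
print(handle_frame("aa:aa", "bb:bb", 1))   # forward to port 2
```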

Posted 1 month ago

Apply

1.5 - 2.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Company Description: Etrica Power is a leading provider of solar energy solutions.

Role Description: This is a full-time on-site role for a Tender Executive located in Jaipur at Etrica Power. The Tender Executive will be responsible for managing and overseeing the tender process, preparing and submitting bids, coordinating with vendors and suppliers, and ensuring compliance with tender requirements.

Qualifications: Minimum 1.5-2 years of experience in managing tender processes and submitting bids. Excellent communication and negotiation skills. Attention to detail and strong organizational skills. Knowledge of the solar energy industry and products. Ability to work effectively in a team and meet deadlines. Familiarity with Government e-Marketplace (GeM), eProcurement portals, and tender aggregation sites. Proficiency in Excel, MS Office, and document management systems. Bachelor's degree in Business Administration, Engineering, or a related field.

Interested candidates can share their CV/resume at shweta@etricapower.com.

Posted 1 month ago

Apply

6.0 years

0 Lacs

Ganganagar, Rajasthan, India

On-site

34665BR Bangalore Job Description Oracle Cerner Module: HIE & Integration & Consumer Summary The HIE & Integration & Consumer Solution Consultant configures, tests, and supports Health Information Exchange (HIE) and consumer-facing applications, ensuring seamless data flow across internal systems, external partners, and patient portals. Core Responsibilities Define standards for message transformation, routing, and security protocols (OAuth2, TLS). Oversee Discern CCL/DA2 rule development for data aggregation, consent management, and population health reporting. Conduct impact analyses for new integrations, upgrades, and compliance with interoperability regulations (TEFCA, 21st Century Cures). Serve as escalation point for complex integration or security issues; lead root-cause investigations and remediation. Mentor consultants on integration best practices, API management, and consumer health access. Engage interoperability leadership, security, and IT governance to validate architecture and obtain stakeholder buy-in. Technical Skills & Certifications In-depth configuration experience with Cerner Integration Engine, HIE frameworks, and consumer health modules. Proficiency in CCL, DA2, domain management, HL7, FHIR, IHE, and API security standards. Experience 4–6 years of Cerner enterprise integrations and HIE solutions in large health systems. Proven track record implementing consumer health applications and interoperability platforms. Soft Skills Strong stakeholder management across clinical, technical, and external partner groups. Strategic thinker with excellent communication and facilitation skills. Collaborative and detail-oriented approach. Qualifications B.Tech. Experience range: 4 to 6 years.
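For a concrete sense of the interoperability work above, the sketch below posts a FHIR R4 Patient resource to an HIE endpoint over TLS with an OAuth2 bearer token. This is a generic, illustrative example only - the endpoint URL, token, and patient details are placeholders and are not part of any Cerner-specific API.

```python
# Hypothetical sketch: create a FHIR R4 Patient resource on an HIE endpoint.
# FHIR_BASE and ACCESS_TOKEN are placeholders obtained out of band.
import requests

FHIR_BASE = "https://hie.example.org/fhir"
ACCESS_TOKEN = "<oauth2-access-token>"

patient = {
    "resourceType": "Patient",
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "gender": "female",
    "birthDate": "1986-04-12",
}

response = requests.post(
    f"{FHIR_BASE}/Patient",
    json=patient,
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/fhir+json",
    },
    timeout=30,
)
response.raise_for_status()
print("Created resource id:", response.json().get("id"))
```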

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Diversity role only - Experience range 8-12 years What You'll be doing: The role will be responsible for implementing and supporting Monitoring & Observability requirements and solutions that the Enterprise Monitoring team supports. This will involve providing solutions, contributing to strategic initiatives and incremental improvements, and delivering on client requirements. The resource will act as an individual contributor delivering technical solutions to requirements and acting as a trusted advisor. The candidate will play a key role providing solutions related to infrastructure monitoring, Application Performance Monitoring and log aggregation. Automation is key, and the resource will create or modify infrastructure-as-code / automation solutions that the team provides as part of the service. Requirements: The following experience and essential skills are required for this role. Experience in delivering Monitoring & Observability solutions to customer requirements. Responsible for delivering outcomes and providing technical expertise to the Monitoring & Observability service. Be the trusted advisor to customers. Ensure delivery in a timely, efficient and cost-effective manner. Stakeholder management across various Technology and Business teams. Ensure that technical solutions are fit for purpose, covering functional, non-functional and support requirements, and align with the Global Technology strategic direction. Key skills include the following, implemented at enterprise scale: experience with observability tools, AIOps, log aggregation (e.g. ELK) and Application Performance Monitoring (e.g. New Relic); infrastructure and application monitoring skills; scripting skills such as Python, shell scripts and JavaScript; DevOps skills such as Infrastructure as Code (e.g. Terraform), developing CI/CD pipelines, and delivering in a Scrum model. Experience with OpenTelemetry, AWS and Kubernetes will be an advantage. Good communication and influencing skills. Strong troubleshooting and analytical skills. The resource is expected to provide on-call support on a periodic basis and change support outside of business hours or over weekends. If needed, provide support in different shifts covering UK business hours.
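As a rough illustration of what custom instrumentation for such a service can look like, here is a minimal sketch using the prometheus_client Python library; the metric names, label, port, and simulated workload are illustrative assumptions rather than any prescribed standard.

```python
# Minimal sketch: expose custom application metrics as a Prometheus scrape
# target with prometheus_client. Metric names and the port are illustrative.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS_TOTAL = Counter("app_requests_total", "Total handled requests", ["route"])
REQUEST_LATENCY = Histogram("app_request_latency_seconds", "Request latency", ["route"])

def handle_request(route: str) -> None:
    """Simulate handling one request, recording its latency and count."""
    with REQUEST_LATENCY.labels(route=route).time():
        time.sleep(random.uniform(0.01, 0.1))  # stand-in for real work
    REQUESTS_TOTAL.labels(route=route).inc()

if __name__ == "__main__":
    start_http_server(8000)  # metrics become available at :8000/metrics
    while True:
        handle_request("/checkout")
```

A Prometheus server would then scrape the /metrics endpoint, and dashboards or alert rules (for example in Grafana) would be built on top of these series.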

Posted 1 month ago

Apply

5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities Apply advanced statistical techniques, data science methodologies, and AI practices, including generative AI using GPT models Design, develop, and implement cutting-edge generative AI models and systems Perform data mining, cleaning, and aggregation to prepare datasets for analysis Manage and query data while ensuring high standards of data quality and integrity Communicate complex insights through compelling data storytelling and visualizations using Power BI Leverage ML Ops practices to deploy and manage machine learning models in production environments Collaborate in an agile team setting, actively participating in sprint planning, reviews, and retrospectives Adhere to company policies, employment terms, and evolving business directives, including flexibility in work location, team assignments, and work arrangements Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). 
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so. Required Qualifications 5+ years of experience in data science with a strong foundation in machine learning, statistics, and data science methodologies 1+ years of hands-on experience applying AI to real-world technology solutions Experience fine-tuning LLM hyperparameters and optimizing model configurations for performance and client impact Hands-on experience with ML Ops and deploying models in production Familiarity with embeddings and other advanced data science techniques Proficiency in Python, TensorFlow, NLP, and deep learning frameworks Demonstrated ability to collaborate with cross-functional teams to prototype, build, test, and scale GenAI/LLM-powered solutions Solid background in REST API development, NoSQL and RDBMS design, and performance optimization Proficiency in cloud platforms such as GCP, Microsoft Fabric, Azure Stack, and Azure ML Proven expertise in generative AI and GPT-based model development Proven advanced skills in data storytelling and visualization using Power BI Proven excellent communication and collaboration skills, especially with non-technical stakeholders Proven agile mindset with the ability to contribute effectively in agile teams Preferred Qualifications Master’s degree in Data Science, Statistics, Computer Science, or a related field Experience working with healthcare data and a solid understanding of the healthcare industry At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission. #Nic
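Purely as an illustration of the modelling-plus-ML-Ops workflow this role describes (and not Optum's actual stack), a minimal baseline might look like the scikit-learn sketch below; the synthetic dataset, model choice, and artifact file name are all assumptions for the example.

```python
# Illustrative sketch: train a baseline classifier, evaluate it on a holdout
# set, and serialize the artifact for a downstream deployment pipeline.
import joblib
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real, governed dataset.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Holdout AUC: {auc:.3f}")

# The serialized model is what an ML Ops pipeline would version and deploy.
joblib.dump(model, "baseline_model.joblib")
```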

Posted 1 month ago

Apply

0 years

0 Lacs

West Delhi, Delhi, India

On-site

Company Description Logistique Biz Pvt Ltd is a leading B2B business and logistics aggregation platform based in West Delhi. With over two decades of experience in the logistics industry, we aim to simplify and streamline B2B transactions, empowering businesses to thrive in an ever-evolving marketplace. Our platform fosters collaboration among businesses and logistics providers to unlock new growth potential and drive innovation. Role Description This is a full-time on-site role for a Sales Manager at Logistique Biz Pvt Ltd in West Delhi. The Sales Manager will be responsible for managing and growing customer accounts, developing sales strategies, meeting sales targets, and overseeing the sales team's performance. The role will involve building strong relationships with clients, understanding their needs, and providing solutions to optimize their supply chains. Qualifications Sales Strategy Development and Execution skills Customer Relationship Management and Account Management skills Sales Target Achievement and Team Management skills Excellent Communication and Negotiation skills Experience in the logistics or supply chain industry Bachelor's degree in Business Administration, Marketing, or related field Proven track record of successful sales performance

Posted 1 month ago

Apply

0.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Designation: Senior Analyst Level: L2 Experience: 4 to 7 years Location: Chennai Job Description: We are seeking a Data Analyst to support our growing Risk Analytics and Insight team within Intuit’s Global Business & Self-Employed Group (GBSG). The ideal candidate will work across functions (Fraud & Risk Operations, Data Science, Compliance, Marketing, Product, Finance, etc.) and use data to drive insights that help achieve Intuit Small Business Group’s short and long-term business outcomes. Responsibilities: Develop compelling visualizations that crystallize business opportunities for business stakeholders and leaders to make timely decisions. Use forecasting and predictive analytics techniques in developing insights and dashboards, and communicate/collaborate with partners on derived impacts to the business. Mine massive-scale transactional data to improve fraud and credit risk mitigation strategies and design new risk-adjusted policy proposals. Ensure data collection and data processing are optimized to support new initiatives and product releases. Skills: Advanced SQL and coding skills to perform data segmentation and aggregation from scratch. Good working experience with data analytics and visualization tools like Tableau and QuickSight. Knowledge of statistical analysis including A/B and multivariate testing. Written and verbal communication skills. Experience working with Key Performance Indicator (KPI or OKR) metrics. Job Snapshot Updated Date 30-06-2025 Job ID J_3783 Location Chennai, Tamil Nadu, India Experience 4 - 7 Years Employee Type Permanent
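As a small, self-contained illustration of the segmentation-and-aggregation SQL this role calls for, the sketch below runs an example query through Python's built-in sqlite3 module; the table, columns, risk-band thresholds, and sample rows are invented for the example.

```python
# Illustrative sketch: segment transactions into risk bands and aggregate
# count and value per band. Schema, thresholds, and data are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (id INTEGER, amount REAL, risk_score REAL);
    INSERT INTO transactions VALUES
        (1, 120.0, 0.10), (2, 900.0, 0.85), (3, 45.5, 0.30), (4, 2100.0, 0.92);
""")

query = """
    SELECT CASE
             WHEN risk_score >= 0.8 THEN 'high'
             WHEN risk_score >= 0.3 THEN 'medium'
             ELSE 'low'
           END AS risk_band,
           COUNT(*)    AS txn_count,
           SUM(amount) AS total_amount
    FROM transactions
    GROUP BY risk_band
    ORDER BY total_amount DESC;
"""
for risk_band, txn_count, total_amount in conn.execute(query):
    print(risk_band, txn_count, total_amount)
```

The same aggregation shape scales up directly to warehouse SQL and feeds the Tableau or QuickSight dashboards mentioned above.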

Posted 1 month ago

Apply

8.0 years

0 Lacs

Greater Kolkata Area

On-site

Work Location: PAN India Duration: 12 Months (Extendable) Shift: Rotational shifts including night shifts and weekend availability Years of Experience: 8+ Years 🔧 Job Summary We are seeking an experienced and versatile Site Reliability Engineer (SRE) / Observability Engineer to join our project delivery team. The ideal candidate will bring a deep understanding of modern cloud infrastructure, monitoring tools, and automation practices to ensure system uptime, scalability, and performance across a distributed environment. 🎯 Key Responsibilities Site Reliability Engineering Design, build, and maintain scalable, reliable infrastructure. Automate provisioning/configuration using tools like Terraform, Ansible, Chef, or Puppet. Develop automation tools/scripts in Python, Go, Java, or Bash. Administer and optimize Linux/Unix systems and network components (TCP/IP, DNS, load balancers). Deploy and manage infrastructure on AWS or Kubernetes platforms. Build and maintain CI/CD pipelines (e.g., Jenkins, ArgoCD). Monitor production systems with tools such as Prometheus, Grafana, Nagios, Datadog. Conduct postmortems and define SLAs/SLOs to ensure high system reliability. Plan and implement capacity management, failover systems, and auto-scaling mechanisms. Observability Engineering Instrument services for metrics/logs/traces using OpenTelemetry, Prometheus, Jaeger, etc. Manage observability stacks (e.g., Grafana, ELK Stack, Splunk, Datadog, Honeycomb). Work with time-series databases (e.g., InfluxDB, Prometheus) and log aggregation tools. Build actionable alerts and dashboards to reduce alert fatigue and increase insight. Advocate for observability best practices with developers and define performance KPIs. ✅ Required Skills & Qualifications Proven experience as an SRE or Observability Engineer in production environments. Strong Linux/Unix and cloud infrastructure skills (especially AWS, Kubernetes). Proficient in scripting and automation (Python, Go, Bash, Java). Expertise in observability, monitoring, and alerting systems. Experience in Infrastructure as Code (IaC) and modern CI/CD practices. Strong troubleshooting skills and ability to respond to live production issues. Comfortable with rotational shifts, including nights and weekends. 🔍 Mandatory Technical Skills Ansible AWS Automation Services AWS CloudFormation AWS CodePipeline AWS CodeDeploy AWS DevOps Services
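To make the instrumentation side of the role concrete, here is a minimal sketch of emitting a trace span with the OpenTelemetry Python SDK; the service and span names and attributes are placeholders, and a real deployment would export spans to a collector (for example via OTLP) rather than to the console.

```python
# Minimal sketch: trace one unit of work with the OpenTelemetry Python SDK.
# ConsoleSpanExporter stands in for an OTLP exporter pointed at a collector.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("checkout-service")  # placeholder service name

def process_order(order_id: str, item_count: int) -> None:
    """Wrap the unit of work in a span and attach searchable attributes."""
    with tracer.start_as_current_span("process_order") as span:
        span.set_attribute("order.id", order_id)
        span.set_attribute("order.item_count", item_count)
        # ... business logic would run here ...

if __name__ == "__main__":
    process_order("ord-1001", 3)
```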

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description Your Responsibilities We are seeking an experienced and highly motivated Sr Data Engineer - Data Ingestion to join our dynamic team. The ideal candidate will have strong hands-on experience with Azure Data Factory (ADF), a deep understanding of relational and non-relational data ingestion techniques, and proficiency in Python programming. You will be responsible for designing and implementing scalable data ingestion solutions that interface with Azure Data Lake Storage Gen 2 (ADLS Gen 2), Databricks, and various other Azure ecosystem services. The Data Ingestion Engineer will work closely with stakeholders to gather data ingestion requirements, create modularized ingestion solutions, and define best practices to ensure efficient, robust, and scalable data pipelines. This role requires effective communication skills, ownership, and accountability for the delivery of high-quality data solutions. Data Ingestion Strategy & Development: Design, develop, and deploy scalable and efficient data pipelines in Azure Data Factory (ADF) to move data from multiple sources (relational, non-relational, files, APIs, etc.) into Azure Data Lake Storage Gen 2 (ADLS Gen 2), Azure SQL Database, and other target systems. Implement ADF activities (copy, lookup, execute pipeline, etc.) to integrate data from on-premises and cloud-based systems. Build parameterized and reusable pipeline templates in ADF to standardize the data ingestion process, ensuring maintainability and scalability of ingestion workflows. Integrate custom data transformation activities within ADF pipelines, utilizing Python, Databricks, or Azure Functions when required. ADF Data Flows Design & Development: Leverage Azure Data Factory Data Flows for visually designing and orchestrating data transformation tasks, enabling complex ETL (Extract, Transform, Load) logic to process large datasets at scale. Design data flow transformations such as filtering, aggregation, joins, lookups, and sorting to process and transform data before loading it into target systems like ADLS Gen 2 or Azure SQL Database. Implement incremental loading strategies in Data Flows to ensure efficient and optimized data ingestion for large volumes of data while minimizing resource consumption. Develop reusable data flow components to streamline transformation processes, ensuring consistency and reducing development time for new data ingestion pipelines. Utilize debugging tools in Data Flows to troubleshoot, test, and optimize data transformations, ensuring accurate results and performance. ADF Orchestration & Automation: Use ADF triggers and scheduling to automate pipeline execution based on time or events, ensuring timely and efficient data ingestion. Configure ADF monitoring and alerting capabilities to proactively track pipeline performance, handle failures, and address issues in a timely manner. Implement ADF version control practices using Git to manage code changes, collaborate effectively with other team members, and ensure code integrity. Data Integration with Various Sources: Ingest data from diverse sources such as on-premise SQL Servers, REST APIs, cloud databases (e.g., Azure SQL Database, Cosmos DB), file-based systems (CSV, Parquet, JSON), and third-party services using ADF. Design and implement ADF linked services to securely connect to external data sources (databases, file systems, APIs, etc.). Develop and configure ADF datasets and dataflows to efficiently transform, clean, and load data into Azure Data Lake or other destinations. 
Pipeline Monitoring and Optimization: Continuously monitor and optimize ADF pipelines to ensure they run with high performance and minimal cost. Apply techniques like data partitioning, parallel processing, and incremental loading where appropriate. Implement data quality checks within the pipelines to ensure data integrity and handle data anomalies or errors in a systematic manner. Review pipeline execution logs and performance metrics regularly, and apply tuning recommendations to improve execution times and reduce operational costs. Collaboration and Communication: Work closely with business and technical stakeholders to capture and translate data ingestion requirements into ADF pipeline designs. Provide ADF-specific technical expertise to both internal and external teams, guiding them in the use of ADF for efficient and cost-effective data pipelines. Document ADF pipeline designs, error handling strategies, and best practices to ensure the team can maintain and scale the solutions. Conduct training sessions or knowledge transfer with junior engineers or other team members on ADF best practices and architecture. Security and Compliance: Ensure all data ingestion solutions built in ADF follow security and compliance guidelines, including encryption at rest and in transit, data masking, and identity and access management. Implement role-based access control (RBAC) and managed identities within ADF to manage access securely and reduce the risk of unauthorized access to sensitive data. Integration with Azure Ecosystem: Leverage other Azure services, such as Azure Logic Apps, Azure Function Apps, and Azure Databricks, to augment the capabilities of ADF pipelines, enabling more advanced data processing, event-driven workflows, and custom transformations. Incorporate Azure Key Vault to securely store and manage sensitive data (e.g., connection strings, credentials) used in ADF pipelines. Integrate ADF with Azure Data Lake Analytics, Synapse Analytics, or other data warehousing solutions for advanced querying and analytics after ingestion. Best Practices & Continuous Improvement: Develop and enforce best practices for building and maintaining ADF pipelines and data flows, ensuring the solutions are modular, reusable, and follow coding standards. Identify opportunities for pipeline automation to reduce manual intervention and improve operational efficiency. Regularly review and suggest new tools or services within the Azure ecosystem to enhance ADF pipeline performance and increase the overall efficiency of data ingestion workflows. Incident and Issue Management: Actively monitor the health of the data pipelines, swiftly addressing any failures, data quality issues, or performance bottlenecks. Troubleshoot ADF pipeline errors, including issues within Data Flows, and work with other teams to root-cause issues related to data availability, quality, or connectivity. Participate in post-mortem analysis for any major incidents, documenting lessons learned and implementing preventative measures for the future. Your Profile Experience with Azure Data Services: Strong experience with Azure Data Factory (ADF) for orchestrating data pipelines. Hands-on experience with ADLS Gen 2, Databricks, and various data formats (e.g., Parquet, JSON, CSV). Solid understanding of Azure SQL Database, Azure Logic Apps, Azure Function Apps, and Azure Container Apps. Programming and Scripting: Proficient in Python for data ingestion, automation, and transformation tasks. 
Ability to write clean, reusable, and maintainable code. Data Ingestion Techniques: Solid understanding of relational and non-relational data models and their ingestion techniques. Experience working with file-based data ingestion, API-based data ingestion, and integrating data from various third-party systems. Problem Solving & Analytical Skills Communication Skills #IncludingYou Diversity, equity, inclusion and belonging are cornerstones of ADM’s efforts to continue innovating, driving growth, and delivering outstanding performance. We are committed to attracting and retaining a diverse workforce and create welcoming, truly inclusive work environments — environments that enable every ADM colleague to feel comfortable on the job, make meaningful contributions to our success, and grow their career. We respect and value the unique backgrounds and experiences that each person can bring to ADM because we know that diversity of perspectives makes us better, together. For more information regarding our efforts to advance Diversity, Equity, Inclusion & Belonging, please visit our website here: Diversity, Equity and Inclusion | ADM. About ADM At ADM, we unlock the power of nature to provide access to nutrition worldwide. With industry-advancing innovations, a complete portfolio of ingredients and solutions to meet any taste, and a commitment to sustainability, we give customers an edge in solving the nutritional challenges of today and tomorrow. We’re a global leader in human and animal nutrition and the world’s premier agricultural origination and processing company. Our breadth, depth, insights, facilities and logistical expertise give us unparalleled capabilities to meet needs for food, beverages, health and wellness, and more. From the seed of the idea to the outcome of the solution, we enrich the quality of life the world over. Learn more at www.adm.com. Req/Job ID 97477BR Ref ID
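The incremental loading strategy described in the responsibilities above usually reduces to a watermark pattern: read the last high-water mark, pull only source rows changed since then, then persist the new mark. The plain-Python sketch below shows only that logic; in ADF itself this typically maps to a Lookup activity (read the stored watermark), a parameterized Copy activity or Data Flow (filter by the watermark), and a final step that stores the new watermark. All names and data here are illustrative assumptions.

```python
# Minimal sketch of watermark-based incremental loading. The in-memory
# SOURCE_ROWS list stands in for a source table with a last-modified column.
from datetime import datetime, timezone

SOURCE_ROWS = [
    {"id": 1, "amount": 120.0, "modified": datetime(2025, 6, 1, tzinfo=timezone.utc)},
    {"id": 2, "amount": 75.5, "modified": datetime(2025, 6, 15, tzinfo=timezone.utc)},
    {"id": 3, "amount": 310.0, "modified": datetime(2025, 6, 28, tzinfo=timezone.utc)},
]

def load_incremental(last_watermark):
    """Return rows modified after the previous watermark, plus the new watermark."""
    new_rows = [r for r in SOURCE_ROWS if r["modified"] > last_watermark]
    new_watermark = max((r["modified"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

if __name__ == "__main__":
    watermark = datetime(2025, 6, 10, tzinfo=timezone.utc)  # value persisted by the previous run
    rows, watermark = load_incremental(watermark)
    print(f"Ingesting {len(rows)} new/changed rows; next watermark = {watermark.isoformat()}")
```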

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
