5.0 years
0 Lacs
Gurugram, Haryana, India
Remote
JOB PURPOSE
As a Sr. HRIS Analyst, you are responsible for implementing, maintaining, and optimizing the SuccessFactors platform at Sloan, which includes modules such as Employee Central (core HR), Recruiting, Onboarding, Performance & Goals, Learning, Compensation, Succession & Development, and Analytics. The role involves coordinating data and system integrations between SuccessFactors and other platforms, including ADP, Bswift, SAP, TeamSense, Jira, and UKG Pro Workforce Management. You will identify potential system and process enhancements, document both business and technical requirements, and ensure delivery according to specifications while working with system support vendors and collaborating with Sloan's HR, IT, Operations, and Payroll departments.

JOB DUTIES AND RESPONSIBILITIES
Serve as the primary HRIS contact for SuccessFactors, including configuration, troubleshooting issues, ensuring the data in the system is compliant with HR processes and laws, reporting, and end-user support.
SuccessFactors – Techno-Functional:
- Provide Tier 1 and Tier 2 technical support for SuccessFactors modules (Employee Central (core HR), Recruiting, Onboarding, Performance & Goals, Learning, Compensation, Succession & Development, and Analytics).
- Optimize system functionality and processes.
- Maintain employee data accuracy and system compliance with HR and legal standards.
- Assist with imports, data loads, and system integrations with third-party tools.
- Conduct user testing, implement enhancements, and support configuration needs.
- Generate cyclical and ad hoc reports; create dashboards and analytics.
- Act as liaison with customer support and/or consultants for HRIS technology cases.
- Create and maintain Standard Operating Procedures (SOPs), training documents, and workflows.
- Draft and execute detailed test scenarios for system changes and upgrades.
- Lead HRIS projects, including new module rollouts, upgrades, and integrations.
- Develop timelines, monitor progress, and ensure timely delivery of quality solutions.
- Identify and escalate issues, track risks, and ensure follow-through on corrective actions.
- Deliver training sessions to staff, managers, and end-users on new system features and best practices.
- Other duties and responsibilities as required.

REQUIRED QUALIFICATIONS
- Bachelor's degree in Human Resources, Information Technology, Business Administration, or another related field.
- 5+ years working as a techno-functional systems analyst across the SuccessFactors Employee Central (core HR), Recruiting, Onboarding, Performance & Goals, Learning, Compensation, Succession & Development, and Analytics modules.
- SAP Certified Associate - Employee Central, Recruiting, Onboarding, Performance & Goals, Succession & Development, Learning, and Compensation.
- Ability to use discretion when working with confidential information.
- Actively seeks information to understand customers' circumstances, problems, expectations, and needs.
- Advanced experience with the Microsoft Office Suite (Word, Excel, PowerPoint, etc.).
- Excellent written and verbal communication skills in English.
- Experience supporting U.S.-based teams and navigating time zone overlap requirements.
- Experience working independently in a global, remote HR or shared services environment.
- Familiarity with U.S. laws relating to Human Resource processes and operations.
- Must be available during core U.S. working hours (full or partial overlap as agreed).
- Strong attention to detail, documentation, and stakeholder management.
- Strong reporting skills; familiarity with query tools or SQL is a plus.

PREFERRED QUALIFICATIONS
- SuccessFactors Expert (SFX) Accreditation

US Shift: 7:00 PM IST to 3:30 AM IST
Posted 5 days ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description
GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/)
Leads projects for design, development and maintenance of a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time.

Key Responsibilities
- Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
- Designs and implements frameworks to continuously monitor and troubleshoot data quality and data integrity issues.
- Implements data governance processes and methods for managing metadata, access, and retention of data for internal and external users.
- Designs and provides guidance on building reliable, efficient, scalable, quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages.
- Designs and implements physical data models to define the database structure; optimizes database performance through efficient indexing and table relationships.
- Participates in optimizing, testing, and troubleshooting of data pipelines.
- Designs, develops and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB, others).
- Uses innovative and modern tools, techniques and architectures to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks, in order to minimize manual and error-prone processes and improve productivity.
- Assists with renovating the data management infrastructure to drive automation in data integration and management.
- Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum, and Kanban.
- Coaches and develops less experienced team members.

Competencies
- System Requirements Engineering: Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts.
- Collaborates: Building partnerships and working collaboratively with others to meet shared objectives.
- Communicates effectively: Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.
- Customer focus: Building strong customer relationships and delivering customer-centric solutions.
- Decision quality: Making good and timely decisions that keep the organization moving forward.
- Data Extraction: Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users, using appropriate tools and technologies.
- Programming: Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
- Quality Assurance Metrics: Applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
- Solution Documentation: Documents information and solutions based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning.
- Solution Validation Testing: Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements.
- Data Quality: Identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making.
- Problem Solving: Solves problems, and may mentor others on effective problem solving, by using a systematic analysis process and industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented.
- Values differences: Recognizing the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications
College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience, required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience
Intermediate experience in a relevant discipline area is required. Knowledge of the latest technologies and trends in data engineering is highly preferred and includes:
- 5-8 years of experience
- Familiarity analyzing complex business systems, industry requirements, and/or data regulations
- Background in processing and managing large data sets
- Design and development for a Big Data platform using open-source and third-party tools
- Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework
- SQL query language
- Clustered compute cloud-based implementation experience
- Experience developing applications requiring large file movement for a cloud-based environment, and other data extraction tools and methods from a variety of sources
- Experience in building analytical solutions

Intermediate experience in the following is preferred:
- Experience with IoT technology
- Experience in Agile software development

Qualifications
- Work closely with the business Product Owner to understand the product vision.
- Play a key role across DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into the Cummins Digital Core (Azure Data Lake, Snowflake).
- Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards.
- Independently design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and data lakes.
- Responsible for creation, maintenance, and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs).
- Take part in evaluating new data tools and POCs and provide suggestions.
- Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization.
- Proactively address and resolve issues that compromise data accuracy and usability.

Preferred Skills
- Programming Languages: Proficiency in languages such as Python, Java, and/or Scala.
- Database Management: Expertise in SQL and NoSQL databases.
- Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks.
- Cloud Services: Experience with Azure, Databricks and AWS cloud platforms.
- ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes (a minimal sketch follows this listing).
- Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus.
- API: Working knowledge of APIs to consume data from ERP and CRM systems.
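The pipeline and data-quality duties above translate naturally into code. Purely as an illustration, and not Cummins' actual stack or standards, here is a minimal PySpark sketch of an ETL step with a simple data-quality gate; the bucket paths, column names, and the 2% null threshold are all invented for the example.

```python
# Minimal sketch of an ETL step with a data-quality gate, assuming PySpark.
# Paths, schema, and the 2% threshold are illustrative, not from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw = spark.read.parquet("s3://example-lake/raw/orders/")  # hypothetical source

clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("amount") >= 0)
)

# Simple integrity check: fail the run (which would trigger an alert in a
# scheduler such as Airflow) if too many rows are missing the business key.
total = clean.count()
null_keys = clean.filter(F.col("customer_id").isNull()).count()
if total and null_keys / total > 0.02:
    raise ValueError(f"Data-quality gate failed: {null_keys}/{total} null customer_id")

clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-lake/curated/orders/"  # hypothetical curated target
)
```

In a production pipeline, the failure branch would typically publish to a monitoring channel rather than just raising, matching the "monitoring and alert mechanisms" responsibility above.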
Posted 5 days ago
4.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description
GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/)
Supports, develops and maintains a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with the Business and IT teams to understand the requirements and best leverage the technologies to enable agile data delivery at scale.

Key Responsibilities
- Implements and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
- Implements methods to continuously monitor and troubleshoot data quality and data integrity issues.
- Implements data governance processes and methods for managing metadata, access, and retention of data for internal and external users.
- Develops reliable, efficient, scalable, quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages.
- Develops physical data models and implements data storage architectures per design guidelines.
- Analyzes complex data elements and systems, data flow, dependencies, and relationships in order to contribute to conceptual, physical, and logical data models.
- Participates in testing and troubleshooting of data pipelines.
- Develops and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB, others).
- Uses agile development technologies, such as DevOps, Scrum, Kanban, and continuous improvement cycles, for data-driven applications.

Competencies
- System Requirements Engineering: Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts.
- Collaborates: Building partnerships and working collaboratively with others to meet shared objectives.
- Communicates effectively: Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences.
- Customer focus: Building strong customer relationships and delivering customer-centric solutions.
- Decision quality: Making good and timely decisions that keep the organization moving forward.
- Data Extraction: Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users, using appropriate tools and technologies.
- Programming: Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
- Quality Assurance Metrics: Applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
- Solution Documentation: Documents information and solutions based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning.
- Solution Validation Testing: Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements.
- Data Quality: Identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making.
- Problem Solving: Solves problems, and may mentor others on effective problem solving, by using a systematic analysis process and industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented.
- Values differences: Recognizing the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications
College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience, required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience
4-5 years of experience. Relevant experience preferred, such as temporary student employment, an internship, a co-op, or other extracurricular team activities. Knowledge of the latest technologies in data engineering is highly preferred and includes:
- Exposure to open-source Big Data tools: Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework
- SQL query language
- Clustered compute cloud-based implementation experience
- Familiarity developing applications requiring large file movement for a cloud-based environment
- Exposure to Agile software development
- Exposure to building analytical solutions
- Exposure to IoT technology

Qualifications
- Work closely with the business Product Owner to understand the product vision.
- Participate in DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into the Cummins Digital Core (Azure Data Lake, Snowflake).
- Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards.
- Work under limited supervision to design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and data lakes.
- Responsible for creation of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs), with guidance and help from senior data engineers.
- Take part in evaluation of new data tools and POCs, with guidance and help from senior data engineers.
- Take ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization under limited supervision.
- Assist in resolving issues that compromise data accuracy and usability.
- Programming Languages: Proficiency in languages such as Python, Java, and/or Scala.
- Database Management: Intermediate expertise in SQL and NoSQL databases.
- Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks.
- Cloud Services: Experience with Azure, Databricks and AWS cloud platforms.
- ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes (see the sketch after this listing).
- API: Working knowledge of APIs to consume data from ERP and CRM systems.
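To make the pipeline and data-quality duties concrete at this level, here is a hedged pandas/SQLAlchemy sketch of a small ELT step with basic integrity checks before data is promoted downstream. The Snowflake DSN, table names, and thresholds are placeholders, not details from the posting.

```python
# Illustrative only: a tiny ELT-style load with a monitoring hook, using pandas
# and SQLAlchemy. The connection string and table names are invented.
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical DSN; snowflake-sqlalchemy uses this URL form.
engine = create_engine("snowflake://user:pass@account/db/schema")

src = pd.read_sql("SELECT * FROM staging.customers", engine)

# Basic integrity checks before promoting data downstream.
problems = []
if src["customer_id"].duplicated().any():
    problems.append("duplicate customer_id values")
if src["email"].isna().mean() > 0.05:
    problems.append("more than 5% missing emails")

if problems:
    # In a real pipeline this would page or post to an alert channel.
    raise RuntimeError("Data-quality alert: " + "; ".join(problems))

src.to_sql("customers", engine, schema="curated", if_exists="replace", index=False)
```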
Posted 5 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Job Description: Automation Tester
As a Test Lead / Architect, you are passionate about seeing customers succeed and ensuring best-in-class product quality. The objective of the role is end-to-end ownership of testing of the product, acting as the custodian of product quality. With the Digital Engineering team, you will have the opportunity to join a fast-growing team that is embarking on a multi-year implementation as part of an ongoing digital modernization effort. As the project team ramps up, you will have the chance to help define and shape the vision of how the solution will be maintained and monitored to meet the business' needs.

Experience Level: 5+ years

Roles and Responsibilities:
- Experience in telecom industry applications is a MUST.
- Experience in testing CRM, web applications, billing, and order management.
- Deep experience in system-level debugging (including customer issues), with a good understanding of managing and triaging production-level issues.
- Experience with a database query language such as SQL, in order to query and validate production data for analysis; exposure to database/ETL testing.
- Understanding of the Microsoft Azure cloud environment.
- Responsible for maintaining QE health through facilitated defect management using standardized triage, defect management, and communication processes.
- Monitor trends and improvement opportunities using Mean Time to Resolution, defect RCAs, and environment downtime as key performance indicators.
- Conduct daily defect reviews with key program stakeholders, including dev, test, product, and support teams, to ensure a path forward and reduce the defect burn-down for all prioritized defects.
- Monitor and ensure that overall defect processes are aligned with a standard one-defect process across all test phases and teams.
- Provide test estimation to leads for intake planning.
- Good understanding of API testing, with the ability to perform API testing using relevant tools like Postman, REST Assured, or SoapUI.
- Experience with a test/defect management tool (preferably JIRA/Zephyr).
- Partner with other leads and architects for test planning, assignment, and reporting.
- Monitor chats and host working sessions with impacted teams, ensuring there is a path forward and active investigation on defects until disposition.
- Escalation point for disputed defects and their resolution.
- Conduct root cause analysis for defects and ensure each defect is closed under the right root cause category.
- Mitigate impediments and foster a work environment that supports high-performing team dynamics, continuous workflow, and relentless improvement.
- Responsible for designing holistic test architecture and test solutions in alignment with business requirements and solution specifications.
- Work with the Test Data team on test data requirements and fulfillment.

Primary / Mandatory Skills:
- 5+ years' experience in product testing, with a minimum of 3+ years of experience in defect management/production validation testing.
- Proven experience in testing/defect management and triaging in a fast-paced software development environment.
- Experience using defect tracking tools like JIRA and creating reports using Power BI.
- Experience in system-level debugging (including customer issues), with a good understanding of managing and triaging production-level issues.
- Familiarity with a database query language such as SQL, in order to query and validate production data for analysis.
- Exposure to test automation (Selenium).
- Good understanding of API testing, with the ability to perform API testing using relevant tools like Postman, REST Assured, or SoapUI (see the sketch after this listing).
- Experience with a test/defect management tool (preferably JIRA/Zephyr).
- Expertise in preparing daily status reports/dashboards for management.
- Decision maker for entry and exit of internal/development testing stages.
- Able to coordinate proactively with the testing team, dev leads, and other members.
- Expertise in risk identification and analysis.
- Proven expertise in Agile software development, especially Scrum and Kanban.
- Experience in the telecom industry is an added advantage.
- Strong written and verbal communication skills.

Technical Skills: Selenium, Java, JavaScript, JMeter, REST Assured, SQL, Maven, Eclipse/VS Code, Bitbucket, JIRA, Jenkins, Git/GitHub, DevOps, Postman

Additional information: Willingness to work shift duties; willingness to learn is very important, as AT&T offers an excellent environment to learn digital transformation skills such as cloud.

Weekly Hours: 40
Time Type: Regular
Location: Bangalore, Karnataka, India

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.
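The listing names Postman, REST Assured, and SoapUI for API testing. As a language-neutral illustration of the same kind of check, here is a small pytest + requests sketch; the endpoint, payload, and response fields are hypothetical and would come from the system under test.

```python
# Hedged illustration of API test cases; the role's named tools are
# Postman/REST Assured/SoapUI, so treat this as an analogue, not the stack.
import requests

BASE_URL = "https://api.example.com"  # placeholder test environment


def test_create_order_returns_201_and_id():
    payload = {"customerId": "C-1001", "plan": "postpaid"}  # invented fields
    resp = requests.post(f"{BASE_URL}/orders", json=payload, timeout=10)
    assert resp.status_code == 201
    body = resp.json()
    assert "orderId" in body
    assert body["status"] == "CREATED"


def test_get_unknown_order_returns_404():
    resp = requests.get(f"{BASE_URL}/orders/does-not-exist", timeout=10)
    assert resp.status_code == 404
```

Run with `pytest` against a live test environment; in a defect-management workflow, failures here would be triaged into JIRA with the RCA category discussed above.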
Posted 5 days ago
0.0 - 10.0 years
0 Lacs
Tamil Nadu
On-site
Aditya Birla Money Limited
Senior Manager - Accounts Payable
Location: Chennai-HO-Guindy IE, Tamil Nadu

Position / Job Title (Proposed): Section Head - Accounts Payable
Designation: Manager
Function: Accounts
Department: Accounts
Reporting To (Title): HOD - Accounts
Superior's Superior (Title): CFO

1) Job Purpose
To be responsible for monitoring and authorizing the entire payment process of the company and ensuring funds of the company are used only for the specific approved purposes. Responsible for data security and confidentiality of sensitive information of the company. Responsible for complying with all statutory commitments by all means: payment, return filing, and certificate submission to the statutory bodies. To coordinate end-to-end on all audit deliverables, assure smooth completion of audits, and ensure expense accounting is reflected accurately in the financial statements of the company.

2) Dimensions (Other quantitative and important parameters for the job: budgets/volumes/no. of products/geography/markets/customers or any other parameter)
- Responsible for verifying and authorizing vendor payments, employee reimbursements, and business payout payments, and ensuring accounting entries in the books of accounts.
- Responsible for BRS across 14 banks.
- Information security and confidentiality of sensitive data to be maintained; it is the job holder's responsibility to ensure a process is in place for the same.
- Statutory payments of PF, ESI, LWF, GST, and TDS released on time, with evidence maintained for documentation purposes.
- Tax compliance of all payment-related entries; ensure no payment is released without deducting TDS, with the appropriate tax rate applied for tax withheld.
- Expense provisions for monthly, quarterly, and yearly book closure.
- End-to-end responsibility for data collation to meet auditors' requirements.
- Verification of sales team incentive workings and expense booking.
- Quarterly LR audit plan to be split into monthly plans, with data collection from other departments.
- Quarterly vendor ageing analysis and GL review.
- Drive automation initiatives as a regular process and implement them once the automation is through.
- Diplomatic query handling, with no inappropriate message communicated in replies.
- Every process to be documented by way of an SOP approved by the HOD.
- Fund management and arrangement for payment release.

3) Job Context & Major Challenges (What are the specific aspects of the job that provide a challenge to the jobholder in the context of the Unit/Zone?)
The job holder is responsible for validating the payment processing initiated by the maker and releasing payments: vendor payments, business payouts, and employee reimbursements. The next major duty is audit coordination; being a listed entity, ABML is subject to a quarterly LR audit. The audit plan is to be drawn up in discussion with functional owners, ensuring smooth completion of the audit by providing three months of data within the limited audit time. Periodic MIS to internal and external stakeholders, and query handling pertaining to the same. Certificates and reports in the prescribed form to be submitted to exchanges and other regulators. Responsible for reconciliations, ledger reviews, initiating automation requests, and preparing data dumps to meet MIS requirements. The major challenges are the even distribution of time to meet the various payment requests that come up for release simultaneously, and explaining the type of data required, consolidating it in the required form, and providing it to auditors within the timeline.
Making branch managers and executives engage with, and adhere to, the process is also a challenge to be overcome. Execute the plan of activities as per timelines. Make automation initiatives a continuous process and implement the same.

4) Principal Accountabilities (Accountability: Supporting Actions)
- Audit coordination: An audit plan is to be drawn up for every quarterly LR and internal audit and the yearly statutory audit, and executed as per the timeline set. Call for discussions with other departments, explain the audit plan, and get data delivered to auditors as per their standards.
- Payment release: Authorize payment for approved expenses and ensure no double or excess payments, with strict adherence to the process. Responsible for monitoring and verifying vendor payment requests processed by the maker and the related accounting entries. Checker for business payouts and incentive calculations as per approved schemes, with accurate and timely release.
- Accuracy: Periodic review and scrutiny of the ledgers by way of verification, DoA checks, tax compliance, budget, book entries, actual payment release, and bank instructions; investigate any abnormal ageing balances and initiate corrective action. Agreed TAT to be maintained. Complete accounting and payment activities for timely monthly closure of the books of accounts. Approve the accounting and release of all payments as per the DoA, and review the DoA at periodic intervals with input from all concerned department heads. Monitor JVs, provision entries, and capital expenditures. GL reconciliation. MIS and dashboards on BP payouts, circulated to stakeholders. Check the operations accounting entries pertaining to BP activities.
- Statutory compliances: Handle exchange inspections and provide data. NW certificates to be provided to the regulators on time. PMS audit certificates to be facilitated for PMS clients. Query handling for all stakeholders, internal and external.
- Automation initiatives: Constantly drive automation plans and coordinate with IT to make them live.

5) Job Purpose of Direct Reports
Responsible for calculating payouts for franchisee partners, direct selling agents, and branch sub-brokers, getting them approved by the reporting manager, and processing payment after accounting in the books. Responsible for collecting, verifying, checking approval for, accounting, and processing payment for employee reimbursements: mobile, travel, conveyance, and business promotion expenses. Head office and branch expense management: verify approval, budget, and correctness, and release payment on a timely basis. Business partner operational accounting: full and final settlements and exceptional payments. Submission of statutory certificates to regulators and auditors. Scrutinize books of accounts and ageing analysis reports.

Minimum Experience Level: 6 - 10 years
Job Qualifications: Post Graduate
Posted 5 days ago
0.0 years
0 Lacs
Hyderabad, Telangana
Remote
Data Scientist II
Hyderabad, Telangana, India
Date posted: Aug 01, 2025
Job number: 1854865
Work site: Up to 50% work from home
Travel: 0-25%
Role type: Individual Contributor
Profession: Research, Applied, & Data Sciences
Discipline: Data Science
Employment type: Full-Time

Overview
Security represents the most critical priorities for our customers in a world awash in digital threats, regulatory scrutiny, and estate complexity. Microsoft Security aspires to make the world a safer place for all. We want to reshape security and empower every user, customer, and developer with a security cloud that protects them with end-to-end, simplified solutions. The Microsoft Security organization accelerates Microsoft’s mission and bold ambitions to ensure that our company and industry is securing digital technology platforms, devices, and clouds in our customers’ heterogeneous environments, as well as ensuring the security of our own internal estate. Our culture is centered on embracing a growth mindset, a theme of inspiring excellence, and encouraging teams and leaders to bring their best each day. In doing so, we create life-changing innovations that impact billions of lives around the world.

The Defender Experts (DEX) Research team is at the forefront of Microsoft’s threat protection strategy, combining world-class hunting expertise with AI-driven analytics to protect customers from advanced cyberattacks. Our mission is to move protection left—disrupting threats early, before damage occurs—by transforming raw signals into intelligence that powers detection, disruption, and customer trust. We’re looking for a passionate and curious Data Scientist to join this high-impact team. In this role, you'll partner with researchers, hunters, and detection engineers to explore attacker behavior, operationalize entity graphs, and develop statistical and ML-driven models that enhance DEX’s detection efficacy. Your work will directly feed into real-time protections used by thousands of enterprises and shape the future of Microsoft Security. This is an opportunity to work on problems that matter—with cutting-edge data, a highly collaborative team, and the scale of Microsoft behind you.

Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Statistics, Applied Mathematics, Data Science, or a related quantitative field
- 3+ years of experience applying data science or machine learning in a real-world setting, preferably in security, fraud, risk, or anomaly detection
- Proficiency in Python and/or R, with hands-on experience in data manipulation (e.g., Pandas, NumPy), modeling (e.g., scikit-learn, XGBoost), and visualization (e.g., matplotlib, seaborn)
- Strong foundation in statistics, probability, and applied machine learning techniques
- Experience working with large-scale datasets, telemetry, or graph-structured data
- Ability to clearly communicate technical insights and influence cross-disciplinary teams
- Demonstrated ability to work independently, take ownership of problems, and drive solutions end-to-end

Responsibilities
- Understand complex cybersecurity and business problems, translate them into well-defined data science problems, and build scalable solutions.
- Design and build robust, large-scale graph structures to model security entities, behaviors, and relationships.
- Develop and deploy scalable, production-grade AI/ML systems and intelligent agents for real-time threat detection, classification, and response (a small illustrative sketch follows this listing).
- Collaborate closely with Security Research teams to integrate domain knowledge into data science workflows and enrich model development.
- Drive the end-to-end ML lifecycle: from data ingestion and feature engineering to model development, evaluation, and deployment.
- Work with large-scale graph data: create, query, and process it efficiently to extract insights and power models.
- Lead initiatives involving Graph ML, Generative AI, and agent-based systems, driving innovation across threat detection, risk propagation, and incident response.
- Collaborate closely with engineering and product teams to integrate solutions into production platforms.
- Mentor junior team members and contribute to strategic decisions around model architecture, evaluation, and deployment.

Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work.
- Industry leading healthcare
- Educational resources
- Discounts on products and services
- Savings and investments
- Maternity and paternity leave
- Generous time away
- Giving programs
- Opportunities to network and connect

Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
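As a rough illustration of the kind of modeling work described above, and emphatically not Microsoft's actual detection stack, here is a minimal scikit-learn sketch that scores fabricated sign-in telemetry for anomalies with an IsolationForest; all feature names and data are invented.

```python
# Hedged sketch: unsupervised anomaly scoring over synthetic telemetry.
# Feature names and distributions are fabricated for the example.
import numpy as np
import pandas as pd
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
telemetry = pd.DataFrame({
    "logins_per_hour": rng.poisson(3, 1000),
    "distinct_ips": rng.poisson(1.2, 1000),
    "failed_auth_ratio": rng.beta(1, 20, 1000),
})

model = IsolationForest(n_estimators=200, contamination=0.01, random_state=0)
telemetry["label"] = model.fit_predict(telemetry)

# -1 marks points the forest isolates quickly, i.e. candidate anomalies
# that a hunter or detection engineer would then triage.
suspicious = telemetry[telemetry["label"] == -1]
print(f"{len(suspicious)} records flagged for review")
```

In the graph-centric work the posting describes, features like these would typically be derived from entity relationships rather than raw counts, but the scoring-and-triage loop is the same shape.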
Posted 5 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka
On-site
General Information
Req #: WD00086226
Career area: Data Management and Analytics
Country/Region: India
State: Karnataka
City: Bangalore
Date: Friday, August 1, 2025
Working time: Full-time
Additional Locations: India - Karnātaka - Bangalore

Why Work at Lenovo
We are Lenovo. We do what we say. We own what we do. We WOW our customers. Lenovo is a US$57 billion revenue global technology powerhouse, ranked #248 in the Fortune Global 500, and serving millions of customers every day in 180 markets. Focused on a bold vision to deliver Smarter Technology for All, Lenovo has built on its success as the world’s largest PC company with a full-stack portfolio of AI-enabled, AI-ready, and AI-optimized devices (PCs, workstations, smartphones, tablets), infrastructure (server, storage, edge, high performance computing and software defined infrastructure), software, solutions, and services. Lenovo’s continued investment in world-changing innovation is building a more equitable, trustworthy, and smarter future for everyone, everywhere. Lenovo is listed on the Hong Kong stock exchange under Lenovo Group Limited (HKSE: 992) (ADR: LNVGY). This transformation, together with Lenovo’s world-changing innovation, is building a more inclusive, trustworthy, and smarter future for everyone, everywhere. To find out more visit www.lenovo.com, and read about the latest news via our StoryHub.

Description and Requirements
- BS/BA in Computer Science, Mathematics, Statistics, MIS, or related field.
- At least 5 years' experience in the data warehouse space.
- At least 5 years' experience in custom ETL/ELT design, implementation and maintenance (a minimal sketch follows this listing).
- At least 5 years' experience writing SQL statements.
- At least 3 years' experience with cloud-based data platform technologies such as Google BigQuery, or Azure/Snowflake data platform equivalents.
- Ability to manage and communicate data warehouse plans to internal clients.

NOTICE FOR PUBLIC
At Lenovo, we follow strict policies and legal compliance for our recruitment process, which includes role alignment, employment terms discussion, final selection and offer approval, and recording transactions in our internal system. Interviews may be conducted via audio, video, or in-person depending on the role, and you will always meet with an official Lenovo representative. Please beware of fraudulent recruiters posing as Lenovo representatives. They may request cash deposits or personal information. Always apply through official Lenovo channels and never share sensitive information. Lenovo does not solicit money or sensitive information from applicants and will not request payments for training or equipment. Kindly verify job offers through the official Lenovo careers page or contact IndiaTA@lenovo.com. Stay informed and cautious to protect yourself from recruitment fraud. Report any suspicious activity to local authorities.
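Since the role calls for custom ETL/ELT against platforms such as Google BigQuery, here is a hedged sketch of a typical ELT upsert using the google-cloud-bigquery client; the project, dataset, table, and column names are all placeholders, not details from the posting.

```python
# Illustrative ELT upsert in BigQuery; names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

merge_sql = """
MERGE `example-project.dw.customers` AS tgt
USING `example-project.staging.customers` AS src
ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN
  UPDATE SET tgt.email = src.email, tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN
  INSERT (customer_id, email, updated_at)
  VALUES (src.customer_id, src.email, src.updated_at)
"""

job = client.query(merge_sql)   # runs as a BigQuery job
job.result()                    # block until the MERGE completes
print(f"Rows affected: {job.num_dml_affected_rows}")
```

Pushing the transform into the warehouse like this (ELT rather than ETL) is a common choice when the platform, as here, is a scalable cloud engine.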
Posted 5 days ago
0.0 - 4.0 years
6 - 7 Lacs
Bengaluru, Karnataka
On-site
Walk-In Interview scheduled on 04-Aug-2025. Report between 10 and 11 AM. Please carry your laptop to take the Excel test.

Job Description:
- Knowledge of banking processes and contact centres
- Knowledge of advanced MS Excel formulas, pivots, and VBA
- Knowledge of BI tools
- Analytical skills
The candidate will support the BI team's BAU reports and ad hoc requirements. The candidate will also be part of stakeholder meetings, understanding requirements and producing reports independently.

Work Experience: 5+ years in a Contact Centre (preference for candidates with knowledge of banking processes and contact centre operations)
Skills Required: VBA, Advanced Excel, BI Tools, Power Pivot, Power Query (see the pivot sketch after this listing)
Work Model: Work from Office / Client Location
Annual CTC: Rs.6 LPA – Rs.7 LPA
Preference: Immediate Joiners & Male Candidates

BUSINESS LOCATION/INTERVIEW VENUE
WyzMindz Solutions Private Limited
Address: AROHANA, 19/3, 3rd Floor, Srinivasa Industrial Estate, Behind RMS International School & PU College, Kanakapura Rd, Konanakunte, Bengaluru, Karnataka 560062
Landmark: Near Yelachenahalli Metro Station, Kanakapura Road, Metro Pillar No. 127
Google Map: https://goo.gl/maps/mNN9R37hG4UsP4rN8

Job Type: Full-time
Pay: ₹600,000.00 - ₹700,000.00 per year
Benefits: Provident Fund
Experience: MIS: 4 years (Required)
Work Location: In person
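The role centers on Excel pivots, Power Pivot, and VBA; purely as an analogue, here is how the same BAU pivot-report logic might look in Python with pandas. The CSV path and column names are invented for the example.

```python
# Analogue of an Excel pivot report in pandas; file and columns are hypothetical.
import pandas as pd

calls = pd.read_csv("contact_centre_calls.csv")  # hypothetical data export

# Equivalent of an Excel pivot table: average handle time and call volume
# broken out by queue (rows) and week (columns).
report = pd.pivot_table(
    calls,
    index="queue",
    columns="week",
    values="handle_time_sec",
    aggfunc=["mean", "count"],
)

# Hand the result back to stakeholders in the format they expect.
report.to_excel("bau_weekly_report.xlsx")
```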
Posted 5 days ago
0 years
0 Lacs
Pune, Maharashtra, India
Remote
Senior MSSQL Database Administrator
Location: Pune, India (On-site)
Employment Type: Full-time

Job Overview
We are seeking a highly skilled and experienced Senior MSSQL Database Administrator to join our enterprise technology team at Pansoft Technologies LLC. This is a full-time, on-site role in Pune. You will be responsible for the comprehensive administration, design, troubleshooting, and management of Microsoft SQL Server databases, with a strong focus on cluster installation, high availability (HA), disaster recovery (DR), performance tuning, and replication. You will ensure the continuous availability, performance, and integrity of critical database systems.

Key Responsibilities
Database Administration and Troubleshooting:
- Provide expert-level database administration, including installation, configuration, patching, and upgrades of MSSQL Server instances.
- Proactively monitor database health, performance, and capacity, identifying and resolving complex issues.
- Perform root cause analysis for database-related incidents and implement preventive measures.

Database Design and Management:
- Collaborate on database design and schema modifications to support application development and optimization.
- Manage database objects, user permissions, and security configurations.
- Implement and enforce database best practices and standards.

Database Replication:
- Design, configure, and maintain various types of MSSQL Server replication (Snapshot, Transactional, Merge) to ensure data synchronization and consistency.
- Monitor replication lag and troubleshoot replication agents and distributors.

Cluster Installation and Setup:
- Lead cluster configuration: install and configure SQL Server clustering software, specifically focusing on SQL Server Always On Availability Groups.
- Ensure node configuration: configure and prepare each node in the cluster environment, verifying correct configurations, storage paths, and required access for each SQL Server instance.

Cluster High Availability (HA) and Disaster Recovery (DR) Setup:
- Implement failover mechanisms: configure automatic failover within SQL Server Always On Availability Groups to ensure high availability in case of node failure.
- Design and configure disaster recovery solutions: implement robust disaster recovery using Always On secondary replicas at geographically remote sites for minimal downtime and data loss.
- Develop comprehensive backup strategies for clustered databases: ensure accurate configuration of full, differential, and transaction log backups for the entire cluster, including cloud-based backups (e.g., Backup to URL) and replication to secondary nodes.

Database Instance and Resource Management:
- Manage CPU, memory, and I/O allocation across cluster nodes, configuring load balancing within SQL Server Always On to distribute query traffic across secondary replicas for optimal read performance.
- Conduct extensive clustered database performance tuning: monitor and optimize the performance of the SQL Server cluster, adjusting settings like MaxDOP and identifying bottlenecks.

Database Synchronization and Replication:
- Maintain data consistency across all nodes in the cluster, managing replication lag in SQL Server Always On and ensuring real-time synchronization.
- Configure and maintain data replication between primary and secondary cluster nodes for availability and redundancy.
- Handle and resolve any data conflicts that may arise during replication, ensuring continuous data consistency across all cluster nodes.
Database Cluster Monitoring and Alerts:
- Implement continuous monitoring of the SQL Server cluster for performance metrics, system health, and failover status, including disk space, CPU usage, memory consumption, and error/warning logs.
- Establish robust alerting mechanisms (e.g., SQL Server Agent, custom scripts) to notify administrators of failures, performance degradation, or other critical issues within the cluster (e.g., replication delays, node failure). A minimal monitoring sketch follows this listing.

Qualifications
- Database administration and troubleshooting skills.
- Experience in database design and database management.
- Experience in database replication.
- Excellent problem-solving and communication skills.
- Ability to work effectively in a team environment.
- Relevant certifications in MSSQL or database administration (e.g., MCSA: SQL 2016 Database Administration, MCSE: Data Management and Analytics).
- Bachelor's degree in Computer Science, Information Technology, or a related field.
(ref:hirist.tech)
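As a minimal sketch of the custom-script alerting duty above, a monitor could poll Always On replica health via pyodbc. The DMVs queried (sys.dm_hadr_availability_replica_states, sys.availability_replicas) are standard SQL Server views; the connection string is a placeholder, and real alerting would go to SQL Agent, email, or a paging system rather than a raised exception.

```python
# Hedged sketch: poll Always On replica health and flag unhealthy replicas.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=listener.example.local;"
    "DATABASE=master;Trusted_Connection=yes;"  # placeholder DSN
)

sql = """
SELECT ar.replica_server_name,
       hars.role_desc,
       hars.synchronization_health_desc
FROM sys.dm_hadr_availability_replica_states AS hars
JOIN sys.availability_replicas AS ar
  ON ar.replica_id = hars.replica_id
"""

unhealthy = []
for name, role, health in conn.execute(sql):
    print(f"{name}: {role}, {health}")
    if health != "HEALTHY":
        unhealthy.append(name)

if unhealthy:
    # Stand-in for a real alert channel (SQL Agent operator, pager, email).
    raise RuntimeError(f"Replica health degraded on: {', '.join(unhealthy)}")
```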
Posted 5 days ago
4.0 - 7.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Senior Full Stack Developer (.NET + SQL + Angular + AWS)
Experience: 4-7 Years
Location: Mumbai (Kalina, Santacruz)
Employment Type: Full-Time
Notice Period: Immediate to 7 Days
Industry Preference: Candidates with finance or banking domain experience will be preferred.

Overview:
We are seeking a Senior Full Stack Developer with strong backend expertise in .NET Core (C#) and SQL Server, along with working knowledge of Angular for frontend development. The ideal candidate should be capable of delivering high-performance, scalable backend services while also contributing to frontend components as needed. This role is best suited for someone who thrives in a fast-paced environment, has experience working with large datasets, and can handle full-stack responsibilities with a backend-first mindset.

Key Responsibilities
- Design, develop, and maintain scalable backend services using .NET Core / C#.
- Create, manage, and optimize complex SQL queries, stored procedures, triggers, and indexing strategies.
- Develop RESTful APIs and integrate with internal and external systems.
- Collaborate with frontend developers to build and maintain UI components using Angular (v10+).
- Ensure application performance, reliability, and security.
- Contribute to cloud deployment and architecture on AWS.
- Work closely with Product Owners and Business Analysts to translate business requirements into technical solutions.
- Conduct unit testing, participate in peer code reviews, and assist in system deployments.
- Maintain comprehensive technical documentation of systems and processes.

Skills & Qualifications:
- 4-7 years of experience in .NET Core / C# backend development.
- Strong hands-on experience with SQL Server (query optimization, indexing, performance tuning).
- Experience building and consuming RESTful APIs.
- Good understanding of Angular (v10+) and TypeScript.
- Working knowledge of AWS services like EC2, S3, Lambda, RDS (or equivalent experience on Azure/GCP).
- Familiarity with CI/CD pipelines, Git, and code versioning practices.
- Solid grasp of OOP principles, design patterns, and clean coding standards.
- Exposure to Agile/Scrum development methodologies.
- Excellent problem-solving and debugging skills.
- Strong verbal and written communication skills.

Nice To Have
- Experience in the finance or banking domain.
- Exposure to microservices architecture.
- Knowledge of containerization (Docker, Kubernetes).
- Experience with reporting tools like SSRS or Power BI.
- Understanding of automated testing frameworks and test-driven development.
(ref:hirist.tech)
Posted 5 days ago
4.0 - 8.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
We are seeking a highly analytical Quantitative Research Analyst to join our AIF Quant Fund team. The ideal candidate brings strong buy-side experience in systematic investment strategies and will play a key role in developing, implementing, and maintaining sophisticated quantitative models that drive our alternative investment approach.

About The Role
The Quantitative Research Analyst will be responsible for key areas that include model development, data infrastructure management, strategy testing, risk management, and technology and automation:

Model Development & Research
- Design and build multi-factor models for equity, fixed income, and alternative asset classes.
- Develop alpha generation signals and systematic trading strategies across multiple time horizons.
- Research and implement new quantitative factors using academic literature and market insights.
- Enhance existing models through continuous performance monitoring and iterative improvements.

Data Infrastructure & Analytics
- Manage large-scale financial datasets using Snowflake, SQL, and cloud-based platforms.
- Build automated data pipelines for real-time and historical market data processing.
- Ensure data quality and integrity, and optimize query performance for research workflows.
- Develop efficient storage solutions for multi-asset research environments.

Strategy Testing & Validation
- Conduct comprehensive backtesting across multiple market cycles using robust statistical methods.
- Perform out-of-sample testing, walk-forward analysis, and Monte Carlo simulations.
- Generate detailed performance attribution and risk decomposition analysis.
- Document model assumptions, limitations, and validation results.

Risk Management & Monitoring
- Build risk management frameworks including VaR, stress testing, and scenario analysis (see the sketch after this listing).
- Monitor portfolio exposures, concentration risks, and factor loadings in real time.
- Develop automated alerting systems for model degradation and performance anomalies.
- Support portfolio optimization and construction processes.

Technology & Automation
- Develop Python-based research and production systems with a focus on scalability.
- Create automated model monitoring, reporting, and alert generation frameworks.
- Collaborate on technology infrastructure decisions and platform evaluations.
- Maintain code quality and documentation standards.

Qualifications
Professional Experience:
- 4-8 years of buy-side quantitative research in asset management, hedge funds, or proprietary trading.
- Proven track record in systematic investment strategy development and implementation.
- Experience with institutional-grade quantitative research and portfolio management.

Technical Proficiency
- Programming: Advanced Python (pandas, numpy, scipy, scikit-learn, quantitative libraries).
- Database: Hands-on Snowflake and SQL experience with large-scale data environments.
- Analytics: Statistical modeling, econometrics, and machine learning techniques.
- Platforms: Bloomberg Terminal, Refinitiv, or equivalent financial data systems.

Quantitative Expertise
- Deep understanding of factor models, portfolio optimization, and systematic risk management.
- Knowledge of derivatives pricing, fixed income analytics, and alternative investment structures.
- Experience with market microstructure analysis and high-frequency data processing.
- Familiarity with performance attribution methodologies and benchmarks.

Communication & Analysis:
- Strong problem-solving abilities with exceptional attention to detail.
- Ability to translate quantitative insights into actionable investment recommendations.
Excellent presentation skills for communicating complex research to stakeholders. Collaborative approach to working in cross-functional investment teams.
Educational Background
Master's degree in Finance, Economics, Mathematics, Statistics, Physics, or Engineering. CQF, CFA, FRM, or equivalent professional certification preferred. Strong academic foundation with demonstrated quantitative aptitude.
Regulatory Awareness
Understanding of SEBI AIF regulations and compliance frameworks. Knowledge of investment management risk controls and regulatory reporting requirements.
Preferred Skills
Industry Recognition: Published quantitative research or contributions to investment thought leadership. Multi-Asset Expertise: Experience across equity, fixed income, commodities, and alternative investments. Innovation Mindset: Interest in machine learning, alternative data, and emerging quantitative techniques. Advanced Programming: Proficiency in additional languages such as R, C++, or Julia; experience with version control (Git) and code optimization techniques. Domain Specialization: Strong background in specific asset classes such as Indian equities and emerging markets. Entrepreneurial Drive: Self-motivated individual comfortable building scalable systems from the ground up in a growing AIF technology environment. Industry Certifications: Additional qualifications or specialized quantitative finance credentials will be a plus. Alternative Data & AI: Experience with NLP and AI techniques for extracting investment signals from alternative text data sources (such as filings, analyst reports, and transcripts) and developing reasoning-based AI models for systematic decision-making will be a plus.
Pay range and compensation package: Competitive with industry standards, including performance-based incentives. (ref:hirist.tech)
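To make the validation workflow above concrete, here is a minimal, hypothetical sketch of bootstrap-style Monte Carlo analysis of a strategy's Sharpe ratio in Python. The returns are simulated stand-ins, not the fund's models, and a production backtest would use a block bootstrap to preserve autocorrelation:

    import numpy as np

    # Hypothetical daily strategy returns standing in for out-of-sample
    # backtest output (~5 trading years).
    rng = np.random.default_rng(seed=42)
    daily_returns = rng.normal(loc=0.0004, scale=0.01, size=1250)

    def sharpe(returns: np.ndarray, periods: int = 252) -> float:
        """Annualized Sharpe ratio, assuming a zero risk-free rate."""
        return np.sqrt(periods) * returns.mean() / returns.std(ddof=1)

    # Bootstrap resampling: how stable is the Sharpe across resampled histories?
    n_sims = 10_000
    boot = np.array([
        sharpe(rng.choice(daily_returns, size=daily_returns.size, replace=True))
        for _ in range(n_sims)
    ])

    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"Point Sharpe: {sharpe(daily_returns):.2f}")
    print(f"95% bootstrap interval: [{lo:.2f}, {hi:.2f}]")

A wide interval around the point estimate is exactly the kind of result the "document model assumptions, limitations, and validation results" duty would flag.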
Posted 5 days ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Microsoft is a company where passionate innovators come to collaborate, envision what can be and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky's-the-limit thinking in a cloud-enabled world. Microsoft's Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. Within Azure Data, the data integration team builds data gravity on the Microsoft Cloud. Massive volumes of data are generated – not just from transactional systems of record, but also from the world around us. Our data integration products – Azure Data Factory and Power Query – make it easy for customers to bring in, clean, shape, and join data, to extract intelligence. We're the team that developed the Mashup Engine (M) and Power Query. We already ship monthly to millions of users across Excel, Power/Pro BI, Flow, and PowerApps; but in many ways we're just getting started. We're building new services, experiences, and engine capabilities that will broaden the reach of our technologies to several new areas – data "intelligence", large-scale data analytics, and automated data integration workflows. We plan to use example-based interaction, machine learning, and innovative visualization to make data access and transformation even more intuitive for non-technical users. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.
Responsibilities
Engine layer: designing and implementing components for dataflow orchestration, distributed querying, query translation (see the toy sketch below), connecting to external data sources, and script parsing/interpretation.
Service layer: designing and implementing infrastructure for a containerized, microservices-based, high-throughput architecture.
UI layer: designing and implementing performant, engaging web user interfaces for data visualization/exploration/transformation/connectivity and dataflow management.
Embody our culture and values.
Qualifications
Required/Minimum Qualifications
Bachelor's Degree in Computer Science, or related technical discipline AND 6+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python OR equivalent experience. Experience in data integration or migrations or ELT or ETL tooling is mandatory.
Preferred/Additional Qualifications
BS degree in Computer Science. Engine role: familiarity with data access technologies (e.g. ODBC, JDBC, OLEDB, ADO.Net, OData), query languages (e.g. T-SQL, Spark SQL, Hive, MDX, DAX), query generation/optimization, OLAP. UI role: familiarity with JavaScript, TypeScript, CSS, React, Redux, webpack. Service role: familiarity with micro-service architectures, Docker, Service Fabric, Azure blobs/tables/databases, high-throughput services. Full-stack role: a mix of the qualifications for the UI/service/engine roles.
Other Requirements
Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.
Equal Opportunity Employer (EOE)
#azdat #azuredata #microsoftfabric #dataintegration
Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
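As a flavor of the engine-layer "query translation" work this posting describes, here is a deliberately tiny Python sketch of pushing a structured filter down to a SQL source. It is an illustrative toy under assumed names, not Power Query's actual folding logic; real engines also parameterize values rather than inlining literals:

    from dataclasses import dataclass

    @dataclass
    class Filter:
        column: str
        op: str        # one of: =, <>, <, >, <=, >=
        value: object

    def to_sql(table: str, columns: list[str], filters: list[Filter]) -> str:
        """Translate a simple projection-plus-filter plan into a SQL query."""
        allowed = {"=", "<>", "<", ">", "<=", ">="}
        preds = []
        for f in filters:
            if f.op not in allowed:
                raise ValueError(f"unsupported operator: {f.op}")
            literal = f"'{f.value}'" if isinstance(f.value, str) else str(f.value)
            preds.append(f"{f.column} {f.op} {literal}")
        where = f" WHERE {' AND '.join(preds)}" if preds else ""
        return f"SELECT {', '.join(columns)} FROM {table}{where}"

    print(to_sql("Sales", ["Region", "Amount"],
                 [Filter("Amount", ">", 1000), Filter("Region", "=", "West")]))
    # SELECT Region, Amount FROM Sales WHERE Amount > 1000 AND Region = 'West'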
Posted 5 days ago
8.0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
About The Job
Asymbl is an innovative technology company that combines industry-specific products, digital workforce transformation, and deep Salesforce consulting expertise to drive growth and innovation. We deliver both advanced software and strategic services to help organizations modernize how work gets done. We pride ourselves on a culture of relentless curiosity and belief, grounded in trust and integrity, driven by a bias to action and willingness to fail fast while remaining unwaveringly customer-focused and dedicated to fostering the potential of our people.
Position Overview
The Senior Salesforce BI & Analytics Architect will lead the design and implementation of robust analytics solutions, leveraging Salesforce Data Cloud (formerly Customer Data Platform) to empower data-driven decision-making. This role focuses on integrating complex data sources, architecting scalable solutions, and delivering actionable insights through Salesforce Tableau CRM, Tableau, and other advanced analytics tools. The ideal candidate combines deep expertise in Salesforce ecosystems, strong business acumen, and the ability to translate data into impactful strategies.
Why Join Us?
Join Asymbl to shape the future of data-driven transformation. As a Senior Salesforce BI & Analytics Architect, you'll work on challenging projects that leverage Salesforce Data Cloud to deliver next-generation analytics. Be part of a collaborative, innovative team where your expertise will drive real business impact. We offer competitive compensation, professional growth opportunities, and a vibrant company culture that values continuous learning and innovation.
Responsibilities
Lead the design and architecture of Salesforce analytics solutions, with a focus on Salesforce Data Cloud, Tableau CRM, and Tableau. Integrate and harmonize data from diverse sources, ensuring data quality, consistency, and scalability. Design and implement customer-centric data models, leveraging the capabilities of Salesforce Data Cloud for real-time analytics and insights. Build advanced dashboards, reports, and visualizations that provide actionable insights to business users. Collaborate with business and technical stakeholders to understand reporting and analytics requirements, translating them into scalable solutions. Implement data governance, security, and compliance best practices within the Salesforce ecosystem. Optimize the performance of analytics solutions, ensuring efficient data processing and timely delivery of insights. Provide technical leadership and mentorship to junior architects, developers, and analysts. Stay abreast of emerging trends and innovations in data analytics, ensuring solutions leverage the latest technologies and practices.
Qualifications
Bachelor's degree in Computer Science, Data Analytics, or a related field; advanced degrees preferred. 8+ years of experience in BI/Analytics architecture, with at least 3 years specializing in Salesforce Data Cloud and analytics tools. Expertise in Salesforce Data Cloud, Tableau CRM (formerly Einstein Analytics), Tableau, and data modeling within the Salesforce ecosystem. Strong knowledge of data integration techniques, ETL processes, and APIs within Salesforce. Proven experience working with large-scale, complex datasets and building real-time analytics solutions. Deep understanding of data governance, security, and compliance standards, especially within Salesforce environments.
Hands-on experience with Salesforce Analytics Query Language (SAQL), Tableau Server/Online, and advanced dashboard design. Salesforce certifications such as Tableau CRM & Einstein Discovery Consultant or Data Architect are preferred. Excellent communication and stakeholder management skills, with the ability to present complex data concepts in a clear, concise manner. Familiarity with additional enterprise analytics tools or platforms (e.g., Power BI, Snowflake) is a plus. (ref:hirist.tech)
Posted 5 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About The Role
Are you a data storyteller who thrives on solving real business challenges with elegant and scalable BI solutions? Do you have an eye for detail, a knack for visualizing insights, and hands-on experience with Power BI and Looker? If yes, we're looking for you! We are on the hunt for a Senior BI Developer who can bring clarity to complexity. You will be a strategic partner in our analytics ecosystem, delivering high-impact dashboards and reports that drive critical decision-making across business functions.
What You'll Do
Architect & Build: Develop intuitive and visually compelling dashboards using Power BI and Looker, tailored to various business use cases. DAX Wizardry: Write and optimize advanced DAX queries to drive complex metrics and calculations with precision. Cross-Functional Collaboration: Partner with business teams, analysts, and data engineers to understand requirements, model data, and translate them into clear and actionable BI deliverables. Data Integrity Champion: Ensure consistency, accuracy, and performance across BI assets; own the quality and scalability of dashboards. Insight Generation: Go beyond dashboards; use your analytical mindset to discover trends, anomalies, and opportunities hidden in the data. Process Automation: Identify repetitive processes and automate them using Power Query, LookML, or SQL to improve reporting efficiency (see the Python sketch after this posting). Innovation & Best Practices: Stay abreast of BI trends, recommend tool enhancements, and drive BI maturity within the team.
Must-Haves
At least 5 years of relevant BI development experience (total experience: 6+ years preferred). Advanced expertise in Power BI, including Power Query, DAX, and report optimization. Strong working knowledge of Looker and LookML. Proficient in SQL for data modeling, transformations, and querying large datasets. Solid understanding of data warehouse concepts, ETL pipelines, and data architecture. Experience handling large-scale datasets and performance tuning. Bachelor's Degree in Computer Science, Engineering, or a related field. Excellent verbal and written communication, with the ability to explain technical concepts to non-technical users.
Good-to-Haves
Exposure to cloud platforms (Azure, GCP, or AWS). Familiarity with Agile project management practices. Knowledge of Python or R for deeper analytics and custom data wrangling. Experience integrating BI tools with source systems and APIs.
Why You'll Love This Role
Work on business-critical projects with full visibility and ownership. Join a forward-thinking analytics team where your inputs shape data culture. Flexibility through a hybrid model that respects your time and productivity. Career growth through exposure to a diverse tech stack and real-time challenges. Engage with leaders who appreciate innovation, transparency, and continuous learning. (ref:hirist.tech)
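Since Python is listed as a good-to-have for automation and data wrangling, here is a minimal sketch of the kind of repetitive check a BI developer might automate before a dashboard refresh; the table and column names are hypothetical:

    import pandas as pd

    # Hypothetical extract that feeds a Power BI / Looker dashboard.
    extract = pd.DataFrame({
        "order_id": [1, 2, 2, 3],
        "region": ["West", "East", "East", None],
        "revenue": [120.0, 90.0, 90.0, 45.0],
    })

    def validate_extract(df: pd.DataFrame) -> list[str]:
        """Run consistency checks before the extract is loaded."""
        issues = []
        if df["order_id"].duplicated().any():
            issues.append("duplicate order_id rows (deduplicate before load)")
        if df["region"].isna().any():
            issues.append("null region values (breaks dashboard slicers)")
        if (df["revenue"] < 0).any():
            issues.append("negative revenue values")
        return issues

    problems = validate_extract(extract)
    print("\n".join(problems) if problems else "extract is clean")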
Posted 5 days ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Security represents the most critical priorities for our customers in a world awash in digital threats, regulatory scrutiny, and estate complexity. Microsoft Security aspires to make the world a safer place for all. We want to reshape security and empower every user, customer, and developer with a security cloud that protects them with end-to-end, simplified solutions. The Microsoft Security organization accelerates Microsoft's mission and bold ambitions to ensure that our company and industry is securing digital technology platforms, devices, and clouds in our customers' heterogeneous environments, as well as ensuring the security of our own internal estate. Our culture is centered on embracing a growth mindset, a theme of inspiring excellence, and encouraging teams and leaders to bring their best each day. In doing so, we create life-changing innovations that impact billions of lives around the world.
We are seeking passionate cybersecurity professionals to join our growing team of Defenders. In this role, you will proactively detect, investigate, and respond to advanced threats across enterprise environments using cutting-edge, AI-enabled security tools and threat intelligence. The ideal candidate combines strong security expertise with a curious mindset and the skills to conduct deep threat analysis.
Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
Responsibilities
Monitor, triage, and respond to security incidents using alerts and incidents from Microsoft Defender products (MDE, MDI, MDO, MDA, MDC, etc.). Perform proactive threat hunting using hypotheses and telemetry from endpoints, identities, cloud, and network. Develop hunting queries using Kusto Query Language (KQL) or similar to uncover suspicious patterns and behaviors (see the illustrative sketch below). Investigate security incidents across hybrid environments and contribute to root cause analysis and containment strategies. Collaborate with internal teams (defender, threat intelligence, engineering) to enhance detection logic, develop automations, and improve incident response workflows. Contribute to incident documentation, detection playbooks, and operational runbooks. Stay current with evolving threat landscapes, cloud attack vectors, and advanced persistent threats (APT).
Qualifications
Graduate degree in engineering or an equivalent discipline. 3–7 years of experience in cybersecurity (SOC, IR, Threat Hunting, Red Team). Hands-on experience with SIEM, EDR, and cloud-native security tools (Microsoft XDR, Sentinel, CrowdStrike, etc.). Experience with at least one cloud platform (Azure, AWS, GCP) and its associated security services and configurations. Proficiency in KQL, Python, or similar scripting languages for data analysis and automation. Strong knowledge of MITRE ATT&CK, the Cyber Kill Chain, and adversary TTPs. Familiarity with operating system internals (Windows, Linux) and endpoint/network forensics. Certifications like CISSP, OSCP, CEH, GCIH, AZ-500, SC-200 or similar/equivalent are a plus. Microsoft is an equal opportunity employer.
Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
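The hunting queries this role describes are typically written in KQL against Defender or Sentinel tables. Purely as an illustration, the same hypothesis ("flag accounts with an unusual number of failed sign-ins") can be sketched in Python/pandas on made-up telemetry, with the caveat that a production hunt would baseline per-account history rather than use a fixed threshold:

    import pandas as pd

    # Hypothetical sign-in telemetry; in practice this is exported from the SIEM.
    signins = pd.DataFrame({
        "account": ["alice", "alice", "bob", "bob", "bob", "bob", "bob"],
        "result":  ["fail", "ok", "fail", "fail", "fail", "fail", "ok"],
    })

    fails = (
        signins[signins["result"] == "fail"]
        .groupby("account")
        .size()
        .rename("failed_signins")
    )

    # Flag accounts whose failure count exceeds a simple threshold.
    THRESHOLD = 3
    suspects = fails[fails > THRESHOLD]
    print(suspects)  # bob: 4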
Posted 5 days ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Microsoft's Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. Within Azure Data, the data integration team builds data gravity on the Microsoft Cloud. Massive volumes of data are generated – not just from transactional systems of record, but also from the world around us. Our data integration products – Azure Data Factory and Power Query – make it easy for customers to bring in, clean, shape, and join data, to extract intelligence. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.
Responsibilities
Build cloud-scale products with a focus on efficiency, reliability, and security. Build and maintain end-to-end build, test, and deployment pipelines. Deploy and manage massive Hadoop, Spark, and other clusters (see the sketch below). Contribute to the architecture & design of the products. Triage issues and implement solutions to restore service with minimal disruption to the customer and business. Perform root cause analysis, trend analysis, and post-mortems. Own components and drive them end to end, all the way from gathering requirements, development, testing, and deployment to ensuring high quality and availability post-deployment. Embody our culture and values.
Qualifications
Required/Minimum Qualifications
Bachelor's Degree in Computer Science, or related technical discipline AND 4+ years technical engineering experience with coding in languages and frameworks like C#, React, Redux, TypeScript, JavaScript, Java, or Python OR equivalent experience. Experience in data integration or data migrations or ELT or ETL tooling is mandatory.
Other Requirements
Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.
Equal Opportunity Employer (EOE)
#azdat #azuredata #microsoftfabric #dataintegration
Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
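As a taste of the cluster workloads named in the responsibilities, here is a minimal PySpark job; the dataset, column names, and the storage path in the comment are assumptions for illustration only:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("event-counts").getOrCreate()

    # Hypothetical event log; a real job would read from cloud storage,
    # e.g. spark.read.parquet("abfss://container@account/events/").
    events = spark.createDataFrame(
        [("ingest", "ok"), ("ingest", "error"), ("transform", "ok")],
        schema=["stage", "status"],
    )

    # Count failures per pipeline stage.
    summary = (
        events.filter(F.col("status") == "error")
              .groupBy("stage")
              .count()
    )
    summary.show()
    spark.stop()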
Posted 5 days ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Summary
We are seeking a highly skilled .NET Full Stack Developer with 5-7 years of hands-on experience in designing, developing, and maintaining enterprise-grade applications. The ideal candidate will have deep expertise in the Microsoft technology stack, strong frontend and backend development capabilities, proven experience in database management, and the ability to deliver high-quality, bug-free code under tight deadlines.
Responsibilities
Design, develop, test, and deploy robust .NET applications using C#, ASP.NET, MVC, and .NET Core. Build and integrate web services (SOAP/REST) and third-party APIs/SDKs. Develop responsive, device-independent UIs using HTML, DHTML, CSS, JavaScript, AJAX. Create scalable backend services and business logic. Design and implement interactive reports and dashboards. Optimize SQL queries and stored procedures for maximum performance. Manage and administer MS SQL Server and MySQL databases. Perform database backup, recovery, tuning, and security tasks. Host and configure applications on IIS or Apache web servers. Debug, troubleshoot, and resolve application issues efficiently. Enhance and maintain existing systems by identifying defects and areas of improvement. Ensure high-quality code delivery with minimal bugs. Collaborate with cross-functional teams to define, design, and ship new features. Adhere to project timelines and participate in code reviews.
Technical Skills
Expert in C#, .NET Framework (3.5+), .NET Core, ASP.NET, MVC. Strong experience with WCF, Web APIs, and Web Methods.
Frontend: Proficient in HTML, CSS, DHTML, JavaScript, AJAX. Experience with UI/UX design and device-responsive development.
Databases: Advanced knowledge of MS SQL Server and MySQL. Strong in DBA tasks, Stored Procedures, Query Optimization, and Performance Tuning. Proficient in Database Design, Backup/Restore, and Security.
Integration & Reporting: Web service integration using JSON, XML. Experience with third-party APIs/SDKs. Dashboard and report development.
Soft Skills
Good team player with excellent collaboration skills. Strong problem-solving and debugging ability. High commitment to code quality and on-time delivery. Ability to work independently under pressure and tight deadlines.
Nice to Have
Experience in mobile app development (Web, Hybrid, Native). Knowledge of DevOps or CI/CD.
Education
Bachelor's or Master's Degree in Computer Science, IT, or related field. (ref:hirist.tech)
Posted 5 days ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Selected Intern's Day-to-day Responsibilities Include Developing web dashboard features using HTML, CSS, and PHP. Creating graphs, charts, and tables. Writing optimized code to minimize server query load. About Company: TestRight Nanosystems is developing the world's first pocket molecular sensor which allows you to find out the exact composition of the material around you to know what you are paying for. TestRight Nanosystems is a hardware startup incubated at IIT Delhi and has been awarded at international forums for its cutting-edge technology in optoelectronics.
Posted 5 days ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities
The primary responsibility of the supervisor is to ensure that his/her subordinates are developed to become successful in their given or potential role; coaching and mentoring are therefore very important. The minimum requirement is to coach your staff at least twice a month for mid and top performers; bottom performers (your focus people) should have a minimum of one coaching session a week. Conduct daily and weekly huddles to discuss strategies to address performance or challenges, provide updates, and drive performance; the agenda should be prepared prior to the meeting, and the meeting should always be documented. Attendance and Schedule Adherence: ensure that subordinates are reporting on time and on the days that they have a shift; the supervisor should be able to drive attendance at all times, including punctuality and break schedule compliance. Update all required data as needed based on company requirements (MyGPS, EWS, LH, Allsec, CLL, Peoplesoft, etc.). Inventory management, including allocation: ensure inflow and outflow are properly managed and monitored based on capacity, and provide trends and an action plan, including a burn-down plan, if there is a sudden surge in volume or decreased capacity due to shrinkage. Respond to queries and escalations, then provide feedback to business partners; a deep dive should be conducted and shared with leaders within a 24-hour TAT, including an action plan for the specific person/issue and how it will be prevented in the future. Ensure that agents are aware of their performance (daily, weekly and monthly progress); the performance scorecard should be reviewed at least twice a month so agents know where they stand. Review audit markdowns and conduct error analysis and process-related coaching, including root cause analysis. Ensure rebuttals are done when necessary and submitted in a timely manner. Complete internal audits in a timely manner, as required. Review adjustment requests and approve those which are valid for write-off. Complete your own production required by your processes (minimum of 40 accounts per month). Ensure accurate documentation of coaching sessions provided and attended; this should be uploaded in ORBIT and includes, but is not limited to, performance coaching, behavior coaching, and retention conversations. Ensure the accuracy of all data and reports submitted, including the End of Day report. Update dashboards, trackers, business review files, and scorecards in a timely manner, when needed. Ensure performance boards are updated daily and agents have visibility of them. Attend scheduled meetings, trainings, and calibration sessions. Provide purposeful and actionable development feedback to direct reports and monitor to support their performance improvement.
If the action items of the development plan do not yield the desired positive results in the agreed-upon timeframe, a CAP is initiated in accordance with Optum policies and practices. Study trends and analyses of team performance and be able to devise SMART action plans to address team/individual challenges. Ensure your own and your subordinates' adherence to company policies and procedures; strict implementation should be reinforced, and corrective action issued as deemed necessary. Request needed learning from the trainers or SMEs, Leads, and process experts. Drive meal adherence of 98% (PHL only) and manage utilization at a minimum of 71.42% for PH and 75.78% for IND. Provide floor/virtual support to ensure that agents are assisted in real time. Take complete accountability for the team's performance and actions. Create career paths for the subordinates you manage. Create your succession plan. Drive compliance and the success of Vital Signs, Bright Ideas, engagement activities, and other company/process initiatives. Take on any additional task that may be required by the process he/she belongs to. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications
Undergo process training and pass ramp certification. 2+ years of experience in Hospital Revenue Cycle Management. 1+ years of management/leadership experience. Thorough understanding of insurance policies and procedures. Technical knowledge and a working knowledge of medical terminology. Basic computer skills; must understand Excel. Proven excellent written and verbal communication. Proven high sense of responsibility and accountability; takes ownership and initiative. Proven excellent communication capability; persuasive, inclusive, and encouraging; the ability to listen and understand; ability to elicit cooperation from a variety of resources. Proven to be adaptable and flexible, with the ability to handle ambiguity and sometimes changing priorities. Proven professional demeanor and positive attitude; customer service orientation. Proven to possess a personal presence that is characterized by a sense of honesty, integrity, and caring, with the ability to inspire and motivate others to promote the philosophy, mission, vision, goals, and values of Optum and our client organization(s). Proven ability to think and act: decisiveness, assertiveness, and the ability to achieve results quickly. Proven ability to learn, understand, and apply new technologies, methods, and processes. Proven ability to recognize necessary changes in priority of tasks and allocation of resources, and bring them to the attention of Optum Leadership, as required. Proven ability to be a self-starter and work independently to move projects successfully forward. Proven ability to work with a variety of individuals in managerial and staff level positions. Demonstrates a positive leadership shadow by shaping positive behaviors in areas of influence, building integrity, influencing our values and creating a healthy, high-performance environment. #NTRCM
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
Posted 5 days ago
3.0 - 5.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
Remote
Description
There's likely a reason you've taken the time out of your busy day to review this opportunity at PulsePoint. Maybe you're in need of a change or there's "an itch you're looking to scratch." Whatever the reason may be, listen to what some of our team members are saying about working here:
"My manager takes the time to not only identify my next career move, but the steps that will take me there. I even have my own personal training budget that I'm encouraged to spend."
"Our input is valued and considered. Everyone has a voice and that goes a long way in ensuring that we're moving towards a shared goal."
"The Leadership team is incredibly open on their goals and how I contribute to the larger company mission. We all know where we fit and how we can make an impact every day."
PulsePoint is growing, and we're looking for a Data Analyst to join our Data Analytics team!
A BIT ABOUT US: PulsePoint is a fast-growing healthcare technology company (with adtech roots) using real-time data to transform healthcare. We help brands and agencies interpret the hard-to-read signals across the health journey and unify these digital determinants of health with real-world data to produce the most dimensional view of the customer. Our award-winning advertising platforms use machine learning and programmatic automation to seamlessly activate this data, making marketing, predictive analytics, and decision support easy and instantaneous. The most exciting part about working at PulsePoint is the enormous potential for personal and professional growth.
Data Analyst
Our Analysts take full ownership of complex data workflows and help drive innovation across PulsePoint's analytics products like Signal and Omnichannel. They build scalable solutions, automate manual processes, and troubleshoot issues across teams. By turning raw data into clear, actionable insights, they support both internal stakeholders and external clients. Data Analysts on the Data Analytics team work closely with Product Managers, BI, Engineering, and Client Teams to deliver high-impact analysis, reporting, and feature enhancements that shape the future of our data and analytics platforms.
THE PRODUCT YOU'LL BE WORKING ON: You'll be working on HCP365, the core technology of PulsePoint's analytics products, Signal and Omnichannel. HCP365 (the first Signal product) is an AWARD-WINNING product, having won a MarTech Breakthrough Award and been a finalist for the PM360 Trailblazer Award. It's the only health analytics and measurement solution that provides a complete, always-on view of HCP audience engagement across all digital channels with advanced data logic, automation, and integrations. This gives marketers and researchers unprecedented access to data insights and reporting used to inform and optimize investment decisions. You'll be helping scale the platform further with new features, cleaner attribution, and smarter automation to support client growth and product expansion.
WHAT YOU'LL BE DOING: This is a hybrid role at the intersection of data analysis, operational support, and technical platform ownership. Your work will directly contribute to the accuracy, scalability, and performance of our attribution and measurement solutions.
Take ownership of key workflows like HCP365 and Omnichannel. Support Omnichannel at an enterprise level across the whole organization. Build and improve BigQuery SQL-based pipelines and Airflow DAGs (a minimal DAG sketch follows this posting). Conduct R&D and contribute towards roadmap planning for new product features. Support client teams with data deep dives, troubleshooting, and ad hoc analysis. Translate complex data into simple, client-facing insights and dashboards. Dig into data to answer and resolve client questions.
REQUIRED QUALIFICATIONS: Minimum 3-5 years of relevant experience in: Understanding of deterministic and probabilistic attribution methodologies. Proficiency in analyzing multi-device campaign performance and user behavior. Excellent problem-solving and data analysis skills; ability to organize large data sets to answer critical questions, extrapolate trends, and tell a story. Writing and debugging complex SQL queries from scratch using real business data. Strong understanding of data workflows, joins, deduplication, attribution, and QA. Working with Airflow workflows, ETL pipelines, or scheduling tools. Proficiency in Excel (pivot tables, VLOOKUP, formulas, functions). Understanding of web analytics platforms (Google Analytics, Adobe Analytics, etc.). Experience with at least one BI software (Tableau, Looker, etc.). Able to work 9am-6pm EST (6:30pm-3:30am IST); we are fine with remote work. Note that this role is for India only and we do not plan on transferring hires to the U.S./UK in the future.
PREFERRED QUALIFICATIONS: Python for automation or workflow logic. Basic experience with: designing data pipelines and optimizing them; working on AI agents, automation tools, or workflow scripting; dashboard design and data storytelling. And one of: ELT experience; experience with automation; a statistics background. Exposure to: health-related datasets, hashed identifiers; workflow optimization or code refactoring; project management tools like JIRA or Confluence. Bonus if you've worked on R&D or helped build data products from scratch.
WHAT WE'RE LOOKING FOR: We're looking for a hands-on, reliable, and proactive Analyst who can: jump into complex workflows and own them end-to-end; troubleshoot issues and bring clarity in ambiguous situations; balance deep technical work and cross-team collaboration; and build scalable, automated, and accurate solutions.
SELECTION PROCESS: Initial Phone Screen; SQL Screening Test via CodeSignal (35 minutes); SQL Live Coding Interview (60 minutes); Hiring Manager Interview (30 minutes); Team Interview (1:1s with Sr. Client Analyst, Team Manager, SVP of Data, and the Product Manager who built Signal) (3 x 45 minutes).
RED FLAGS FOR US: Candidates won't succeed here if they haven't worked closely with data sets or have simply translated requirements created by others into SQL without a deeper understanding of how the data impacts our business and, in turn, our clients' success metrics.
Watch this video here to learn more about our culture and get a sense of what it's like to work at PulsePoint!
WebMD and its affiliates is an Equal Opportunity/Affirmative Action employer and does not discriminate on the basis of race, ancestry, color, religion, sex, gender, age, marital status, sexual orientation, gender identity, national origin, medical condition, disability, veterans status, or any other basis protected by law.
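For the "Airflow DAGs" responsibility above, here is a minimal sketch of a daily two-task pipeline. The DAG id, task names, and function bodies are hypothetical, and the "schedule" argument assumes Airflow 2.4+ (older versions use "schedule_interval"):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_signals(**context):
        # Placeholder: pull raw engagement data (e.g., from BigQuery).
        print("extracting raw HCP engagement data")

    def build_report(**context):
        # Placeholder: aggregate into a client-facing attribution table.
        print("building attribution summary")

    with DAG(
        dag_id="hcp_attribution_daily",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_signals",
                                 python_callable=extract_signals)
        report = PythonOperator(task_id="build_report",
                                python_callable=build_report)
        extract >> report  # report runs only after extraction succeeds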
Posted 5 days ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description
Process Manager - GCP Data Engineer
Mumbai/Pune | Full-time (FT) | Technology Services
Shift Timings - EMEA (1pm-9pm) | Management Level - PM | Travel Requirements - NA
The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role requires the ability to identify discrepancies and propose optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.
Process Manager Roles And Responsibilities
Participate in stakeholder interviews, workshops, and design reviews to define data models, pipelines, and workflows. Analyse business problems and propose data-driven solutions that meet stakeholder objectives. Experience working on-premise as well as on cloud platforms (AWS/GCP/Azure). Should have extensive experience in GCP with a strong focus on BigQuery, and will be responsible for designing, developing, and maintaining robust data solutions to support analytics and business intelligence needs (GCP is preferable over AWS & Azure). Design and implement robust data models to efficiently store, organize, and access data for diverse use cases. Design and build robust data pipelines (Informatica / Fivetran / Matillion / Talend) for ingesting, transforming, and integrating data from diverse sources. Implement data processing pipelines using various technologies, including cloud platforms, big data tools, and streaming frameworks (optional). Develop and implement data quality checks and monitoring systems to ensure data accuracy and consistency (see the sketch after this posting).
Technical And Functional Skills
Bachelor's Degree with 5+ years of experience, including 3+ years of relevant hands-on experience in GCP with BigQuery. Good knowledge of at least one database scripting platform (Oracle preferable). Work would involve analysis, development of code/pipelines at a modular level, reviewing peers' code, performing unit testing, and owning push-to-prod activities. 5+ years of work experience as an individual contributor. Direct interaction and deep dives with VPs of deployment. Should work with cross-functional teams/stakeholders. Participate in backlog grooming and task prioritization. Experience working with Scrum methodology. GCP certification desired.
About EClerx
eClerx is a global leader in productized services, bringing together people, technology and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth.
With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.
About eClerx Technology
eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com
eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
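To make the "data quality checks" responsibility concrete, here is a small sketch using the google-cloud-bigquery client; the project, dataset, table, and 1% threshold are hypothetical placeholders:

    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials

    # Hypothetical check: row count and null rate on a freshly loaded table.
    sql = """
        SELECT
          COUNT(*) AS row_count,
          COUNTIF(customer_id IS NULL) AS null_ids
        FROM `my-project.analytics.daily_orders`  -- hypothetical table
        WHERE load_date = CURRENT_DATE()
    """

    row = next(iter(client.query(sql).result()))
    if row.row_count == 0:
        raise RuntimeError("no rows loaded today; upstream pipeline failure?")
    if row.null_ids / row.row_count > 0.01:
        raise RuntimeError("more than 1% of customer_id values are NULL")
    print(f"quality check passed: {row.row_count} rows")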
Posted 5 days ago
1.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Step into the role of Fraud Analyst at Barclays, where you will play a pivotal role in managing operations within a business area and maintaining processes, risk management initiatives, and compliance with relevant regulators. You will take ownership of your work, ensuring it aligns with the relevant rules & regulations and codes of conduct.
To be successful as a Fraud Analyst at Barclays, you should have the critical skills below:
Minimum 1 year of relevant experience. Graduate/post-graduate in any discipline. Excellent communication skills. Blended process with outbound calls. Experience in customer service and handling fraud-related queries in banking. Experience working in BPO/KPO in a voice or blended process. UK shift: flexibility in hours of work and ability to work changing shift patterns.
Deal with customer queries and ensure appropriate resolution is offered in order to manage customer experience and NPS (Net Promoter Score). Process transactions in accordance with approved processes and procedures, international regulations, and pre-agreed service levels, with speed and accuracy. Handle customer requests with the support of clearly defined scripts and processes. Adhere to the Service Level Agreements specified by the client/process and ensure adherence to time schedules. May have the authority to release and verify funds across a variety of systems used by the department. Deal with customers and, where required, identify, log, and escalate complaints and resolve queries by tracking them internally, escalating upwards if necessary. Responsible for the management of your own daily workload, ensuring telephone enquiries/queries are dealt with in an efficient and knowledgeable manner to meet the demands of the business grades of service, and ensuring that all unresolved cases/queries are allocated to the respective areas. Responsible for identifying potential loss situations and promptly escalating them to the relevant areas to minimise the risk to the business and the customer. Adhere to organization-wide information security policies and procedures. Assist the Team Manager/Process Expert in administration of the section, including organizing workflow, queue management, and query resolution. Adhere to quality control disciplines, procedures, and checks at all times. Handle day-to-day query resolution, with upward escalation of more complex queries to the management team. Report issues and concerns to seniors/team leaders/managers, etc. as soon as possible and with complete information, so that effort can be made to prevent or limit possible damage/fraud. Responsible for driving your own performance management, collating relevant documentation, and preparing for and arranging self-performance reviews. Thoroughly resolve queries and identify improvements to processes. Maintain effective performance by being adaptable and positive in approach to dynamic business circumstances, questioning procedures, and proactively seeking solutions. Knowledge of risk awareness, audit disciplines, and controls.
You may be assessed on key essential skills relevant to succeed in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Noida.
Purpose of the role
To provide exceptional customer service while resolving more complex customer needs/requests.
Accountabilities
Provision of customer service through various communication channels including chat, email and phone.
Execution of customer service requirements to resolve more complex, specific customer needs, and give a unique, personalised resolution for each case. Collaboration with teams across the bank to align and integrate customer care processes. Identification of areas for improvement to provide recommendations for change in customer care processes, and provision of feedback and coaching for colleagues on these highlighted areas. Development and implementation of customer care procedures and controls to mitigate risks and maintain efficient operations. Resolution of specific customer inquiries and issues related to the bank's products and services, including account balances, transactions and payments. Development and execution of reports and presentations on customer care performance, and communication of findings to internal senior stakeholders. Identification of industry trends and developments to implement best practice to improve customer care efficiency and effectiveness.
Analyst Expectations
To meet the needs of stakeholders/customers through operational excellence and customer service. Perform prescribed activities in a timely manner and to a high standard. No people leadership roles at this grade. Execute work requirements as identified in processes and procedures, collaborating with and impacting on the work of team members. Identify escalation of policy breaches as required. Take responsibility for customer service and operational execution tasks. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulation and codes of conduct. Gain and maintain an understanding of your own role and how the team integrates to achieve overall objectives, alongside knowledge of the work of other teams within the function. Work within well-defined procedures that may involve a variety of work routines. Demonstrate an understanding of the procedures. Evaluate and select the appropriate alternatives from defined options. Make judgements based on the analysis of factual information. Build relationships with stakeholders and customers to identify and address their needs, in support of a smooth operating process, handling sensitive issues as required.
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
Posted 6 days ago
60.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview
Job Title: Business Control Officer - Control Governance, AVP
Corporate Title: Assistant Vice President
Location: Pune, India
Role Description
About DWS Group
DWS Group (DWS) is one of the world's leading asset managers with EUR 1,010 bn of assets under management (as of 31 March 2025). Building on more than 60 years of experience and a reputation for excellence in Germany and across Europe, DWS has come to be recognized by clients globally as a trusted source for integrated investment solutions, stability, and innovation across a full spectrum of investment disciplines.
We offer individuals and institutions access to our strong investment capabilities across all major asset classes and solutions aligned to growth trends. Our diverse expertise in Active, Passive and Alternatives asset management – as well as our deep environmental, social and governance focus – complement each other when creating targeted solutions for our clients. Our expertise and the on-the-ground knowledge of our economists, research analysts and investment professionals are brought together in one consistent global CIO View, which guides our investment approach strategically.
DWS wants to innovate and shape the future of investing: with approximately 3,500 employees in offices all over the world, we are local while being one global team. We are investors – entrusted to build the best foundation for our clients' future.
What We'll Offer You
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best in class leave policy. Gender neutral parental leaves. 100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Employee Assistance Program for you and your family members. Comprehensive Hospitalization Insurance for you and your dependents. Accident and Term Life Insurance. Complimentary health screening for those 35 yrs. and above.
Your Key Responsibilities
This specific role is to act as Business Control Officer with responsibility for governing and enhancing the DWS Control Inventory. This role will support the India DCO office with the activities outlined below:
Support in governing the DWS Control Inventory, including identifying, analysing and reporting data quality issues or gaps in documentation. Support in migrating the control inventory to the strategic technology platform. Monitor and understand changes made across the DB Group Control Inventory, to reflect where relevant into the DWS Control Inventory. Support execution of the monthly controls governance forum. Assess end-to-end business processes to identify significant gaps and determine issue root causes. Apply critical thinking skills to substantive testing techniques to thoroughly evaluate the effectiveness of high-risk business processes. Collaborate with cross-functional teams and stakeholders to support control design and effectiveness.
Your Skills And Experience
Bachelor's degree in information security or a related field required, with a preference towards a master's degree. Demonstrated ability to analyse complex issues, develop and implement risk mitigation strategies, and communicate effectively with senior stakeholders.
Proficient knowledge of risk management frameworks, regulations, and industry best practices. Strong experience in the Risk & Control Management domain. Experience and proficiency in managing voluminous spreadsheets, Power Query, and associated technical skills. At least 5 years' experience in banking or asset management. Knowledge of Risk & Control management workflow suites or related tools/platforms; specific experience in this regard will be preferred. Knowledge of Control Metrics & Assessment/Assurance Methodologies; specific experience in this regard will be preferred. Strong communication skills (written/spoken). Skilled at working with minimal supervision. Strong analytical and strategic mindset, along with the ability to collaborate with different stakeholders, including top management representatives.
How We'll Support You
Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.
About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 6 days ago
60.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview
Job Title: Senior Business Control Officer
Corporate Title: Assistant Vice President
Location: Pune, India
Role Description
DWS Group (DWS) is one of the world's leading asset managers with EUR 1,010 bn of assets under management (as of 31 March 2025). Building on more than 60 years of experience and a reputation for excellence in Germany and across Europe, DWS has come to be recognized by clients globally as a trusted source for integrated investment solutions, stability and innovation across a full spectrum of investment disciplines.
We offer individuals and institutions access to our strong investment capabilities across all major asset classes and solutions aligned to growth trends. Our diverse expertise in Active, Passive and Alternatives asset management – as well as our deep environmental, social and governance focus – complement each other when creating targeted solutions for our clients. Our expertise and the on-the-ground knowledge of our economists, research analysts and investment professionals are brought together in one consistent global CIO View, which guides our investment approach strategically.
DWS wants to innovate and shape the future of investing: with approximately 3,500 employees in offices all over the world, we are local while being one global team. We are investors – entrusted to build the best foundation for our clients' future.
What We'll Offer You
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best in class leave policy. Gender neutral parental leaves. 100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Employee Assistance Program for you and your family members. Comprehensive Hospitalization Insurance for you and your dependents. Accident and Term Life Insurance. Complimentary health screening for those 35 yrs. and above.
Your Key Responsibilities
This role will support Global DCO activities as outlined below:
Execute DWS' Findings Reporting process, with select deliverables to Senior Management (Executive Board, Risk and Control Committee, Operating Committee, and certain Deutsche Bank committees) on a regular basis. Continuously improve the underlying reporting processes, driving quality end-to-end and further increasing their usefulness to our stakeholders. Own and improve the Forecasting methodology, working closely with Finding Owners and Divisional COO teams to understand the development of DWS' risk profile in the area of Findings. Partner with Risk Assessment and Control Monitoring teams to identify key risk and control indicators.
Your Skills And Experience
Graduate with a strong academic background and relevant experience. Strong English communication skills (oral and written). Excellent Microsoft Office (Excel, Word, PowerPoint incl. think-cell) capabilities, with a proven track record of automating reporting tasks in Power Query, Power Automate, Qlik, or Tableau. Good understanding of the overall Deutsche Bank / DWS risk environment. Ability to coordinate with global management teams. Ability to independently pursue individual tasks to full completion.
Perseverance and accuracy are required.
How We'll Support You
Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.
About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 6 days ago