5.0 - 8.0 years
5 - 8 Lacs
Noida
Work from Office
Position: Lead Finance FP&A / R2R / Financial Analyst
Experience: 5-8 years
Location: Noida
Shift: General (can be rotational)
Communication: Excellent verbal and written skills required

Role & Responsibilities:
- Manage a small team to ensure smooth operations
- Prepare and maintain reports and dashboards, including profitability analysis
- Automate reports and routine tasks for better efficiency
- Calculate commissions and bonuses, and resolve related queries
- Track project costs and monitor budgets
- Assist in surveys and respond to RFPs
- Create data models using BI tools
- Develop financial models such as budgets and forecasts

Preferred Background: Experience in FP&A, the R2R process, or as a Financial Analyst

If interested, please share your resume at Parul.singh1@artech.com
Regards, Parul Singh
Posted 1 week ago
5.0 - 10.0 years
19 - 20 Lacs
Bengaluru
Remote
Hi Candidates, we have job openings with one of our MNC clients. Interested candidates can apply here and share details to chandrakala.c@i-q.co
Note: Notice period 0-15 days only (currently serving).

Role & responsibilities
We are looking for Data Managers.
Work Exp: Min 5 yrs (mandatory)
Location: Remote (India)

JD: The data modeler designs, implements, and documents data architecture and data modeling solutions, which include the use of relational, dimensional, and NoSQL databases. These solutions support enterprise information management, business intelligence, machine learning, data science, and other business interests.

The successful candidate will:
- Be responsible for the development of conceptual, logical, and physical data models and the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL).
- Oversee and govern the expansion of the existing data architecture and the optimization of data query performance via best practices.
The candidate must be able to work independently and collaboratively.

Responsibilities
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Perform hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.

Skills
- Bachelor's or master's degree in computer/data science or related technical experience.
- 5+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols).
- Experience with data warehouses, data lakes, and enterprise big data platforms in multi-datacenter contexts required.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER/Studio, or others) required.
- Experience in team management, communication, and presentation.
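For illustration only, a minimal sketch of the kind of dimensional (star-schema) model this JD describes. Table and column names are hypothetical, and SQLite stands in for whatever target RDBMS the role would actually use:

```python
import sqlite3  # stand-in for any RDBMS; all names below are hypothetical

DDL = [
    """CREATE TABLE dim_customer (
           customer_key INTEGER PRIMARY KEY,
           customer_id  TEXT NOT NULL,   -- natural key from the source system
           name         TEXT,
           region       TEXT
       )""",
    """CREATE TABLE dim_date (
           date_key  INTEGER PRIMARY KEY,  -- e.g. 20240131
           full_date TEXT,
           month     INTEGER,
           year      INTEGER
       )""",
    """CREATE TABLE fact_sales (
           customer_key INTEGER REFERENCES dim_customer(customer_key),
           date_key     INTEGER REFERENCES dim_date(date_key),
           quantity     INTEGER,
           amount       REAL
       )""",
]

conn = sqlite3.connect(":memory:")
for stmt in DDL:
    conn.execute(stmt)  # one fact table joined to conformed dimensions
conn.commit()
```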
Posted 1 week ago
5.0 - 9.0 years
14 - 24 Lacs
Hyderabad
Hybrid
Experience Required:
- Bachelor's degree in computer science or engineering.
- 7+ years of experience with data analytics, data modeling, and database design.
- 5+ years of experience with Vertica.
- 2+ years of coding, scripting (Python, Java, Scala), and design experience.
- 2+ years of experience with Airflow.
- Experience with ELT methodologies and tools.
- Experience with GitHub.
- Expertise in tuning and troubleshooting SQL.
- Strong data integrity, analytical, and multitasking skills.
- Excellent communication, problem-solving, organizational, and analytical skills.
- Able to work independently.

Additional/preferred skills:
- Familiarity with the agile project delivery process.
- Knowledge of SQL and its use in data access and analysis.
- Ability to manage diverse projects impacting multiple roles and processes.
- Able to troubleshoot problem areas and identify data gaps and issues.
- Ability to adapt to a fast-changing environment.
- Experience designing and implementing automated ETL processes.
- Experience with the MicroStrategy reporting tool.
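A minimal Airflow sketch of the scheduled ELT orchestration this posting asks for. The DAG id, task name, and load function are hypothetical, and the `schedule` argument assumes Airflow 2.4+ (earlier versions use `schedule_interval`):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load():
    # Placeholder for an ELT step, e.g. a COPY into Vertica
    # via the vertica-python client.
    print("loading daily batch...")


with DAG(
    dag_id="daily_elt",              # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load = PythonOperator(
        task_id="extract_and_load",
        python_callable=extract_and_load,
    )
```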
Posted 1 week ago
12.0 - 22.0 years
25 - 32 Lacs
Chennai, Bengaluru
Hybrid
Technical Manager, Business Intelligence
Location: Chennai/Bangalore
Experience: 15+ Years
Employment Type: Full Time

Role Description:
We are seeking an experienced Technical Manager to lead our Business Intelligence function. This role is crucial for transforming raw data into actionable insights that drive strategic decision-making. The ideal candidate will be a thought leader in BI, adept at guiding a team, collaborating with stakeholders to understand business requirements, and leveraging advanced BI tools and methodologies to deliver impactful dashboards, reports, and analytical solutions.

Responsibilities
- Drive the vision and strategy for Business Intelligence, promoting data-driven decision-making across the organization.
- Lead, mentor, and develop a team of BI developers and analysts, fostering expertise in data visualization, reporting, and analytical best practices.
- Oversee the design, development, and deployment of interactive dashboards, reports, and analytical applications that meet diverse business needs.
- Ensure that insights are presented clearly, concisely, and compellingly to various audiences, enabling effective business action.
- Work closely with pre-sales, sales, marketing, Data Engineering, Data Science, and other departments to identify key performance indicators (KPIs), define reporting requirements, and support data-driven initiatives.
- Collaborate with Data Engineering to ensure data accuracy, consistency, and reliability within BI solutions.
- Evaluate and recommend new BI tools, techniques, and platforms to enhance reporting capabilities and user experience.

Tools & Technologies
- BI Platforms: Tableau, Power BI, Qlik Sense, Looker, Domo
- Data Warehousing/Lakes: Snowflake, Google BigQuery, Amazon Redshift, MS Fabric
- SQL Databases: PostgreSQL, MySQL, SQL Server, Oracle
- Data Modeling: Star Schema, Snowflake Schema, Data Vault
- ETL/ELT Concepts: understanding of data extraction, transformation, and loading processes
- Programming Languages: SQL (advanced), Python (for data manipulation/analysis), R
- Cloud Platforms: experience with BI services on AWS, Azure, or GCP
- Data Governance Tools: Collibra, MS Purview
- Version Control: Git
Posted 1 week ago
12.0 - 22.0 years
25 - 32 Lacs
Chennai, Bengaluru
Work from Office
Technical Manager, Data Engineering
Location: Chennai/Bangalore
Experience: 15+ Years
Employment Type: Full Time

Role Description:
We are looking for a seasoned Technical Manager to lead our Data Engineering function. This role demands a deep understanding of data architecture, pipeline development, and data infrastructure. The ideal candidate will be a thought leader in the data engineering space, capable of guiding and mentoring a team, collaborating effectively with various business units, and driving the adoption of cutting-edge tools and technologies to build robust, scalable, and efficient data solutions.

Responsibilities
- Define and champion the strategic direction for data engineering, staying abreast of industry trends and emerging technologies.
- Lead, mentor, and develop a high-performing team of data engineers, fostering a culture of technical excellence, innovation, and continuous learning.
- Design, implement, and maintain scalable, reliable, and secure data pipelines and infrastructure; ensure data quality, integrity, and accessibility.
- Oversee the end-to-end delivery of data engineering projects, ensuring timely completion, adherence to best practices, and alignment with business objectives.
- Partner closely with pre-sales, sales, marketing, Business Intelligence, Data Science, and other departments to understand data needs, propose solutions, and support resource deployment for active data projects.
- Evaluate, recommend, and implement new data engineering tools, platforms, and methodologies to enhance capabilities and efficiency.
- Identify and address performance bottlenecks in data systems, ensuring optimal data processing and storage.

Tools & Technologies
- Cloud Platforms: AWS (S3, Glue, EMR, Redshift, Athena, Lambda, Kinesis), Azure (Data Lake Storage, Data Factory, Databricks, Synapse Analytics), Google Cloud Platform (Cloud Storage, Dataflow, Dataproc, BigQuery)
- Big Data Frameworks: Apache Spark, Apache Flink, Apache Kafka, HDFS
- Data Warehousing/Lakes: Snowflake, Databricks Lakehouse, Google BigQuery, Amazon Redshift, Azure Synapse Analytics
- ETL/ELT Tools: Apache Airflow, Talend, Informatica, dbt, Fivetran, Stitch
- Data Modeling: Star Schema, Snowflake Schema, Data Vault
- Databases: PostgreSQL, MySQL, MongoDB, Cassandra, DynamoDB
- Programming Languages: Python (Pandas, PySpark), Scala, Java
- Containerization/Orchestration: Docker, Kubernetes
- Version Control: Git
Posted 1 week ago
7.0 - 12.0 years
22 - 27 Lacs
Hyderabad, Pune, Mumbai (All Areas)
Work from Office
Job Description - Snowflake Developer
Experience: 7+ years
Location: India, Hybrid
Employment Type: Full-time

Job Summary
We are looking for a Snowflake Developer with 7+ years of experience to design, develop, and maintain our Snowflake data platform. The ideal candidate will have strong expertise in Snowflake SQL, data modeling, and ETL/ELT processes to build efficient and scalable data solutions.

Key Responsibilities
1. Snowflake Development & Implementation
- Design and develop Snowflake databases, schemas, tables, and views
- Write and optimize complex SQL queries, stored procedures, and UDFs
- Implement Snowflake features (Time Travel, Zero-Copy Cloning, Streams & Tasks)
- Manage virtual warehouses, resource monitors, and cost optimization
2. Data Pipeline & Integration
- Build and maintain ETL/ELT pipelines using Snowflake and tools like Snowpark, Python, or Spark
- Integrate Snowflake with cloud storage (S3, Blob Storage) and data sources (APIs)
- Develop data ingestion processes (batch and real-time) using Snowpipe
3. Performance Tuning & Optimization
- Optimize query performance through clustering, partitioning, and indexing
- Monitor and troubleshoot data pipelines and warehouse performance
- Implement caching strategies and materialized views for faster analytics
4. Data Modeling & Governance
- Design star schema, snowflake schema, and normalized data models
- Implement data security (RBAC, dynamic data masking, row-level security)
- Ensure data quality, documentation, and metadata management
5. Collaboration & Support
- Work with analysts, BI teams, and business users to deliver data solutions
- Document technical specifications and data flows
- Provide support and troubleshooting for Snowflake-related issues

Required Skills & Qualifications
- 7+ years in database development, data warehousing, or ETL
- 3+ years of hands-on Snowflake development experience
- Strong SQL and scripting (Python, Bash) skills
- Experience with Snowflake utilities (SnowSQL, Snowsight)
- Knowledge of cloud platforms (AWS, Azure) and data integration tools
- SnowPro Core Certification (preferred but not required)
- Experience with Coalesce, dbt, Airflow, or other data orchestration tools
- Familiarity with CI/CD pipelines and DevOps practices
- Knowledge of data visualization tools (Power BI, Tableau)
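A minimal sketch of the Snowpipe and Streams & Tasks features named above, issued through the Snowflake Python connector. Connection parameters and all object names (stage, tables, pipe, task) are hypothetical placeholders:

```python
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder credentials and context.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="ANALYTICS_WH", database="RAW", schema="PUBLIC",
)
cur = conn.cursor()

# Continuous ingestion from an external stage via Snowpipe.
cur.execute("""
    CREATE PIPE IF NOT EXISTS raw.public.events_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw.public.events
    FROM @raw.public.events_stage
    FILE_FORMAT = (TYPE = 'JSON')
""")

# Change tracking on the landing table, plus a scheduled task
# that merges new deltas downstream every five minutes.
cur.execute(
    "CREATE STREAM IF NOT EXISTS raw.public.events_stream "
    "ON TABLE raw.public.events"
)
cur.execute("""
    CREATE TASK IF NOT EXISTS raw.public.load_events
    WAREHOUSE = ANALYTICS_WH
    SCHEDULE = '5 MINUTE'
    AS INSERT INTO analytics.public.events_clean
       SELECT * FROM raw.public.events_stream
""")
cur.execute("ALTER TASK raw.public.load_events RESUME")
```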
Posted 1 week ago
7.0 - 9.0 years
20 - 25 Lacs
Hyderabad
Work from Office
At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and the disruptions of the future.

We are looking to hire Tableau Professionals in the following areas:

Experience: 7-9 Years
- Prepare the required data model in Tableau from the source files
- Build the required dashboards based on the wireframes designed
- Expertise in Tableau dashboard development
- Expert in Tableau data model setup
- Strong experience in SQL
- Ensure compliance with data governance and security policies
- Work closely with business and dev teams to translate business/functional requirements into technical specifications that drive Big Data solutions to meet functional requirements

Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture
Posted 1 week ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Role Summary
We are seeking a strategic and hands-on Senior Manager to lead our Fraud Strategy function. This person will own the development and performance of fraud models and scorecards, work closely with cross-functional partners to define fraud policies, and ensure effective collaboration with our fraud operations team. The ideal candidate combines strong analytical skills, a solid understanding of fraud typologies in consumer lending, and a pragmatic approach to implementation.

Key Responsibilities
- Lead fraud strategy for the lending portfolio, balancing risk mitigation with customer experience and approval rates.
- Develop and maintain fraud detection models and anomaly detection systems using internal and third-party data (a sketch follows below).
- Manage fraud scorecard performance and recommend model improvements or policy changes as needed.
- Collaborate with fraud operations to monitor real-time fraud trends and adapt policies dynamically.
- Partner with product, data engineering, and external vendors to evolve fraud detection infrastructure.
- Provide regular fraud performance reporting and deep-dives for senior leadership.
- Act as subject matter expert on fraud data, model outputs, and cross-channel vulnerabilities.

Qualifications
- 5-8 years of experience in fraud strategy, analytics, or credit risk in financial services or fintech.
- Strong understanding of identity fraud, synthetic fraud, first-party fraud, and third-party fraud patterns.
- Hands-on experience with fraud models, machine learning tools, and scorecard management.
- Advanced proficiency in SQL and Python for data analysis and modeling.
- Experience working with third-party fraud data providers and integrating fraud rules or signals into decision engines.
- Ability to communicate insights and recommendations clearly to technical and non-technical stakeholders.
- Exposure to US consumer lending regulations and risk management practices preferred.
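A minimal anomaly-detection sketch of the kind this role oversees, using scikit-learn's IsolationForest on a hypothetical feature matrix; real pipelines would use engineered transaction and identity signals rather than random data:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical feature matrix: one row per application/transaction
# (e.g. amount, velocity counts, device-risk signals).
rng = np.random.default_rng(42)
X = rng.normal(size=(10_000, 5))

model = IsolationForest(contamination=0.01, random_state=42)
model.fit(X)

scores = -model.score_samples(X)   # higher score = more anomalous
flags = model.predict(X) == -1     # True where flagged as an outlier
print(f"flagged {flags.sum()} of {len(X)} records for manual review")
```

In practice the scores would feed a scorecard or decision engine alongside rule-based signals, with thresholds tuned against fraud-operations capacity.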
Posted 1 week ago
12.0 - 16.0 years
30 - 45 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role & responsibilities

Title: Data Architect - Enterprise Data Management
Experience: 12+ years
No. of Positions: 1
Location: Delhi NCR, Bangalore, Pune (Hybrid)

Job Summary:
We are seeking a seasoned Data Architect with 12+ years of experience in designing scalable, metadata-driven data architectures. This individual will lead initiatives spanning enterprise metadata management, federated query design, real-time data integration, and semantic layer modeling. The ideal candidate will be hands-on, able to translate complex requirements into robust solutions, and collaborate closely with business, technology, and governance stakeholders. Strong communication and documentation skills are essential, as this role operates at the intersection of data strategy, engineering, and enterprise governance.

Must Have Skills:
• 12+ years of experience in data architecture and engineering, with deep expertise in metadata-driven, federated, and real-time data environments
• Core Competencies
  o Enterprise Metadata Management: design and implementation of automated data discovery, cataloguing, and lineage tracking across heterogeneous data platforms
  o Federated Query Architecture: building unified data access layers that abstract complexity across multiple data sources and query engines (see the sketch after this posting)
  o Real-time Data Integration: event-driven architectures for continuous metadata synchronization and schema evolution management
  o Data Governance Frameworks: establishing automated data quality, privacy compliance, and access control patterns at enterprise scale
  o Semantic Layer Design: creating business-friendly data models that bridge technical schemas with analytical requirements
• Technical Proficiencies
  o Programming: Python (data engineering libraries), SQL (advanced optimization), Scala/Java
  o Data Modeling: dimensional modeling, graph databases, semantic web technologies
  o Search & Discovery: full-text search engines, vector similarity search, ML-based data classification
  o API Architecture: REST, GraphQL, and gRPC for data service exposure
  o Streaming Platforms: message queuing and event streaming architectures
• Track record of effective collaboration with Data Engineers, Governance Leads, BI/Analytics Developers, ML Engineers, and Product Owners on complex data initiatives
• Demonstrated ability to produce data architecture diagrams and lineage flows and to maintain high-quality documentation standards
• Excellent written and verbal communication skills, with the ability to interact with executive sponsors, technology teams, and governance stakeholders
• Self-driven, hands-on architect with a lead-by-doing mindset for solution validation, issue resolution, and cross-team enablement

Nice to Have Skills:
• Implementation experience with enterprise knowledge graphs
• Understanding of data mesh and data fabric architectural approaches
• Experience with MLOps environments and integration of feature stores
• Execution of multi-cloud data strategies (e.g., AWS, Azure, GCP)
• Exposure to vector search, ML-based classification, and automated data discovery
• Familiarity with full-text search engines and search-driven metadata environments

Role & Responsibilities:
• Architect, implement, and evolve the enterprise metadata and data architecture to enable discovery, quality, and governance at scale
• Lead the design of a federated query layer that abstracts data access across distributed platforms and technologies
• Define and implement semantic layers for business-friendly data modeling and reporting enablement
• Develop and enforce data governance rules via architectural automation and controls
• Collaborate across functions with Data Engineers, Analytics Teams, Governance Stakeholders, UI/UX Designers, and ML Engineering teams to ensure architectural alignment and delivery
• Enable real-time metadata synchronization, schema tracking, and classification pipelines
• Create and maintain data architecture documentation, lineage maps, and solution artifacts
• Support strategic initiatives involving data mesh, knowledge graph, MLOps, and cloud-native data ecosystems
• Drive solutioning, reviews, and standards as a technical advisor and hands-on architect
• Communicate architectural vision, design decisions, and roadmap updates to executives, product owners, and technical teams

Key Skills: metadata management, data architecture, semantic modeling, federated queries, real-time integration, data governance, Python, SQL, Scala, graph modeling, GraphQL, Kafka, API design, data mesh, knowledge graph

Education:
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
• Certifications in Cloud Architecture, Metadata Management, or Data Modeling (e.g., DAMA, DCAM, TOGAF) preferred
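A toy sketch of the federated-query idea referenced above: a thin access layer that routes one logical query to several backends and unions the tagged results. The backend names are hypothetical, and two in-memory SQLite databases stand in for heterogeneous source systems:

```python
import sqlite3
from typing import Iterable


class FederatedQueryLayer:
    """Routes one logical query across several DB-API backends (toy sketch)."""

    def __init__(self, backends: dict):
        self.backends = backends  # name -> DB-API connection

    def query_all(self, sql: str) -> Iterable[tuple]:
        for name, conn in self.backends.items():
            for row in conn.execute(sql):
                yield (name, *row)  # tag each row with its source system


# Hypothetical source systems.
crm, billing = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for db in (crm, billing):
    db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm.execute("INSERT INTO customers VALUES (1, 'Asha')")
billing.execute("INSERT INTO customers VALUES (2, 'Ravi')")

layer = FederatedQueryLayer({"crm": crm, "billing": billing})
print(list(layer.query_all("SELECT id, name FROM customers")))
```

A production federated layer would add a shared metadata catalog, per-source dialect translation, and pushdown optimization; engines like Trino or Presto implement this pattern at scale.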
Posted 1 week ago
1.0 - 5.0 years
3 - 7 Lacs
Gurugram
Work from Office
Coordination with Engineering, Service, and SMC for oils and paints, and with field sales (Parts, Accessories team). Coordination with oil and paint vendors and AIS 140 for regular activities to increase sales, understanding best practices, etc. Preparation and management of Sr. Management decks. Handling Suzuki Connect complaints and resolution. Developing strategies to generate additional revenue through data analytics and business insights. Coordination with the field team for query handling and resolution. Sales forecasting and identifying sales trends to improve process efficiency. Managing periodic audits and annual budget coordination.

Strong knowledge of Channel Management - Dealers & Distribution. Proficiency in MS Excel and data modelling. Knowledge of Power BI is preferred. Data analysis and data visualization with the ability to handle large data sets. Strong interpersonal skills and a collaborative approach. Key Account Management skills also preferred.
Posted 1 week ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Job Overview:
We are seeking a skilled DataOps Engineer with a strong foundation in DevOps practices and data engineering principles. The ideal candidate will be responsible for ensuring smooth deployment, observability, and performance optimization of data pipelines and platforms. You will work at the intersection of software engineering, DevOps, and data engineering, bridging gaps between development, operations, and data teams.

Key Responsibilities:
- Design, implement, and manage CI/CD pipelines using tools such as Jenkins, Git, and Terraform.
- Manage and maintain Kubernetes (K8s) clusters for scalable and resilient data infrastructure.
- Develop and maintain observability tools and dashboards (e.g., Prometheus, Grafana, ELK stack) for monitoring pipeline and platform health (see the sketch after this posting).
- Automate infrastructure provisioning and deployments using Infrastructure as Code (IaC) tools, preferably Terraform.
- Collaborate with data engineers to debug, optimize, and track performance of data pipelines (e.g., Airflow, Airbyte).
- Implement and monitor data quality, lineage, and orchestration workflows.
- Develop custom scripts and tools in Python to enhance pipeline reliability and automation.
- Work closely with data teams to manage and optimize Snowflake environments, focusing on performance tuning and cost efficiency.
- Ensure compliance with security, scalability, and operational best practices across the data platform.
- Act as a liaison between development and operations to maintain SLAs for data availability and reliability.

Required Skills & Experience:
- 4-8 years of experience in DevOps/DataOps/platform engineering roles.
- Proficient in managing Kubernetes clusters and associated tooling (Helm, Kustomize, etc.).
- Hands-on experience with CI/CD pipelines, especially using Jenkins, GitOps, and automated testing frameworks.
- Strong scripting and automation skills in Python.
- Experience with workflow orchestration tools like Apache Airflow and data ingestion tools like Airbyte.
- Solid experience with Infrastructure as Code tools, preferably Terraform.
- Familiarity with observability and monitoring tools such as Prometheus, Grafana, Datadog, or New Relic.
- Working knowledge of data platforms, particularly Snowflake, including query performance tuning and monitoring.
- Strong debugging and problem-solving skills, especially in production data pipeline scenarios.
- Excellent communication skills and ability to collaborate across engineering, operations, and analytics teams.

Preferred Qualifications:
- Experience with cloud platforms (AWS and/or GCP) and cloud-native DevOps practices.
- Familiarity with data cataloging and lineage tools.
- Exposure to container security, policy management, and data governance tools.
- Background in data modeling, SQL optimization, or data warehousing concepts is a plus.
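A minimal observability sketch in the spirit of this role: exporting pipeline-health metrics with the prometheus_client library so Prometheus can scrape them and Grafana can chart them. The metric names and the pipeline loop are hypothetical:

```python
import random
import time

from prometheus_client import Counter, Gauge, start_http_server

# Hypothetical pipeline-health metrics.
RUNS = Counter("pipeline_runs_total", "Completed pipeline runs", ["status"])
LAG = Gauge("pipeline_lag_seconds", "Seconds since the last successful run")


def run_pipeline() -> bool:
    time.sleep(0.1)                # stand-in for real pipeline work
    return random.random() > 0.1   # ~90% success rate


if __name__ == "__main__":
    start_http_server(8000)  # metrics exposed at :8000/metrics
    last_success = time.time()
    while True:
        ok = run_pipeline()
        RUNS.labels(status="success" if ok else "failure").inc()
        if ok:
            last_success = time.time()
        LAG.set(time.time() - last_success)
        time.sleep(5)
```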
Posted 1 week ago
6.0 - 8.0 years
8 - 10 Lacs
Hyderabad
Work from Office
Position: Sr. BI Developer
Work Location: Hyderabad
Mode of work: Hybrid
Experience: 6 to 8 years

Summary:
We are seeking a skilled Senior Spotfire Developer with 6-8 years of experience to join our analytics team. The ideal candidate will bring expertise in TIBCO Spotfire and a strong foundation in business intelligence and data visualization. This role involves developing, optimizing, and supporting interactive dashboards and reports that provide key insights to support data-driven decision-making. You will work closely with cross-functional teams, including data analysts, engineers, and business stakeholders, to deliver impactful solutions that meet business objectives.

Key Responsibilities:
- Spotfire Development and Customization: design, develop, and deploy Spotfire applications, dashboards, and reports to support various business units in data-driven initiatives.
- Requirement Analysis: collaborate with stakeholders to gather and understand requirements, translating them into technical solutions within Spotfire.
- Data Integration and Transformation: use data blending and transformation techniques to prepare data for analysis in Spotfire, ensuring quality and integrity.
- Performance Optimization: implement best practices for data loading, caching, and optimization to ensure responsive and efficient dashboards.
- Customization and Scripting: enhance Spotfire functionality through scripting (IronPython, JavaScript) and integrate R or Python when needed for advanced analytics (see the sketch after this posting).
- Documentation and Support: maintain documentation for dashboards and reports and provide support to users, addressing any technical or functional issues.

Qualifications:
- Education: Bachelor's degree in Computer Science, Data Analytics, Information Systems, or a related field.
- Experience: 4-6 years in BI development, with 4+ years specifically in TIBCO Spotfire.

Technical Skills:
- Proficiency in TIBCO Spotfire, including visualization techniques and dashboard configuration.
- Strong SQL skills and experience with data modeling and data blending.
- Scripting experience in IronPython and/or JavaScript; knowledge of R or Python for advanced Spotfire functionalities.
- Familiarity with data integration tools such as Informatica, Alteryx, or equivalent.
- Analytical Skills: ability to interpret complex data sets and create visually appealing, user-friendly dashboards.
- Soft Skills: strong communication and interpersonal skills with the ability to work collaboratively in a team setting.

Preferred Skills:
- Experience with other BI tools (e.g., Power BI, Tableau) is a plus.
- Understanding of machine learning and predictive analytics in a BI context.
- Exposure to cloud platforms like AWS, Azure, or Google Cloud.
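A small example of the kind of IronPython customization Spotfire supports: a button script that updates a document property, which visualizations or data-limiting expressions can then reference. The property and page names are hypothetical, and `Document` is the context object Spotfire passes implicitly to scripts:

```python
# IronPython script attached to, e.g., a text-area button in Spotfire.
# 'Document' is provided implicitly by the Spotfire script context.

# Update a (hypothetical) document property that dashboards reference,
# e.g. in a calculated column or a data-limiting expression.
Document.Properties["SelectedRegion"] = "EMEA"

# Optionally jump the user to the page that consumes the property.
for page in Document.Pages:
    if page.Title == "Regional Overview":   # hypothetical page name
        Document.ActivePageReference = page
        break
```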
Posted 1 week ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
About Us
Observe.AI is transforming customer service with AI agents that speak, think, and act like your best human agents, helping enterprises automate routine customer calls and workflows, support agents in real time, and uncover powerful insights from every interaction. With Observe.AI, businesses boost automation, deliver faster, more consistent 24/7 service, and build stronger customer loyalty. Trusted by brands like Accolade, Prudential, Concentrix, Cox Automotive, and Included Health, Observe.AI is redefining how businesses connect with customers, driving better experiences and lasting relationships at every touchpoint.

The Opportunity
We are looking for a Senior Data Engineer with strong hands-on experience in building scalable data pipelines and real-time processing systems. You will be part of a high-impact team focused on modernizing our data architecture, enabling self-serve analytics, and delivering high-quality data products. This role is ideal for engineers who love solving complex data challenges, have a growth mindset, and are excited to work on both batch and streaming systems.

What you'll be doing:
- Build and maintain real-time and batch data pipelines using tools like Kafka, Spark, and Airflow (see the sketch after this posting).
- Contribute to the development of a scalable lakehouse architecture using modern data formats such as Delta Lake, Hudi, or Iceberg.
- Optimize data ingestion and transformation workflows across cloud platforms (AWS, GCP, or Azure).
- Collaborate with Analytics and Product teams to deliver data models, marts, and dashboards that drive business insights.
- Support data quality, lineage, and observability using modern practices and tools.
- Participate in Agile processes (sprint planning, reviews) and contribute to team knowledge sharing and documentation.
- Contribute to building data products for inbound (ingestion) and outbound (consumption) use cases across the organization.

Who you are:
- 5-8 years of experience in data engineering or backend systems with a focus on large-scale data pipelines.
- Hands-on experience with streaming platforms (e.g., Kafka) and distributed processing tools (e.g., Spark or Flink).
- Working knowledge of lakehouse formats (Delta/Hudi/Iceberg) and columnar storage like Parquet.
- Proficient in building pipelines on AWS, GCP, or Azure using managed services and cloud-native tools.
- Experience with Airflow or similar orchestration platforms.
- Strong in data modeling and optimizing data warehouses like Redshift, BigQuery, or Snowflake.
- Exposure to real-time OLAP tools like ClickHouse, Druid, or Pinot.
- Familiarity with observability tools such as Grafana, Prometheus, or Loki.
- Some experience integrating data with MLOps tools like MLflow, SageMaker, or Kubeflow.
- Ability to work with Agile practices using JIRA and Confluence and to participate in engineering ceremonies.

Compensation, Benefits and Perks
- Excellent medical insurance options and free online doctor consultations
- Yearly privilege and sick leaves as per the Karnataka S&E Act
- Generous holidays (national and festive), recognition, and parental leave policies
- Learning & Development fund to support your continuous learning journey and professional development
- Fun events to build culture across the organization
- Flexible benefit plans for tax exemptions (i.e., meal card, PF, etc.)

Our Commitment to Inclusion and Belonging
Observe.AI is an Equal Employment Opportunity employer that proudly pursues and hires a diverse workforce.
Observe.AI does not make hiring or employment decisions on the basis of race, color, religion or religious belief, ethnic or national origin, nationality, sex, gender, gender identity, sexual orientation, disability, age, military or veteran status, or any other basis protected by applicable local, state, or federal laws or prohibited by Company policy. Observe.AI also strives for a healthy and safe workplace and strictly prohibits harassment of any kind. We welcome all people. We celebrate diversity of all kinds and are committed to creating an inclusive culture built on a foundation of respect for all individuals. We seek to hire, develop, and retain talented people from all backgrounds. Individuals from non-traditional backgrounds and historically marginalized or underrepresented groups are strongly encouraged to apply. If you are ambitious, make an impact wherever you go, and you're ready to shape the future of Observe.AI, we encourage you to apply. For more information, visit www.observe.ai.
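A minimal sketch of the Kafka-to-lakehouse streaming pattern in the "What you'll be doing" list above, using PySpark Structured Streaming with a Delta sink. The broker address, topic, and paths are hypothetical, and it assumes the Delta Lake package is on the Spark classpath:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events_ingest").getOrCreate()

# Read a (hypothetical) Kafka topic as a streaming DataFrame.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .option("startingOffsets", "latest")
    .load()
    .selectExpr(
        "CAST(key AS STRING) AS key",
        "CAST(value AS STRING) AS payload",
    )
)

# Append raw records to a bronze Delta table, with checkpointing
# so the stream can recover exactly where it left off.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/chk/events")
    .outputMode("append")
    .start("/lake/bronze/events")
)
query.awaitTermination()
```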
Posted 1 week ago
4.0 - 8.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Manager, Data Engineer
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Our Technology centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of the company's IT operating model, Tech centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each tech center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers.

Role Overview:
For the Data Engineer role, we are looking for a professional with experience in designing, developing, and maintaining data pipelines. We intend to make data reliable, governed, secure, and available for analytics within the organization. As part of a team, this role will be responsible for data management with a broad range of activities, like data ingestion to cloud data lakes and warehouses, quality control, metadata management, and orchestration of machine learning models. We are also forward looking and plan to bring innovations like data mesh and data fabric into our ecosystem of tools and processes.

What will you do in this role:
- Design, develop, and maintain data pipelines to extract data from a variety of sources and populate the data lake and data warehouse.
- Develop the various data transformation rules and data modeling capabilities.
- Collaborate with Data Analysts, Data Scientists, and Machine Learning Engineers to identify and transform data for ingestion, exploration, and modeling.
- Work with the data governance team, implement data quality checks, and maintain data catalogs.
- Use orchestration, logging, and monitoring tools to build resilient pipelines.
- Use test-driven development methodology when building ELT/ETL pipelines.
- Develop pipelines to ingest data into cloud data warehouses.
- Analyze data using SQL.
- Use serverless AWS services like Glue, Lambda, and Step Functions (see the sketch after this posting).
- Use Terraform code to deploy on AWS.
- Containerize Python code using Docker.
- Use Git for version control and understand various branching strategies.
- Build pipelines to work with large datasets using PySpark.
- Develop proofs of concept using Jupyter Notebooks.
- Work as part of an agile team.
- Create technical documentation as needed.

What should you have:
- 4-8 years of relevant experience
- Good experience with AWS services like S3, ECS, Fargate, Glue, Step Functions, CloudWatch, Lambda, and EMR
- Any AWS developer or architect certification
- Agile development methodology
- SQL; proficient in Python and PySpark
- Good with Git, Docker, and Terraform
- Ability to work in cross-functional teams
- Bachelor's degree or equivalent experience in a relevant field such as Engineering (preferably computer engineering) or Computer Science

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities.
We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

What we look for:
Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us and start making your impact today. #HYDIT2025

Employee Status: Regular
Flexible Work Arrangements: Hybrid

Required Skills: Agile Application Development, Agile Methodology, Branching Strategy, Business, Business Intelligence (BI), Business Partnerships, Computer Science, Database Administration, Data Engineering, Data Management, Data Modeling, Data Pipelines, Data Quality Control, Data Visualization, Design Applications, Digital Transformation, Information Management, Information Technology Operations, IT Operation, Management Process, Social Collaboration, Software Development, Software Development Life Cycle (SDLC), SQL Data Analysis

Job Posting End Date: 07/31/2025
*A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
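A minimal sketch of the serverless pattern named above: a Lambda handler that starts a Glue job via boto3 whenever a new object lands in S3. The Glue job name and argument key are hypothetical:

```python
import boto3

glue = boto3.client("glue")


def handler(event, context):
    """S3-triggered Lambda: start a (hypothetical) Glue job per new object."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        response = glue.start_job_run(
            JobName="ingest_raw_to_lake",            # hypothetical job name
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        print(f"started Glue run {response['JobRunId']} "
              f"for s3://{bucket}/{key}")
```

For longer multi-step flows, the same trigger would typically start a Step Functions state machine instead, with the Glue job as one state.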
Posted 1 week ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad
Work from Office
":" As a Senior Software Engineer, you will play a key role in designing, developing, and maintaining complex software systems. You will drive technical initiatives, and mentor junior engineers. Your expertise will be instrumental in ensuring high-quality, scalable, and performant solutions that align with the companys architectural goals and business needs. You will contribute to technical strategy, architectural decisions, and process improvements, while fostering a culture of innovation, collaboration, and engineering excellence. Key Outcomes/Objectives: Design and implement robust, scalable, and high-performance software architectures. Lead and mentor junior engineers, fostering a culture of technical excellence and continuous learning. Ensure code quality, adherence to coding standards, and best practices across the team, acting as a champion for engineering rigor. Drive the resolution of complex technical challenges and contribute to the development of innovative solutions, leading the way in overcoming technical obstacles. Contribute to the development of technical roadmaps and strategic plans, influencing the future direction of the product/sub-product. Core Responsibilities: Technical Leadership and Architecture: Design and implement complex software components and features with a focus on scalability, performance, and maintainability. Contribute to sub-product or feature-level architectural decisions, ensuring alignment with overall system architecture. Lead technical discussions within the team, influencing design choices and engineering practices. Identify and mitigate technical risks early in the development lifecycle. Evaluate and recommend new technologies, frameworks, and tools to improve development efficiency. Code Development and Quality Assurance: Write clean, efficient, and well-documented code that adheres to coding standards and best practices . Lead code reviews and ensure adherence to quality standards across the team . Develop and maintain automated tests (unit, integration, and end-to-end) to improve software reliability. Identify and resolve performance bottlenecks, scalability issues, and technical debt. Mentorship and Team Collaboration: Mentor and guide junior engineers in technical development, best practices, and problem-solving. Lead technical discussions and knowledge-sharing sessions within the team, fostering a culture of continuous learning and collaboration. Be an active contributor in your Community of Practice: You play an active role in the OVO Engineering community on all things related to engineering, sharing practices and offering firsthand experience to the wider community Project Execution and Agile Practices: Participate in sprint planning, backlog refinement, and daily stand-ups to ensure timely and efficient delivery. Break down complex projects into well-defined, executable tasks and contribute to sprint commitments. Monitor delivery progress and technical dependencies, proactively resolving potential blockers. Contribute to technical roadmaps and long-term engineering strategies for sub-products and features. Documentation and Knowledge Sharing: Create and maintain technical documentation, including architecture diagrams, design documents, and API specifications. Share knowledge and expertise through presentations, workshops, and documentation. Contribute to the development of internal tools and libraries. 
Community of Practice:
Contribute to the appropriate Community of Practice (CoP) for your role by leading discussions, sharing practices, offering firsthand experience to the wider community, and engaging in knowledge exchange and cross-pollination to further your craft. Create content and individually contribute to the stated successful outcomes for this CoP.

Qualifications:
Education/Experience:
- Bachelor's or Master's degree in a technical field or equivalent qualifications, or substantial industry experience demonstrating comparable expertise.
- 5-8 years of hands-on software development experience with a strong track record of delivering high-quality code.
- Committed to technical excellence and clean code, with the ability to work in Agile, Lean software teams.
- Proven experience in designing and implementing complex software architectures.
- Experience leading technical initiatives and mentoring junior engineers.
- Ability to thrive in high-ownership environments.

Skills:
- Strong proficiency in multiple programming languages, including Node.js, Python, TypeScript, JavaScript, React Native, and React.js, with a focus on building and maintaining microservices-based architectures. Equivalent experience with related technologies and frameworks will also be considered.
- Deep understanding of software architecture, design patterns, and distributed systems.
- Experience with cloud platforms such as GCP and AWS (Azure is not preferred), along with expertise in containerization technologies like Docker and Kubernetes.
- Strong understanding of database systems and data modeling.
- Experience with CI/CD pipelines and automation tools.
- Strong leadership and mentorship skills.
- Excellent communication and interpersonal skills.
- Strong problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Strong attention to detail and a commitment to quality.
- Ability to learn and adapt to new technologies quickly.
- Strategic thinking and planning skills.
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Position: Guidewire Integration QA
Experience: 5+ years

Key Responsibilities:
- Perform functional, integration, and end-to-end testing for Guidewire InsuranceSuite (PolicyCenter, BillingCenter, ClaimCenter) integrations.
- Validate APIs, web services, and data flows across various upstream/downstream systems (see the sketch after this posting).
- Develop and maintain test plans, test cases, and test scripts for integration scenarios.
- Work closely with developers and business analysts to ensure high-quality deliverables.
- Conduct regression testing, defect management, and root cause analysis.

Must-Have Skills:
- Strong knowledge of Guidewire InsuranceSuite (PC/BC/CC) with a focus on integration points.
- Hands-on experience with SOAP/REST API testing (Postman, SoapUI, or similar tools).
- Experience with SQL queries and database validation.
- Familiarity with test automation frameworks (e.g., Selenium, TestNG, or similar).
- Understanding of insurance domain processes.

Good to Have:
- Exposure to CI/CD pipelines (Jenkins, Git).
- Knowledge of Gosu scripting and the Guidewire data model.
- Experience with performance and security testing tools.
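A minimal REST validation sketch of the kind this role automates, written with Python's requests library in pytest style; the endpoint URL, resource path, and field names are hypothetical stand-ins for an actual integration contract:

```python
import requests

BASE_URL = "https://gw-test.example.com/api"  # hypothetical test endpoint


def test_policy_lookup_returns_expected_fields():
    resp = requests.get(f"{BASE_URL}/policies/POL-1001", timeout=10)
    assert resp.status_code == 200

    body = resp.json()
    # Field names are illustrative of a contract check between systems.
    for field in ("policyNumber", "status", "effectiveDate"):
        assert field in body, f"missing field: {field}"
    assert body["policyNumber"] == "POL-1001"
```

In a real suite the same assertions would typically be cross-checked against a SQL query on the downstream database to confirm the data flow end to end.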
Posted 1 week ago
2.0 - 7.0 years
4 - 9 Lacs
Gurugram
Work from Office
1. Ensuring rake availability for loading in coordination with LSPs and Indian Railways, and achievement of Railway dispatches as per the monthly plan.
2. Coordination with the MSIL planning team for timely planning of rakes from respective plants.
3. Ensuring timely invoicing/retrieval at the plant in coordination with LSP/SND.
4. Ensuring on-time departure and timely arrival of rakes at destinations, thereby maintaining overall standard transit time.
5. Optimising rake TAT (turnaround time) by reducing arrival-to-placement, loading, and drawn-out time.
6. Coordination with LSPs and alignment of fleet for managing first-mile and last-mile dispatch as per MSIL norms.
7. Coordination with all stakeholders within the MSIL plant and Railways for resolution of issues (electricity failure/OHE failure, derailments, P-way, damages, etc.).
8. Preparation of business plans and strategies.
9. Preparation of MIS.
10. Coordination with teams at TVPs and the port for Railway dispatches.
11. RFQs and rate negotiation for new and existing destinations.
12. Railway liaisoning and overall coordination within MSIL and with LSPs.

Strong knowledge of Channel Management - Dealers & Distribution. Proficiency in MS Excel and data modelling. Knowledge of Power BI is preferred. Data analysis and data visualization with the ability to handle large data sets. Strong interpersonal skills and a collaborative approach. Key Account Management skills also preferred.
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Chennai
Work from Office
Job Title: ServiceNow Developer
Location: Offshore, Remote
Experience Required: 5+ years
Job Type: Full Time

Job Summary:
We are looking for a skilled and experienced ServiceNow Developer to join our team. The ideal candidate will have strong scripting capabilities and be well-versed in end-to-end ServiceNow development. You will play a critical role in designing, configuring, and customizing the platform to meet business requirements, particularly across ITSM modules and beyond. Candidates should be skilled in all ServiceNow development (not specific to ITAM), with extensive experience in ITSM.

Key Responsibilities:
- Develop and customize core applications using ServiceNow platform tools and best practices
- Write clean, scalable server-side and client-side scripts using JavaScript and Glide APIs
- Implement workflows, business rules, UI actions, client scripts, and scheduled jobs
- Integrate ServiceNow with third-party applications and systems
- Participate in design sessions, provide architectural guidance, and ensure scalability of solutions
- Troubleshoot and resolve application issues and defects
- Maintain documentation for technical designs and code changes
- Collaborate with stakeholders, including ITSM process owners, to gather requirements and deliver tailored solutions

Required Skills & Experience:
- 4+ years of hands-on experience with ServiceNow development
- Strong scripting experience (JavaScript, Glide, Script Includes, Business Rules, etc.)
- Expertise in ServiceNow ITSM modules (Incident, Problem, Change, CMDB, Knowledge)
- Solid understanding of ServiceNow architecture and data model
- Experience with Flow Designer, IntegrationHub, and REST/SOAP integrations
- Knowledge of Agile/Scrum methodologies
- Familiarity with Service Portal and custom widget development is a plus
- ServiceNow certifications (CSA, CAD) are desirable

Nice to Have:
- Experience with Scoped Applications and App Engine Studio
- Knowledge of ITOM, SecOps, or HRSD modules
- Exposure to CI/CD pipelines with ServiceNow
Posted 1 week ago
7.0 - 12.0 years
9 - 14 Lacs
Noida, Bengaluru
Work from Office
Job Summary:
We are seeking a highly skilled Technical Project Manager with strong expertise in Power BI, Azure, and Agile methodologies to lead and deliver data-driven projects. The ideal candidate will have a strong technical background combined with excellent leadership, coordination, and communication skills to drive successful project outcomes.

Key Responsibilities:
- Manage end-to-end delivery of BI and cloud-based projects across multiple teams.
- Work closely with business stakeholders to gather requirements and define project scope, timelines, and deliverables.
- Lead Agile ceremonies: sprint planning, daily stand-ups, retrospectives, and sprint reviews.
- Oversee development and deployment of Power BI dashboards and reports, ensuring high-quality data visualization and insights.
- Coordinate with Azure engineering teams for data pipelines, integration, and platform readiness.
- Identify project risks and develop mitigation strategies.
- Ensure projects are delivered on time, within scope, and within budget.
- Act as the primary point of contact for technical teams, business users, and leadership.

Technical Skills Required:
- Power BI: data modeling, DAX, Power Query, report development, performance optimization.
- Azure: Azure Data Factory, Azure Synapse, Azure SQL Database, Azure DevOps.
- Experience with cloud data architecture and integration.
- Familiarity with CI/CD pipelines and release management (Azure DevOps preferred).

Project Management Skills:
- 7+ years of experience managing technical projects.
- Strong expertise in Agile frameworks (Scrum, Kanban, SAFe).
- Experience managing cross-functional technical teams.
- Excellent documentation, reporting, and stakeholder management skills.
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Designation: Power BI Lead / Sr. Developer
Work Location: Hyderabad (Hybrid)
Experience: 5 to 10 years

Responsibilities:
- Understand business requirements and translate them into technical specifications for Power BI reports and dashboards.
- Design, develop, and publish interactive dashboards using Power BI (Power BI Desktop & Power BI Service).
- Integrate data from various sources, including SQL Server, Excel, SharePoint, and cloud-based sources (Azure, etc.).
- Build and optimize data models (star/snowflake schema) and DAX queries for performance and usability.
- Develop and manage ETL processes using Power Query, Dataflows, or Azure Data Factory.
- Implement row-level security (RLS) and access controls within Power BI reports.
- Perform data validation and ensure accuracy of reports and dashboards.
- Collaborate with stakeholders, business analysts, and data engineers to ensure report alignment with business goals.
- Monitor Power BI service performance and schedule dataset refreshes.
- Stay up to date with Power BI updates, best practices, and industry trends.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5+ years of hands-on experience in Power BI report and dashboard development.
- Proficiency in DAX, Power Query (M language), and data modeling.
- Strong SQL skills; ability to write complex queries and optimize them.
- Experience in integrating Power BI with cloud data sources (e.g., Azure SQL, Data Lake).
- Familiarity with Power BI Gateway, Power BI Report Server (optional), and workspace management.
- Solid understanding of data warehousing concepts and relational databases.
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder management abilities.

Preferred Qualifications:
- Microsoft Power BI certifications.
- Exposure to other BI tools like Tableau or QlikView (optional).
- Experience working in Agile or Scrum teams.
Posted 1 week ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Work from Office
Job Summary
We are seeking a skilled Guidewire BillingCenter Configuration Developer to join our dynamic team working on enterprise-scale insurance platforms. The ideal candidate will have strong hands-on experience with BillingCenter configuration, business rules, and UI customization using Gosu.

Key Responsibilities
- Configure and customize Guidewire BillingCenter components using Gosu
- Develop and maintain business rules, workflows, and UI elements
- Integrate BillingCenter with internal and third-party systems via SOAP/REST APIs
- Collaborate with business analysts, architects, and QA teams to deliver high-quality solutions
- Participate in Agile/Scrum ceremonies and contribute to sprint planning and delivery

Must-Have Skills
- 3+ years of experience with Guidewire BillingCenter (version 10.x preferred)
- Proficiency in the Gosu programming language
- Strong experience in BillingCenter configuration: PCFs, data model, business rules, plugins
- Good understanding of Guidewire product architecture and integration frameworks
- Hands-on experience with SOAP/REST APIs, GUnit, Jenkins, and Git

Nice-to-Have Skills
- Experience with Guidewire Cloud Platform (GWCP)
- Exposure to DevOps and CI/CD tools (Jenkins, Docker, Kubernetes)
- Familiarity with other Guidewire modules (PolicyCenter/ClaimCenter)
- Insurance domain knowledge (especially billing processes)
Posted 1 week ago
10.0 - 17.0 years
35 - 40 Lacs
Pune
Work from Office
Job Overview:
As a Sr. Specialist - Software Development, you will develop new product features/modules using best practices and maintain existing systems. All our products are solutions for the airline, transport, and travel industry, built using a range of technologies.

Responsibilities:
- Translate processes and enhancement specifications into programs.
- Develop and refine error-free code within agreed timescales using development techniques, tools, methods, and languages, with the aim of optimizing operational efficiency.
- Evaluate changes and perform impact analysis.
- Work with functional staff to establish and clarify requirements.
- Investigate reported faults in operational code to determine changes and approaches to the code for promotion and replacement, conforming to established procedures.
- Design and prepare unit testing plan strategies and write test scripts to validate all new software development and enhancements.
- Take ownership of the test and implementation phases of projects.

Qualifications:
- Bachelor's degree in computer science, engineering, or a related field (or equivalent work experience).
- 8+ years of experience in software development.
- Strong problem-solving and analytical skills.
- ASP.NET 4.0, C#, .NET Framework & MVC 3.0, Entity Framework, SQL, AWS Cloud, Node.js.
- Strong knowledge and practical experience with AWS services for backend development and deployment.
- Experience in implementing and maintaining Test-Driven Development (TDD) practices.
- Familiarity with database technologies (e.g., SQL, NoSQL databases) and data modeling.
- Understanding of RESTful API design principles and best practices.
- Solid understanding of software development methodologies and practices.
- Strong communication and collaboration abilities in a team environment.

Preferred Skills:
- Experience with serverless architectures (AWS Lambda, etc.).
- Familiarity with CI/CD pipelines and related tools.
- XML, XSLT, REST APIs, LINQ.
- Knowledge of performance optimization techniques for backend systems.
- Understanding of security principles and best practices in backend development.
Posted 1 week ago
4.0 - 9.0 years
6 - 11 Lacs
Chennai
Work from Office
At ZoomInfo, we encourage creativity, value innovation, demand teamwork, expect accountability and cherish results. We value your take charge, take initiative, get stuff done attitude and will help you unlock your growth potential. One great choice can change everything. Thrive with us at ZoomInfo. ZoomInfo is seeking a Business Intelligence Analyst III to play a pivotal role in scaling our product analytics operations and empowering teams with the tools and insights they need to make fast, data-driven decisions. Based in India, this role is at the center of enabling world-class product intelligence, ensuring our tools, dashboards, and systems run seamlessly and are fully leveraged by product managers across the organization. You ll work closely with Product Managers, Product Operations, Data Engineering, and other stakeholders to maintain, improve, and evangelize analytics tooling. You ll also be responsible for increasing tool adoption and proficiency across the product organization, ensuring every team has the skills and access they need to extract value from our data stack. This role is ideal for someone who loves blending technical expertise with cross-functional enablement, and is excited about making product analytics more accessible, scalable, and impactful. You will serve as the primary expert and champion for analytics tools like Amplitude and Tableau, ensuring every team can self-serve insights and make confident decisions. What Youll Do: Lead Amplitude administration and governance across the organization, including managing user permissions, monitoring and optimizing event and property usage, maintaining clean and scalable instrumentation, curating key dashboards and cohorts, and driving best practices to ensure long-term data hygiene and analytics consistency. Drive product analytics enablement : lead training and upskilling efforts for Product Managers and other stakeholders in tools like Amplitude and Tableau, introducing advanced features (e.g., screen recordings, heatmaps, and in-platform guides). Maintain and optimize Tableau dashboards : ensure business-critical dashboards are accurate, performant, and relevant to evolving product and business needs. Own Tableau online administration , including project structure, permissions, consistent naming conventions, and documentation. Maintain and optimize business-critical dashboards to ensure they are accurate, performant, and aligned with evolving product and business needs Establish and maintain best practices in tool usage and data accessibility, partnering with Data Engineering and Product Operations to improve data literacy across the organization. Document and operationalize tooling workflows for product analytics processes, such as event naming conventions, funnel tracking, retention metrics, and user segmentation. Monitor analytics tool adoption and effectiveness : gather feedback, identify gaps, and implement improvements to ensure teams are getting the most out of our analytics investments. Act as a bridge between technical and non-technical teams : translate business needs into technical requirements and vice versa, ensuring tool functionality aligns with real-world usage. Proactively recommend tooling enhancements and identify opportunities to scale self-service analytics capabilities across product teams. 
Automate recurring reports and enable a self-service analytics environment: streamline regular reporting processes through automation and empower teams to independently explore insights and access key metrics via intuitive, reusable dashboards.
Support ad-hoc analytical requests from Product and cross-functional teams: translate business questions into structured analyses and deliver timely, actionable insights to inform decisions.

What You Bring:
Bachelor's degree in Analytics, Computer Science, Information Systems, or a related field.
4+ years of experience in business intelligence, product analytics, or data operations within a SaaS or tech environment.
At least 1-2 years of experience with Amplitude and at least 2 years with Tableau, including administration, dashboard development, and stakeholder training.
Strong SQL skills with at least 4 years of hands-on experience, including query optimization and a solid understanding of data modeling concepts.
Proven ability to lead enablement programs and deliver effective training content for both technical and non-technical audiences.
Strong communication and collaboration skills to work effectively with cross-functional stakeholders.
Experience managing tooling documentation, data taxonomies, or analytics governance frameworks (preferred).
A passion for helping teams work smarter and more effectively with data.
Strong project management skills to lead multiple initiatives in a fast-paced, data-driven environment.

Bonus Experience:
Experience with dbt, Snowflake, and ETL tools
Knowledge of product lifecycle metrics or product experimentation (e.g., A/B testing platforms)
Data quality monitoring tools (e.g., Monte Carlo, Great Expectations)
Python knowledge and experience
Familiarity with tools like Looker, Mixpanel, Google Analytics, or other BI platforms
Experience working in a distributed or global team environment

About us: ZoomInfo (NASDAQ: GTM) is the Go-To-Market Intelligence Platform that empowers businesses to grow faster with AI-ready insights, trusted data, and advanced automation. Its solutions provide more than 35,000 companies worldwide with a complete view of their customers, making every seller their best seller.
Posted 1 week ago
3.0 - 15.0 years
5 - 17 Lacs
Pune
Work from Office
Skills: Power Apps + Power Automate + Dataverse + any combination of (Copilot Studio / Azure AI Foundry / Power BI)
Location: Pan India
Experience Range: 3 to 15 years

Key Responsibilities:
Lead the development of custom applications using Power Apps (Canvas and Model-Driven Apps).
Design and implement complex UI in Canvas Apps and build critical workflows and approval processes using Power Automate.
Work extensively with Microsoft Dataverse for data modeling and integration.
Develop and integrate custom connectors and use APIs within Power Apps.
Customize views and forms using JavaScript as per business requirements.
Design and develop PCF controls for both Canvas and Model-Driven Apps.
Create and manage Power BI reports and dashboards to support business insights.
Work with Copilot Studio (formerly Power Virtual Agents) to build intelligent chatbots.
Contribute to Power Pages development for external-facing applications.
Implement CI/CD pipelines using Azure DevOps for streamlined development and deployment.
Collaborate effectively with cross-functional teams while being able to work independently when required.
Communicate clearly with technical and non-technical stakeholders.

Required Skills and Experience:
Minimum of 3 years of experience working on Microsoft Power Platform.
Proven involvement in at least 3 full-cycle implementation projects in a Senior Developer role.
Strong proficiency in Power Apps, Power Automate, Dataverse, and API integrations.
Good understanding of JavaScript and custom development for platform extensibility.
Familiarity with Power BI, Copilot Studio, Power Pages, and Azure DevOps.

Key Attributes:
Strong problem-solving skills and attention to detail.
Excellent verbal and written communication skills.
Ability to work autonomously and as part of a collaborative team.
Posted 1 week ago
6.0 - 9.0 years
8 - 11 Lacs
Hyderabad
Work from Office
Company Overview:
We are looking for an exceptionally talented professional to join one of our cross-functional product teams in our Hyderabad/Bangalore/Gurgaon office in an individual contributor role. This product team is responsible for building the investor allocation product offering, PerformA™. This position offers the opportunity to define and design the next generation of products on our platform, which is used by some of the most sophisticated hedge funds in the world, and to collaborate with some of the brightest minds in the industry.

What you'll do:
Work closely with engineers/architects to translate the product specification into a design, and then into the product itself
Prepare comprehensive business test cases/beds to aid the engineering process
Rigorously and continuously evaluate the progress of each product/feature in flight by leveraging the created test cases/beds, and ensure compliance with the product/feature specification and the vision
Prepare prototypes using Python and AI
Track and question risks/assumptions
Proactively escalate issues and mitigate execution risks

What you'll need:
6 to 9 years of experience working in the front, middle, and/or back-office space, with a minimum of 3 years of Fund Accounting/Investor Allocation experience
Technical skills needed:
Familiarity with all phases of the Software Development Life Cycle
Strong grasp of programming fundamentals in a high-level language like Java, Python, or JavaScript for simple scripting, quick prototyping, or understanding code reviews; deep development expertise is not required, but enough to engage meaningfully with engineers
Knowledge of core concepts like APIs, microservices, and basic programming paradigms
Working knowledge of databases, data modeling, and the basics of SQL
Basic understanding of cloud services (e.g., AWS, Azure) and how they impact scalability and deployment
Ability to evaluate technical trade-offs in product features
Ability to quickly learn and adapt to new systems, platforms, or tools as required by the project
Exceptional verbal and written communication skills
Critical thinking and the ability to articulate standpoints/ideas and influence stakeholders
A graduate degree in software engineering
Advanced knowledge in the field of Fund Accounting and Investor Allocations will be an added advantage
Posted 1 week ago