3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
About The Role
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Snowflake Schema
Minimum 3 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with teams to ensure seamless integration and functionality of applications.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and implement software solutions to meet business requirements.
- Collaborate with cross-functional teams to ensure application functionality.
- Conduct code reviews and provide feedback for continuous improvement.
- Troubleshoot and resolve application issues in a timely manner.
- Stay updated on industry trends and best practices for application development.
Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of Snowflake Schema.
- Experience in designing and implementing data warehouse solutions.
- Knowledge of ETL processes and data modeling techniques.
- Familiarity with SQL and database management systems.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education
Posted 3 weeks ago
8.0 - 10.0 years
25 - 30 Lacs
Noida
Work from Office
You have an opportunity to personally thrive, make a difference, and be part of a culture where individuality is noticed and valued every day.
Job Overview
We are looking for a Data Engineer who will be part of our Analytics Practice and will be expected to actively work in a multi-disciplinary, fast-paced environment. This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project; its primary responsibility is the acquisition, transformation, loading, and processing of data from a multitude of disparate data sources, including structured and unstructured data, for advanced analytics and machine learning in a big data environment.
Responsibilities:
- Engineer a modern data pipeline to collect, organize, and process data from disparate sources.
- Perform data management tasks, such as conducting data profiling, assessing data quality, and writing SQL queries to extract and integrate data.
- Develop efficient data collection systems and sound strategies for getting quality data from different sources.
- Consume and analyze data from the data pool to support inference, prediction, and recommendation of actionable insights that support business growth.
- Design and develop ETL processes using tools and scripting. Troubleshoot and debug ETL processes. Perform tuning and optimization of the ETL processes.
- Provide support to new or existing applications while recommending best practices and leading projects to implement new functionality.
- Collaborate in design reviews and code reviews to ensure standards are met. Recommend new standards for visualizations.
- Learn and develop new ETL techniques as required to keep up with contemporary technologies.
- Review the solution requirements and architecture to ensure appropriate technology selection, efficient use of resources, and integration of multiple systems and technologies.
- Support presentations to customers and partners.
- Advise on new technology trends and possible adoption to maintain competitive advantage.
Experience Needed:
- 8+ years of related experience is required.
- A BS or Master's degree in Computer Science or a related technical discipline is required.
- ETL experience with data integration to support data marts, extracts, and reporting.
- Experience connecting to varied data sources.
- Excellent SQL coding experience with performance optimization for data queries.
- Understands different data models such as normalized, de-normalized, star, and snowflake models. Has worked with transactional, temporal, time series, and structured and unstructured data.
- Experience with Azure Data Factory and Azure Synapse Analytics.
- Has worked in big data environments, cloud data stores, different RDBMSs, and OLAP solutions.
- Experience in cloud-based ETL development processes.
- Experience in deployment and maintenance of ETL jobs.
- Familiar with the principles and practices involved in the development and maintenance of software solutions and architectures and in service delivery.
- Has a strong technical background and remains evergreen with technology and industry developments.
- At least 3 years of demonstrated success in software engineering, release engineering, and/or configuration management.
- Highly skilled in scripting languages like PowerShell.
- Substantial experience in the implementation and execution of CI/CD processes.
Additional Requirements
- Demonstrated ability to successfully complete multiple, complex technical projects.
- Prior experience with application delivery using an onshore/offshore model.
- Experience with business processes across multiple master data domains in a services-based company.
- Demonstrates a rational and organized approach to the tasks undertaken and an awareness of the need to achieve quality.
- Demonstrates high standards of professional behavior in dealings with clients, colleagues, and staff.
- Able to make sound and far-reaching decisions alone on major issues and to take full responsibility for them on a technical basis.
- Strong written communication skills; effective and persuasive in both written and oral communication.
- Experience with gathering end-user requirements and writing technical documentation.
- Time management and multitasking skills to effectively meet deadlines under time-to-market pressure.
Posted 3 weeks ago
14.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
- A passion for high-quality software engineering and technology
- An interest in working on large-scale data challenges across different technologies
- Strong educational credentials
- Drive and ambition to achieve, thrive, and succeed
- Enjoys solving challenges in the connectivity space, which mostly involves desktop system software
Our engineering managers are both technically gifted and work well with others to solve difficult problems. Technologies frequently change, and the successful candidate must have the ability to rapidly master new software languages and technologies.
Tasks
- Analyzes, designs, develops, and documents commercial software products
- Works on the research, development, testing, and maintenance of one or more ISW products
- Demonstrates high aptitude in a variety of software engineering concepts, practices, and procedures
- Relies on extensive experience and judgment to plan and accomplish goals
- Takes ownership of development work through to delivery, including fully automated testing of components
- Provides technical support to project team members; participates in design and code reviews
- Manages the team as a supervisor and mentor to accomplish engineering tasks and objectives; will perform team leadership for a team of engineers
- Sets goals and measures against them; changes processes and upskills the team as the need arises
Qualifications
- 14-17 years of experience developing desktop and web applications leveraging industry standards
- Strong knowledge of and experience in C, C++, VC++, MFC, Managed C++, WinForms, and Java application development
- 5+ years of people management experience, managing and mentoring a team of 8+ engineers and senior engineers
- Experience with OLAP databases (such as TM1/Infor/SAP/Oracle)
- Experience with Microsoft SQL Server Analysis Services (MDX)
- Knowledge of Linux (Ubuntu, RedHat)
- Proficiency in the Java Spring Boot framework; frontend with Vue.js preferred
- Knowledge of RESTful APIs and integration techniques
- Front-end development: HTML, CSS, JavaScript, Vue.js
- Familiarity with public clouds such as AWS or Azure
- Familiarity with real-time streaming systems
- Familiarity with version control systems like Git
- Strong problem-solving skills and attention to detail
- Excellent communication and leadership skills
- Collaborate with team members to define project requirements, priorities, and timelines
- Familiarity with relational databases and SQL
Good To Have
- Familiarity with Hazelcast in-memory data grid
- Shell scripting
Additional Information
All your information will be kept confidential according to EEO guidelines.
** At this time insightsoftware is not able to offer sponsorship to candidates who are not eligible to work in the country where the position is located. **
insightsoftware About Us: Hear From Our Team - InsightSoftware (wistia.com)
Background checks are required for employment with insightsoftware, where permitted by country, state/province. At insightsoftware, we are committed to equal employment opportunity regardless of race, color, ethnicity, ancestry, religion, national origin, gender, sex, gender identity or expression, sexual orientation, age, citizenship, marital or parental status, disability, veteran status, or other class protected by applicable law. We are proud to be an equal opportunity workplace.
Posted 3 weeks ago
20.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description
Over the past 20 years Amazon has earned the trust of over 300 million customers worldwide by providing unprecedented convenience, selection, and value on Amazon.com. By deploying Amazon Pay's products and services, merchants make it easy for these millions of customers to safely purchase from their third-party sites using the information already stored in their Amazon account. In this role, you will lead Data Engineering efforts to drive automation for the Amazon Pay organization. You will be part of the data engineering team that will envision, build, and deliver high-performance, fault-tolerant data pipelines. As a Data Engineer, you will be working with cross-functional partners from Science, Product, SDEs, Operations, and leadership to translate raw data into actionable insights for stakeholders, empowering them to make data-driven decisions.
Key job responsibilities
- Design, implement, and support a platform providing ad-hoc access to large data sets
- Interface with other technology teams to extract, transform, and load data from a wide variety of data sources
- Implement data structures using best practices in data modeling, ETL/ELT processes, and SQL, Redshift, and OLAP technologies
- Model data and metadata for ad-hoc and pre-built reporting
- Interface with business customers, gathering requirements and delivering complete reporting solutions
- Build robust and scalable data integration (ETL) pipelines using SQL, Python, and Spark
- Build and deliver high-quality data sets to support business analysts, data scientists, and customer reporting needs
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
Basic Qualifications
- 3+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with SQL
Preferred Qualifications
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Company - ADCI - Karnataka
Job ID: A3011919
Posted 3 weeks ago
6.0 years
0 Lacs
Vadodara, Gujarat, India
On-site
Company Description
Wiser Solutions is a suite of in-store and eCommerce intelligence and execution tools. We're on a mission to enable brands, retailers, and retail channel partners to gather intelligence and automate actions to optimize in-store and online pricing, marketing, and operations initiatives. Our Commerce Execution Suite is available globally.
Job Description
We are looking for a highly capable Senior Full Stack Engineer to be a core contributor in developing our suite of product offerings. If you love working on complex problems and writing clean code, you will love this role. Our goal is to solve a messy problem elegantly and cost-effectively. Our job is to collect, categorize, and analyze semi-structured data from different sources (20 million+ products from 500+ websites into our catalog of 500 million+ products). We help our customers discover new patterns in their data that can be leveraged so that they can become more competitive and increase their revenue.
Essential Functions:
- Think like our customers - you will work with product and engineering leaders to define intuitive solutions
- Design customer-facing UI and back-end services for various business processes
- Develop high-performance applications by writing testable, reusable, and efficient code
- Implement effective security protocols, data protection measures, and storage solutions
- Improve the quality of our solutions - you will hold yourself and your team members accountable to writing high-quality, well-designed, maintainable software
- Own your work - you will take responsibility to shepherd your projects from idea through delivery into production
- Bring new ideas to the table - some of our best innovations originate within the team
- Guide and mentor others on the team
Technologies We Use:
- Languages: NodeJS/NestJS/Typescript, SQL, React/Redux, GraphQL
- Infrastructure: AWS, Docker, Kubernetes, Terraform, GitHub Actions, ArgoCD
- Databases: Postgres, MongoDB, Redis, Elasticsearch, Trino, Iceberg
- Streaming and Queuing: Kafka, NATS, Keda
Qualifications
- 6+ years of professional software engineering/development experience
- Proficiency with architecting and delivering solutions within a distributed software platform
- Full stack engineering experience, including front-end frameworks (React/Typescript, Redux) and backend technologies such as NodeJS/NestJS/Typescript, GraphQL
- Proven ability to learn quickly, make pragmatic decisions, and adapt to changing business needs
- Proven ability to work effectively, prioritizing and organizing your work in a highly dynamic environment
- Proven track record of working in highly distributed, event-driven systems
- Strong proficiency working with RDBMS/NoSQL/Big Data solutions (Postgres, MongoDB, Trino, etc.)
- Solid understanding of Data Pipeline and Workflow Automation - orchestration tools, scheduling, and monitoring
- Solid understanding of ETL/ELT and OLTP/OLAP concepts
- Solid understanding of Data Lakes, Data Warehouses, and modeling practices (Data Vault, etc.) and experience leveraging data lake solutions (e.g., AWS Glue, DBT, Trino, Iceberg)
- Ability to clean, transform, and aggregate data using SQL or scripting languages
- Ability to design and estimate tasks and coordinate work with other team members during iteration planning
- Solid understanding of AWS, Linux, and infrastructure concepts
- Track record of lifting and challenging teammates to higher levels of achievement
- Experience measuring, driving, and improving the software engineering process
- Good testing habits and a strong eye for quality
- Outstanding organizational, communication, and relationship-building skills conducive to driving consensus; able to work well in a cross-functional environment
- Experience working in an agile team environment
- Ownership - feel a sense of personal accountability/responsibility to drive execution from start to finish
- Drive adoption of Wiser's Product Delivery organization principles across the department
Bonus Points
- Experience with CQRS
- Experience with Domain-Driven Design
- Experience with C4 modeling
- Experience working within a retail or ecommerce environment
- Experience with AI Coding Agents (Windsurf, Cursor, Claude, ChatGPT, etc.) - Prompt Engineering
Why Join Wiser Solutions?
- Work on an industry-leading product trusted by top retailers and brands.
- Be at the forefront of pricing intelligence and data-driven decision-making.
- A collaborative, fast-paced environment where your impact is tangible.
- Competitive compensation, benefits, and career growth opportunities.
Additional Information
EEO STATEMENT
Wiser Solutions, Inc. is an Equal Opportunity Employer and prohibits discrimination, harassment, and retaliation of any kind. Wiser Solutions, Inc. is committed to the principle of equal employment opportunity for all employees and applicants, providing a work environment free of discrimination, harassment, and retaliation. All employment decisions at Wiser Solutions, Inc. are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion, sex, national origin, family or parental status, disability, genetics, age, sexual orientation, veteran status, or any other status protected by state, federal, or local law. Wiser Solutions, Inc. will not tolerate discrimination, harassment, or retaliation based on any of these characteristics.
Posted 3 weeks ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Data Architect
Location: Noida, India
Data Architecture Design:
- Design, develop, and maintain the enterprise data architecture, including data models, database schemas, and data flow diagrams.
- Develop a data strategy and roadmap that aligns with business objectives and ensures the scalability of data systems.
- Architect both transactional (OLTP) and analytical (OLAP) databases, ensuring optimal performance and data consistency.
Data Integration & Management:
- Oversee the integration of disparate data sources into a unified data platform, leveraging ETL/ELT processes and data integration tools.
- Design and implement data warehousing solutions, data lakes, and/or data marts that enable efficient storage and retrieval of large datasets.
- Ensure proper data governance, including the definition of data ownership, security, and privacy controls in accordance with compliance standards (GDPR, HIPAA, etc.).
Collaboration with Stakeholders:
- Work closely with business stakeholders, including analysts, developers, and executives, to understand data requirements and ensure that the architecture supports analytics and reporting needs.
- Collaborate with DevOps and engineering teams to optimize database performance and support large-scale data processing pipelines.
Technology Leadership:
- Guide the selection of data technologies, including databases (SQL/NoSQL), data processing frameworks (Hadoop, Spark), cloud platforms (Azure is a must), and analytics tools.
- Stay updated on emerging data management technologies, trends, and best practices, and assess their potential application within the organization.
Data Quality & Security:
- Define data quality standards and implement processes to ensure the accuracy, completeness, and consistency of data across all systems.
- Establish protocols for data security, encryption, and backup/recovery to protect data assets and ensure business continuity.
Mentorship & Leadership:
- Lead and mentor data engineers, data modelers, and other technical staff in best practices for data architecture and management.
- Provide strategic guidance on data-related projects and initiatives, ensuring that all efforts are aligned with the enterprise data strategy.
Required Skills & Experience:
- Extensive data architecture expertise: over 7 years of experience in data architecture, data modeling, and database management.
- Proficiency in designing and implementing relational (SQL) and non-relational (NoSQL) database solutions.
- Strong experience with data integration tools (Azure tools are a must, plus any other third-party tools), ETL/ELT processes, and data pipelines.
- Advanced knowledge of data platforms: expertise in the Azure cloud data platform is a must; experience with other platforms such as AWS (Redshift, S3), Azure (Data Lake, Synapse), and/or Google Cloud Platform (BigQuery, Dataproc) is a bonus.
- Experience with big data technologies (Hadoop, Spark) and distributed systems for large-scale data processing.
- Hands-on experience with data warehousing solutions and BI tools (e.g., Power BI, Tableau, Looker).
- Data governance & compliance: strong understanding of data governance principles, data lineage, and data stewardship; knowledge of industry standards and compliance requirements (e.g., GDPR, HIPAA, SOX) and the ability to architect solutions that meet these standards.
- Technical leadership: proven ability to lead data-driven projects, manage stakeholders, and drive data strategies across the enterprise.
- Strong programming skills in languages such as Python, SQL, R, or Scala.
Certification: Azure Certified Solution Architect, Data Engineer, or Data Scientist certifications are mandatory.
Pre-Sales Responsibilities:
- Stakeholder Engagement: Work with product stakeholders to analyze functional and non-functional requirements, ensuring alignment with business objectives.
- Solution Development: Develop end-to-end solutions involving multiple products, ensuring security and performance benchmarks are established, achieved, and maintained.
- Proofs of Concept (POCs): Develop POCs to demonstrate the feasibility and benefits of proposed solutions.
- Client Communication: Communicate system requirements and solution architecture to clients and stakeholders, providing technical assistance and guidance throughout the pre-sales process.
- Technical Presentations: Prepare and deliver technical presentations to prospective clients, demonstrating how proposed solutions meet their needs and requirements.
Additional Responsibilities:
- Stakeholder Collaboration: Engage with stakeholders to understand their requirements and translate them into effective technical solutions.
- Technology Leadership: Provide technical leadership and guidance to development teams, ensuring the use of best practices and innovative solutions.
- Integration Management: Oversee the integration of solutions with existing systems and third-party applications, ensuring seamless interoperability and data flow.
- Performance Optimization: Ensure solutions are optimized for performance, scalability, and security, addressing any technical challenges that arise.
- Quality Assurance: Establish and enforce quality assurance standards, conducting regular reviews and testing to ensure robustness and reliability.
- Documentation: Maintain comprehensive documentation of the architecture, design decisions, and technical specifications.
- Mentoring: Mentor fellow developers and team leads, fostering a collaborative and growth-oriented environment.
Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Experience: Minimum of 7 years of experience in data architecture, with a focus on developing scalable and high-performance solutions.
- Technical Expertise: Proficient in architectural frameworks, cloud computing, database management, and web technologies.
- Analytical Thinking: Strong problem-solving skills, with the ability to analyze complex requirements and design scalable solutions.
- Leadership Skills: Demonstrated ability to lead and mentor technical teams, with excellent project management skills.
- Communication: Excellent verbal and written communication skills, with the ability to convey technical concepts to both technical and non-technical stakeholders.
Posted 3 weeks ago
5.0 - 12.0 years
0 Lacs
Karnataka
On-site
Job Description:
As an Engineering Manager, you will lead a high-performing team of 8-12 engineers and engineering leads in the end-to-end delivery of software applications through sophisticated CI/CD pipelines. Your role involves mentoring engineers to build scalable, resilient, and robust cloud-based solutions for Walmart's suite of products, contributing to quality and agility. Within Enterprise Business Services, the Risk Tech/Financial Services Compliance team focuses on designing, developing, and operating large-scale data systems and real-time applications. The team works on creating pipelines, aggregating data on Google Cloud Platform, and collaborating with various teams to provide technical solutions.
Key Responsibilities:
- Manage a team of engineers and engineering leads across multiple technology stacks, including Java, NodeJS, and Spark with Scala on GCP.
- Drive design, development, and documentation processes.
- Establish best engineering and operational practices based on product and scrum metrics.
- Interact with Walmart engineering teams globally, contribute to the tech community, and collaborate with product and business stakeholders.
- Work with senior leadership to plan the future roadmap of products, participate in hiring and mentoring, and lead technical vision and roadmap development.
- Prioritize feature development aligned with strategic objectives, establish clear expectations with team members, and engage in organizational events.
- Collaborate with business owners and technical teams globally, and develop a mid-term technical vision and roadmap to meet future requirements.
Qualifications:
- Bachelor's/Master's degree in Computer Science or a related field with a minimum of 12+ years of software development experience and at least 5+ years of managing engineering teams.
- Experience in managing agile technology teams, building Java and Scala-Spark backend systems, and working on cloud-based solutions.
- Proficiency in JavaScript, NodeJS, ReactJS, NextJS, CS fundamentals, microservices, data structures, and algorithms.
- Strong skills in CI/CD development environments/tools, writing modular and testable code, microservices architecture, and working with relational and NoSQL databases.
- Hands-on experience with technologies like Spring Boot, concurrency, RESTful services, and cloud platforms such as Azure and GCP.
- Knowledge of containerization tools like Docker and Kubernetes, and monitoring/alerting tools like Prometheus and Splunk.
- Ability to lead a team, contribute to technical design, and collaborate across geographies.
About Walmart Global Tech:
Walmart Global Tech is a team of software engineers, data scientists, and service professionals at the forefront of retail disruption. We innovate to impact millions and reimagine the future of retail, offering opportunities for personal growth, skill development, and innovation at scale.
Flexible Work Approach:
Our hybrid work model combines in-office and virtual presence, ensuring collaboration, flexibility, and personal development opportunities across our global team.
Benefits:
In addition to competitive compensation, we offer incentive awards, best-in-class benefits, maternity/paternity leave, health benefits, and more.
Equal Opportunity Employer:
Walmart, Inc. is committed to diversity, inclusivity, and valuing unique identities, experiences, and opinions. We strive to create an inclusive environment where all individuals are respected and valued.
Minimum Qualifications:
- Bachelor's degree in Computer Science or a related field with 5 years of experience in software engineering, or 7 years of experience in software engineering with 2 years of supervisory experience.
Preferred Qualifications:
- Master's degree in Computer Science or a related field with 3 years of experience in software engineering.
Location: Pardhanani Wilshire II, Cessna Business Park, Kadubeesanahalli Village, Varthur Hobli, India
Job ID: R-1998235
Posted 3 weeks ago
7.0 - 11.0 years
0 Lacs
Karnataka
On-site
As an Azure Data Factory Engineer at Aspire Systems, you will be responsible for designing, developing, and maintaining robust data pipelines and ETL processes. Your role will involve implementing and optimizing data storage solutions in data warehouses and data lakes. You should have strong experience with Microsoft Azure tools, including SQL Azure, Azure Data Factory, Azure Databricks, and Azure Data Lake, coupled with excellent communication skills.
Key Responsibilities:
- Design, develop, and maintain robust data pipelines and ETL processes.
- Implement and optimize data storage solutions in data warehouses and data lakes.
- Collaborate with cross-functional teams to understand data requirements and deliver high-quality data solutions.
- Utilize Microsoft Azure tools for data integration, transformation, and analysis.
- Develop and maintain reports and dashboards using Power BI and other analytics tools.
- Ensure data integrity, consistency, and security across all data systems.
- Optimize database and query performance to support data-driven decision-making.
Qualifications:
- 7-10 years of professional experience in data engineering or a related field.
- Profound expertise in SQL, T-SQL, database design, and data warehousing principles.
- Strong experience with Microsoft Azure tools, including SQL Azure, Azure Data Factory, Azure Databricks, and Azure Data Lake.
- Proficiency in Python, PySpark, and PySQL for data processing and analytics tasks.
- Experience with Power BI and other reporting and analytics tools.
- Demonstrated knowledge of OLAP, data warehouse design concepts, and performance optimization in database and query processing.
- Excellent problem-solving, analytical, and communication skills.
Join Aspire Systems, a global technology services firm that serves as a trusted technology partner for over 275 customers worldwide. Aspire collaborates with leading enterprises in Banking, Insurance, Retail, and ISVs, helping them leverage technology to thrive in the digital era. With a focus on Software Engineering & Digital Technologies, Aspire enables companies to operate smart business models. The company's core philosophy of "Attention. Always." reflects its commitment to providing exceptional care and attention to customers and employees. Aspire Systems is CMMI Level 3 certified and has a global workforce of over 4,900 employees, operating across North America, LATAM, Europe, the Middle East, and Asia Pacific. Explore more about Aspire Systems at https://www.aspiresys.com/. Aspire Systems has been consistently recognized as one of the Top 100 Best Companies to Work For by the Great Place to Work Institute for 12 consecutive years.
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description
As a Data Engineer, you will be part of a Data and Analytics (D&A) team responsible for building data pipelines that enable us to make informed decisions across the entire organization. This is a great opportunity to make a real impact on the course of the company, which makes data-based decisions as part of its Data + Analytics Strategy. The Data Engineer is responsible for the design, development, testing, and implementation of automated data pipelines for the Enterprise Data Warehouse, hosted in the Cloud. The Data Engineer works closely with Business Intelligence / Analytics teams and business users to understand requirements, translate them into technical design, develop data pipelines, and implement solutions in the Enterprise Data Warehouse (Redshift).
Primary Responsibilities Include
- Analyze existing and create new stored procedures that involve complex data models and business rules.
- Build data pipelines utilizing ETL transformation tools such as Informatica or AWS Glue.
- Actively participate through all phases of the project cycle, from ideation to post-implementation stabilization.
- Work with business and technical peers to define how best to meet requirements, balancing speed and robustness.
- Build high-quality, maintainable SQL OLAP/analytic functions following established patterns and coding practices (a brief illustrative sketch follows this posting).
- Analyze technical data to establish solutions that achieve complex data transformations.
- Participate in and perform testing to ensure data quality and integrity via unit, integration, regression, and UAT testing.
- Create and maintain process design, data model, and operations documentation.
- Assist in the maintenance of the codebase, unit tests, and related technical design docs and configurations.
- Engage and collaborate with stakeholders via the Agile process, identifying and mitigating risks and issues as needed.
- Maintain software velocity and quality for deliverables, holding oneself accountable to commitments.
Job Requirements (Minimum Competencies Required for Job Performance)
- Experience in PL/SQL scripting and query optimization, required.
- Experience with AWS (Amazon Web Services) Redshift, Oracle, or PostgreSQL, preferred.
- Experience with Informatica PowerCenter and/or Informatica Cloud / IDMC, preferred.
- Experience in data model design, dimensional data modeling, and complex stored procedure development, required.
- Strong analytical skills, synthesizing information with attention to detail and accuracy to establish patterns and solutions.
- Experience with AWS, e.g., S3, PySpark, Glue, Redshift, Lambda, preferred.
- Experience with data lakehouse platforms, e.g., Databricks, Snowflake, preferred.
- Experience in scripting languages, e.g., Python, Scala, Java, Unix Shell, Bash, preferred.
- Experience operating in Agile and Waterfall development methodologies, preferred.
- Experience building data visualization solutions using BI platforms, e.g., Tableau, Power BI, Qlik, preferred.
- Capable of balancing technology ideals and business objectives, evaluating options and implications.
- Must possess strong written and verbal communication skills.
- Manages and prioritizes work effectively with minimal supervision, seeking and offering help as needed to achieve goals.
- Adaptable to change and able to work independently and as part of a team.
- Applies curiosity and creativity to solve problems, seeking opportunities and overcoming challenges with resourcefulness.
- High bias for action in meeting commitments and deadlines; effectively sees, communicates, and mitigates risks and issues.
- Active participant in the development community; seeks and offers guidance, coaching, and professional development.
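As a hedged illustration of the "SQL OLAP/analytic functions" work this posting mentions, here is a minimal sketch run through PySpark, one of the preferred tools listed above. The orders table, its columns, and the sample values are all invented for the example; a real pipeline would read from the warehouse rather than an in-memory DataFrame.

```python
# A minimal, hypothetical sketch of SQL OLAP/analytic (window) functions,
# executed via PySpark so the example stays self-contained.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("olap-analytic-sketch").getOrCreate()

# Invented sample data standing in for a warehouse table.
orders = spark.createDataFrame(
    [("A", "2024-01-01", 100.0), ("A", "2024-01-02", 150.0),
     ("B", "2024-01-01", 80.0), ("B", "2024-01-03", 120.0)],
    ["customer_id", "order_date", "amount"],
)
orders.createOrReplaceTempView("orders")

# Classic analytic functions: a running total and a row rank per customer.
spark.sql("""
    SELECT customer_id,
           order_date,
           amount,
           SUM(amount) OVER (PARTITION BY customer_id ORDER BY order_date)
               AS running_total,
           ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date)
               AS order_rank
    FROM orders
""").show()
```

Unlike GROUP BY aggregation, these window functions leave the row count unchanged, which is what makes them the usual tool for reporting-layer calculations.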
Posted 3 weeks ago
7.0 - 10.0 years
25 - 30 Lacs
Pune
Hybrid
Role Description
We are seeking a senior, skilled Tableau and Google BigQuery professional to join our team for a project involving the modernization of existing Tableau reports on Google BigQuery.
Skills & Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field, with 8+ years of experience in the IT/software field.
- Proven experience working with Tableau, including creating and maintaining dashboards and reports.
- Prior experience working with Cognos, including creating and maintaining dashboards and reports.
- Strong understanding of SQL and database concepts.
- Familiarity with ETL processes and data validation techniques (a brief illustrative sketch follows this posting).
- Hands-on experience with Google BigQuery and related components/services like Airflow, Composer, etc.
- Strong communication and collaboration abilities.
- Good to have: prior experience in data/report migration from on-premises to cloud.
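The data-validation work mentioned above can be made concrete with a small sketch using the public google-cloud-bigquery client. This is illustrative only, not the project's actual code: the project, dataset, and table names are placeholders, and the real validation queries would depend on the reports being migrated.

```python
# A hedged sketch of a row-count validation against BigQuery, of the kind
# used when checking that a migrated report's data source matches the
# legacy one. Dataset/table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT report_date, COUNT(*) AS row_count
    FROM `example_project.example_dataset.sales_summary`
    GROUP BY report_date
    ORDER BY report_date
"""
for row in client.query(query).result():
    print(row.report_date, row.row_count)
```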
Posted 3 weeks ago
2.0 - 3.0 years
5 - 9 Lacs
Vadodara
Work from Office
Job Description
We are looking for a highly capable Senior Full Stack Engineer to be a core contributor in developing our suite of product offerings. If you love working on complex problems and writing clean code, you will love this role. Our goal is to solve a messy problem elegantly and cost-effectively. Our job is to collect, categorize, and analyze semi-structured data from different sources (20 million+ products from 500+ websites into our catalog of 500 million+ products). We help our customers discover new patterns in their data that can be leveraged so that they can become more competitive and increase their revenue.
Essential Functions:
- Think like our customers - you will work with product and engineering leaders to define intuitive solutions
- Design customer-facing UI and back-end services for various business processes
- Develop high-performance applications by writing testable, reusable, and efficient code
- Implement effective security protocols, data protection measures, and storage solutions
- Improve the quality of our solutions - you will hold yourself and your team members accountable to writing high-quality, well-designed, maintainable software
- Own your work - you will take responsibility to shepherd your projects from idea through delivery into production
- Bring new ideas to the table - some of our best innovations originate within the team
- Guide and mentor others on the team
Technologies We Use:
- Languages: NodeJS/NestJS/Typescript, SQL, React/Redux, GraphQL
- Infrastructure: AWS, Docker, Kubernetes, Terraform, GitHub Actions, ArgoCD
- Databases: Postgres, MongoDB, Redis, Elasticsearch, Trino, Iceberg
- Streaming and Queuing: Kafka, NATS, Keda
Qualifications
- 6+ years of professional software engineering/development experience
- Proficiency with architecting and delivering solutions within a distributed software platform
- Full stack engineering experience, including front-end frameworks (React/Typescript, Redux) and backend technologies such as NodeJS/NestJS/Typescript, GraphQL
- Proven ability to learn quickly, make pragmatic decisions, and adapt to changing business needs
- Proven ability to work effectively, prioritizing and organizing your work in a highly dynamic environment
- Proven track record of working in highly distributed, event-driven systems
- Strong proficiency working with RDBMS/NoSQL/Big Data solutions (Postgres, MongoDB, Trino, etc.)
- Solid understanding of Data Pipeline and Workflow Automation - orchestration tools, scheduling, and monitoring
- Solid understanding of ETL/ELT and OLTP/OLAP concepts
- Solid understanding of Data Lakes, Data Warehouses, and modeling practices (Data Vault, etc.) and experience leveraging data lake solutions (e.g., AWS Glue, DBT, Trino, Iceberg)
- Ability to clean, transform, and aggregate data using SQL or scripting languages
- Ability to design and estimate tasks and coordinate work with other team members during iteration planning
- Solid understanding of AWS, Linux, and infrastructure concepts
- Track record of lifting and challenging teammates to higher levels of achievement
- Experience measuring, driving, and improving the software engineering process
- Good testing habits and a strong eye for quality
- Outstanding organizational, communication, and relationship-building skills conducive to driving consensus; able to work well in a cross-functional environment
- Experience working in an agile team environment
- Ownership - feel a sense of personal accountability/responsibility to drive execution from start to finish
- Drive adoption of Wiser's Product Delivery organization principles across the department
Bonus Points
- Experience with CQRS
- Experience with Domain-Driven Design
- Experience with C4 modeling
- Experience working within a retail or ecommerce environment
- Experience with AI Coding Agents (Windsurf, Cursor, Claude, ChatGPT, etc.) - Prompt Engineering
Why Join Wiser Solutions?
- Work on an industry-leading product trusted by top retailers and brands.
- Be at the forefront of pricing intelligence and data-driven decision-making.
- A collaborative, fast-paced environment where your impact is tangible.
- Competitive compensation, benefits, and career growth opportunities.
Posted 3 weeks ago
3.0 - 12.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Job Title: SSIS Developer
Location: Hyderabad, Bangalore, Chennai, Pune
Experience: 7-12 Years
Key Responsibilities:
- Design, develop, and deploy robust ETL processes using SSIS.
- Develop OLAP cubes and tabular models using SSAS for business intelligence reporting.
- Build and optimize data models to support analytical and reporting needs.
- Design and implement data warehouse architectures and schemas (Star, Snowflake).
- Perform data integration, cleansing, and transformation from multiple sources.
- Ensure data quality and consistency across all platforms.
- Work closely with business analysts and stakeholders to understand data requirements.
- Monitor and improve ETL performance and resolve any issues in data pipelines.
- Document technical processes and data flows.
Required Skills & Qualifications:
- 3+ years of hands-on experience with SSIS and SSAS.
- Strong understanding of data modeling concepts (conceptual, logical, and physical models).
- Proven experience in designing and implementing data warehouses.
- Proficient in T-SQL and SQL Server.
- Experience with performance tuning of ETL processes.
- Familiarity with BI tools like Power BI and Tableau (preferred but not mandatory).
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork abilities.
Preferred Qualifications:
- Experience with Azure Data Factory, Synapse Analytics, or other cloud-based data services.
- Knowledge of data governance and data quality best practices.
- Bachelor's degree in Computer Science, Information Systems, or a related field.
Posted 3 weeks ago
4.0 - 6.0 years
4 - 8 Lacs
Chennai
Work from Office
AI & Data Warehouse (DWH)
Pando is a global leader in supply chain technology, building the world's quickest time-to-value Fulfillment Cloud platform. Pando's Fulfillment Cloud provides manufacturers, retailers, and 3PLs with a single pane of glass to streamline end-to-end purchase order fulfillment and customer order fulfillment to improve service levels, reduce carbon footprint, and bring down costs. As a partner of choice for Fortune 500 enterprises globally, with a presence across APAC, the Middle East, and the US, Pando is recognized as a Technology Pioneer by the World Economic Forum (WEF), and as one of the fastest growing technology companies by Deloitte.
Role
As a Senior AI and Data Warehouse Engineer at Pando, you will be responsible for building and scaling the data and AI services team. You will drive the design and implementation of highly scalable, modular, and reusable data pipelines, leveraging big data technologies and low-code implementations. This is a senior leadership position where you will work closely with cross-functional teams to deliver solutions that power advanced analytics, dashboards, and AI-based insights.
Key Responsibilities
- Lead the development of scalable, high-performance data pipelines using PySpark or other big data ETL pipeline technologies (a brief illustrative sketch follows this posting).
- Drive data modeling efforts for analytics, dashboards, and knowledge graphs.
- Oversee the implementation of parquet-based data lakes.
- Work on OLAP databases, ensuring optimal data structure for reporting and querying.
- Architect and optimize large-scale enterprise big data implementations with a focus on modular and reusable low-code libraries.
- Collaborate with stakeholders to design and deliver AI and DWH solutions that align with business needs.
- Mentor and lead a team of engineers, building out the data and AI services organization.
Requirements
- 4 to 6 years of experience in big data and AI technologies, with expertise in PySpark or similar big data ETL pipeline technologies.
- Strong proficiency in SQL and OLAP database technologies.
- Firsthand experience with data modeling for analytics, dashboards, and knowledge graphs.
- Proven experience with parquet-based data lake implementations.
- Expertise in building highly scalable, high-volume data pipelines.
- Experience with modular, reusable, low-code-based implementations.
- Involvement in large-scale enterprise big data implementations.
- Initiative-taker with strong motivation and the ability to lead a growing team.
Preferred
- Experience leading a team or building out a new department.
- Experience with cloud-based data platforms and AI services.
- Familiarity with supply chain technology or fulfillment platforms is a plus.
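To make the "parquet-based data lake" responsibility above concrete, here is a minimal PySpark sketch of the write pattern; this is an assumption-laden illustration, not Pando's actual code. The bucket paths, event schema, and column names are all invented.

```python
# Illustrative-only sketch of a parquet data-lake write with PySpark.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("parquet-lake-sketch").getOrCreate()

# Ingest raw events (schema inferred here only for brevity).
raw = spark.read.json("s3a://example-bucket/raw/shipments/")

# Light transformation: derive a date column to partition by.
curated = raw.withColumn("event_date", F.to_date("event_ts"))

# Write partitioned parquet so downstream OLAP queries can prune by date.
(curated.write
        .mode("append")
        .partitionBy("event_date")
        .parquet("s3a://example-bucket/curated/shipments/"))
```

Partitioning on a date column is the usual design choice here: it lets downstream query engines scan only the days a report needs instead of the whole lake.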
Posted 3 weeks ago
15.0 - 20.0 years
50 - 55 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
About Netskope
Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, St. Louis, Bangalore, London, Paris, Melbourne, Taipei, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive, and interactive. Visit us at Netskope Careers. Please follow us on LinkedIn and Twitter @Netskope.
About the team:
We are a distributed team of passionate engineers dedicated to continuous learning and building impactful products. Our core product is built from the ground up in Scala, leveraging the Lightbend stack (Play/Akka).
About Data Security Posture Management (DSPM)
DSPM is designed to provide comprehensive data visibility and contextual security for the modern AI-driven enterprise. Our platform automatically discovers, classifies, and secures sensitive data across diverse environments including AWS, Azure, Google Cloud, Oracle Cloud, and on-premise infrastructure. DSPM is critical in empowering CISOs and security teams to enforce secure AI practices, reduce compliance risk, and maintain continuous data protection at scale. As a key member of the DSPM team, you will contribute to developing innovative and scalable systems designed to protect the exponentially increasing volume of enterprise data. Our platform continuously maps sensitive data across all major cloud providers and on-prem environments, automatically detecting and classifying sensitive and regulated data types such as PII, PHI, financial, and healthcare information. It flags data at risk of exposure, exfiltration, or misuse and helps prioritize issues based on sensitivity, exposure, and business impact.
What you'll do:
- Drive the enhancement of Data Security Posture Management (DSPM) capabilities by enabling the detection of sensitive or risky data utilized in (but not limited to) training private LLMs or accessed by public LLMs.
- Improve the DSPM platform to extend product support to all major cloud infrastructures, on-prem deployments, and any new upcoming technologies.
- Provide technical leadership in all phases of a project, from discovery and planning through implementation and delivery.
- Contribute to product development: understand customer requirements and work with the product team to design, plan, and implement features.
- Support customers by investigating and fixing production issues.
- Help us improve our processes and make the right tradeoffs between agility, cost, and reliability.
- Collaborate with teams across geographies.
What you bring to the table:
- 15+ years of software development experience with enterprise-grade software.
- Must have experience in building scalable, high-performance cloud services.
- Expert coding skills in Scala or Java.
- Development on cloud platforms including AWS.
- Deep knowledge of databases and data warehouses (OLTP, OLAP).
- Analytical and troubleshooting skills.
- Experience working with Docker and Kubernetes.
- Able to multitask and wear many hats in a fast-paced environment. This week you might lead the design of a new feature; next week you are fixing a critical bug or improving our CI infrastructure.
- Cybersecurity experience and adversarial thinking is a plus.
- Expertise in building REST APIs.
- Experience leveraging cloud-based AI services (e.g., AWS Bedrock, SageMaker), vector databases, and Retrieval-Augmented Generation (RAG) architectures is a plus.
Education: Bachelor's degree in Computer Science; Master's degree strongly preferred.
#LI-SK3
Posted 3 weeks ago
3.0 - 8.0 years
9 - 19 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Job Description: ODI/OBIEE/OACS Developer
Experience: 3-8 Years
Job Location: Pan India
ODI:
- Strong IT experience in the development and implementation of Business Intelligence and Data Warehousing solutions using ODI.
- Knowledge of analysis, design, development, customization, implementation, and maintenance of Oracle Data Integrator (ODI).
- Experience in designing, implementing, and maintaining ODI load plans and processes.
- Working knowledge of ODI, PL/SQL, TOAD, data modelling (logical/physical), Star/Snowflake schemas, fact and dimension tables, ELT, and OLAP.
OBIEE/OACS:
- Strong experience with OBIEE and Oracle Analytics Cloud (OAC).
- Any knowledge of OBIA (BI Apps), ODI, Oracle Autonomous Data Warehouse (ADW), or OTBI is a plus.
- Must have a good understanding of Oracle Data Visualization, user stories, and data model development (with or without RPD).
- Experience in configuring OBIEE/OACS security (authentication and authorization; object-level and data-level security).
- Strong oral and written communication skills to influence others, as well as the ability to think clearly, analyze quantitatively, and prioritize with sound judgment.
- Expert in using Oracle Analytics (Answers/DV) for DV projects or dashboards.
- Good experience with SQL, PL/SQL, and database functions.
Posted 3 weeks ago
5.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
VAM Systems is a Business Consulting, IT Solutions and Services company. VAM Systems is currently looking for a Data Engineering Analyst for our Bahrain operations with the following skill sets and terms and conditions:
Qualifications:
· Bachelor's Degree
· Engineer (B.E.) / MCA
· Certification in SQL/SAS
Experience: 5-8 years
Key Objectives
· Support the finance team on data and analytics activities and the Data Warehouse (DWH), based on a profound knowledge of banking, financial reporting, and data engineering.
Analytical/Technical Skills:
· Understanding of finance and risk reporting systems/workflows; previous experience participating in a system implementation is desirable.
· Hands-on experience with MS Excel.
· Prior project management/stakeholder management experience is desired.
Responsibilities
· Coordinate and interact with the finance business partner to support daily finance data analysis, hierarchical mappings, and understanding (root cause analysis) of identified data issues.
· Exceptional comprehension of finance, risk, and data warehousing to guarantee accurate and reconciled reporting (e.g., balance-sheet exposure, profit and loss).
· Mastering the intersection of finance, data analysis, and data engineering.
· Conduct reviews of data quality and reconciliations for finance reports, and maintain reporting logic/programs.
· Support the finance team in ad-hoc requests, organizing data for financial/regulatory reports, data mapping, and performing UAT.
· Ensure the consistency of the bank's data architecture, data flows, and business logic in accordance with Data Management guidelines, development standards, and data architecture by working closely with the Finance and Data Engineering teams to identify issues and develop sustainable data-driven solutions.
· Expertise in writing and documenting complex SQL queries, procedures, and functions, creating algorithms that automate important financial interactions and data controls.
· Experience in handling SAS ETL jobs, data transformation, validation, analysis, and performance tuning.
· SAS skill set, with strong experience in SAS Management Console, SAS DI, SAS Enterprise Guide, Base SAS, SAS Web Report Studio, SAS Delivery Portal, SAS OLAP Cube Studio, SAS Information Maps, SAS BI, SAS Stored Processes, and SAS datasets and libraries.
Terms and conditions
Joining time frame: 15-30 days.
The selected candidates shall join VAM Systems - Bahrain and shall be deputed to one of the leading banks in Bahrain.
Should you be interested in this opportunity, please send your latest resume at the earliest to ashiq.salahudeen@vamsystems.com.
Posted 3 weeks ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Welcome to Warner Bros. Discovery… the stuff dreams are made of.
Who We Are…
When we say, "the stuff dreams are made of," we're not just referring to the world of wizards, dragons, and superheroes, or even to the wonders of Planet Earth. Behind WBD's vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms, and the dreamers creating what's next…
From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best self. Here you are supported, here you are celebrated, here you can thrive.
Your New Role
The Business Intelligence Analyst will support the ongoing design and development of dashboards, reports, and other analytics studies or needs. To be successful in the role, you'll need to be intellectually curious, detail-oriented, open to new ideas, and possess data skills and a strong aptitude for quantitative methods. The role requires strong SQL skills and wide experience using BI visualization tools like Tableau and Power BI.
Your Role Accountabilities
- With the support of other analysis and technical teams, collect and analyze stakeholders' requirements.
- Develop interactive and user-friendly dashboards and reports, partnering with UI/UX designers.
- Be experienced in BI tools like Power BI, Tableau, Looker, MicroStrategy, and Business Objects, and be capable of and eager to learn new and other tools.
- Be able to quickly shape data into reporting and analytics solutions.
- Work with the data and visualization platform team on reporting tool updates, understanding how new features can benefit our stakeholders in the future, and adapting existing dashboards and reports.
- Have knowledge of database fundamentals such as multidimensional database design, relational database design, and more.
Qualifications & Experiences
- 2+ years of experience working with BI tools or in any data-specific role, with sound knowledge of database management, data modeling, business intelligence, SQL querying, data warehousing, and online analytical processing (OLAP).
- Skills in BI tools and BI systems such as Power BI, SAP BO, Tableau, Looker, MicroStrategy, etc.: creating data-rich dashboards, implementing row-level security (RLS) in Power BI, writing DAX expressions, and developing custom BI products with scripting and programming languages such as R, Python, etc.
- In-depth understanding of and experience with BI stacks.
- The ability to drill down on data and visualize it in the best possible way through charts, reports, or dashboards.
- Self-motivated and eager to learn.
- Ability to communicate with business as well as technical teams.
- Strong client management skills.
- Ability to learn and quickly respond to a rapidly changing business environment.
- An analytical and problem-solving mindset and approach.
Not Required But Preferred Experience
- BA/BS or MA/MS in a design-related field, or equivalent experience (relevant degree subjects include computer science, digital design, graphic design, web design, and web technology).
- Understanding of software development architecture and technical aspects.
How We Get Things Done…
This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done.
You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview. Championing Inclusion at WBD Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
Posted 3 weeks ago
0 years
4 - 8 Lacs
Hyderābād
On-site
Ready to shape the future of work?
At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.
Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
Inviting applications for the role of Lead Consultant - Snowflake & Informatica Cloud Data Engineer.
Responsibilities
- Design and implement scalable data solutions in Snowflake following data engineering best practices and a layered architecture.
- Design and implement scalable data pipelines and ETL/ELT processes using dbt, integrated with Snowflake for modern cloud data warehousing.
- Develop and optimize transformation logic and storage structures in Snowflake using SQL, Python, and Airflow (a brief illustrative sketch follows this posting).
- Collaborate with business and technical teams to translate data requirements into robust dbt-on-Snowflake integration solutions.
- Ensure data quality, security, and compliance by applying governance best practices across data transformation pipelines and within the Snowflake environments.
- Perform performance tuning in Snowflake and streamline ETL pipelines for efficient execution, supported by clear documentation of architecture and integration patterns.
Qualifications we seek in you!
Minimum Qualifications
- Bachelor's degree in information science, data management, computer science, or a related field preferred.
- Must have experience in the Cloud Data Engineering domain.
- Proven experience in cloud data engineering using Snowflake and Informatica, with hands-on delivery of end-to-end data pipeline implementations.
- Strong knowledge of data warehousing, ELT/ETL design, OLAP concepts, and dimensional modelling using Snowflake, with experience in projects delivering complete data solutions.
- Hands-on expertise in developing, scheduling, and orchestrating scalable ETL/ELT pipelines using Informatica Cloud or PowerCenter.
- Proficiency in Python for data transformation and automation tasks integrated with Snowflake environments.
- Excellent communication and documentation skills, with the ability to clearly articulate Snowflake architectures and Informatica workflows.
- Experience implementing data quality, lineage, and governance frameworks using Informatica and Snowflake capabilities.
- Familiarity with CI/CD practices for deploying Informatica workflows and Snowflake objects within DevOps environments.
Why join Genpact?
Why join Genpact?
- Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation
- Make an impact – Drive change for global enterprises and solve business challenges that matter
- Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture – Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Lead Consultant
Primary Location: India-Hyderabad
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jul 8, 2025, 11:34:27 PM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
Posted 3 weeks ago
6.0 years
0 Lacs
Noida
On-site
Hello! You've landed on this page, which means you're interested in working with us. Let's take a sneak peek at what it's like to work at Innovaccer.

Engineering at Innovaccer
With every line of code, we accelerate our customers' success, turning complex challenges into innovative solutions. Collaboratively, we transform each data point we gather into valuable insights for our customers. Join us and be part of a team that's turning dreams of better healthcare into reality, one line of code at a time. Together, we're shaping the future and making a meaningful impact on the world.

Join Us in Transforming Healthcare with the Power of Data & AI
We're on a mission to completely change the way healthcare works by building the most powerful Healthcare Intelligence Platform ever made. Using an AI-first approach, our goal is to turn complicated health data into real-time insights that help hospitals, clinics, pharmaceutical companies, and researchers make faster, smarter decisions.

We're building a unified platform from the ground up — specifically for healthcare. This platform will bring together everything from:
- Collecting data from different systems (Data Acquisition)
- Combining and cleaning it (Integration, Data Quality)
- Managing patient records and provider info (Master Data Management)
- Tagging and organizing it (Data Classification & Governance)
- Running analytics and building AI models (Analytics, AI Studio)
- Creating custom healthcare apps (App Marketplace)
- Using AI as a built-in assistant (AI as BI + Agent-first approach)

This platform will let healthcare teams build solutions quickly — without starting from scratch each time. For example, they'll be able to:
- Track and manage kidney disease patients across different hospitals
- Speed up clinical trials by analyzing real-world patient data
- Help pharmacies manage their stock better with predictive supply chain tools
- Detect early signs of diseases like diabetes or cancer with machine learning
- Ensure regulatory compliance automatically through built-in checks

This is a huge, complex, and high-impact challenge, and we're looking for an SDE III to help lead the way. In this role, you'll:
- Design and build scalable, secure, and reliable systems
- Create core features like data quality checks, metadata management, data lineage tracking, and privacy/compliance layers
- Work closely with other engineers, product managers, and healthcare experts to bring the platform to life

If you're passionate about using technology to make a real difference in the world — and enjoy solving big engineering problems — we'd love to connect.

Your Role
We at Innovaccer are looking for a Software Development Engineer-III (Backend) to build the most amazing product experience. You'll get to work with other engineers to build a delightful feature experience to understand and solve our customers' pain points.

A Day in the Life
- Building efficient and reusable applications and abstractions
- Identify and communicate back-end best practices
- Participate in the project life-cycle from pitch/prototyping through definition and design to build, integration, QA, and delivery
- Analyse and improve the performance, scalability, stability, and security of the product
- Improve engineering standards, tooling, and processes

What You Need
- 6+ years of experience with a start-up mentality and a high willingness to learn
- Expert in Python, Go, or Java, and experience with any web framework (relevant to the language)
- Aggressive problem diagnosis and creative problem-solving skills
- Expert in Kubernetes and containerization
- Experience in RDBMS & NoSQL databases such as Postgres and MongoDB (any OLAP database is good to have)
- Experience in solution architecture
- Experience in cloud service providers such as AWS or Azure
- Experience in Kafka, RabbitMQ, or other queuing services is good to have
- Working experience in Big Data / Distributed Systems and async programming is a must-have
- Bachelor's degree in Computer Science/Software Engineering

What We Offer
- Generous Leave Benefits: Enjoy generous leave benefits of up to 40 days.
- Parental Leave: Experience one of the industry's best parental leave policies to spend time with your new addition.
- Sabbatical Leave Policy: Want to focus on skill development, pursue an academic career, or just take a break? We've got you covered.
- Health Insurance: We offer health benefits and insurance to you and your family for medically related expenses related to illness, disease, or injury.
- Pet-Friendly Office*: Spend more time with your treasured friends, even when you're away from home. Bring your furry friends with you to the office and let your colleagues become their friends, too. (*Noida office only)
- Creche Facility for Children*: Say goodbye to worries and hello to a convenient and reliable creche facility that puts your child's well-being first. (*India offices)

Where and how we work
Our Noida office is situated in a posh techspace, equipped with various amenities to support our work environment. Here, we follow a five-day work schedule, allowing us to efficiently carry out our tasks and collaborate effectively within our team.

Innovaccer is an equal-opportunity employer. We celebrate diversity, and we are committed to fostering an inclusive and diverse workplace where all employees, regardless of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, or veteran status, feel valued and empowered.

About Innovaccer
Innovaccer Inc. is the data platform that accelerates innovation. The Innovaccer platform unifies patient data across systems and care settings and empowers healthcare organizations with scalable, modern applications that improve clinical, financial, operational, and experiential outcomes. Innovaccer's EHR-agnostic solutions have been deployed across more than 1,600 hospitals and clinics in the US, enabling care delivery transformation for more than 96,000 clinicians, and helping providers work collaboratively with payers and life sciences companies. Innovaccer has helped its customers unify health records for more than 54 million people and generate over $1.5 billion in cumulative cost savings. The Innovaccer platform is the #1 rated Best-in-KLAS data and analytics platform by KLAS, and the #1 rated population health technology platform by Black Book. For more information, please visit innovaccer.com. Check us out on YouTube, Glassdoor, LinkedIn, and innovaccer.com.
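As a rough illustration of the "data quality checks" feature work this role mentions, here is a minimal pandas-based sketch; the column names and plausibility thresholds are hypothetical, not Innovaccer's actual schema.

```python
# Illustrative data-quality gate for a pipeline stage, using pandas.
# Table and column names are hypothetical, not a real production schema.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable failures; an empty list means the batch passes."""
    failures = []
    if df["patient_id"].isna().any():
        failures.append("null patient_id values found")
    if df["patient_id"].duplicated().any():
        failures.append("duplicate patient_id values found")
    if (df["age"] < 0).any() or (df["age"] > 130).any():
        failures.append("age outside plausible range [0, 130]")
    return failures

# Small demonstration batch with two deliberate defects.
batch = pd.DataFrame({
    "patient_id": [1, 2, 2, 4],
    "age": [34, -1, 57, 88],
})
for problem in run_quality_checks(batch):
    print("FAIL:", problem)
```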
Disclaimer: Innovaccer does not charge fees or require payment from individuals or agencies for securing employment with us. We do not guarantee job spots or engage in any financial transactions related to employment. If you encounter any posts or requests asking for payment or personal information, we strongly advise you to report them immediately to our HR department at px@innovaccer.com. Additionally, please exercise caution and verify the authenticity of any requests before disclosing personal and confidential information, including bank account details.
Posted 3 weeks ago
0 years
4 - 8 Lacs
Calcutta
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change—we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Lead Consultant – Snowflake & Informatica Cloud Data Engineer.

Responsibilities
- Design and implement scalable data solutions in Snowflake, following data engineering best practices and a layered architecture
- Design and implement scalable data pipelines and ETL/ELT processes using dbt, integrated with Snowflake for modern cloud data warehousing
- Develop and optimize transformation logic and storage structures in Snowflake using SQL, Python, and Airflow
- Collaborate with business and technical teams to translate data requirements into robust dbt-on-Snowflake integration solutions
- Ensure data quality, security, and compliance by applying governance best practices across data transformation pipelines and within Snowflake environments
- Perform performance tuning in Snowflake and streamline ETL pipelines for efficient execution, supported by clear documentation of architecture and integration patterns

Qualifications we seek in you!
Minimum Qualifications
- Bachelor's degree in information science, data management, computer science, or a related field preferred
- Must have experience in the Cloud Data Engineering domain
- Proven experience in cloud data engineering using Snowflake and Informatica, with hands-on delivery of end-to-end data pipeline implementations
- Strong knowledge of data warehousing, ELT/ETL design, OLAP concepts, and dimensional modelling using Snowflake, with experience in projects delivering complete data solutions
- Hands-on expertise in developing, scheduling, and orchestrating scalable ETL/ELT pipelines using Informatica Cloud or PowerCenter
- Proficiency in Python for data transformation and automation tasks integrated with Snowflake environments
- Excellent communication and documentation skills, with the ability to clearly articulate Snowflake architectures and Informatica workflows
- Experience implementing data quality, lineage, and governance frameworks using Informatica and Snowflake capabilities
- Familiarity with CI/CD practices for deploying Informatica workflows and Snowflake objects within DevOps environments
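Since this role also calls out Airflow for orchestration, here is a hedged sketch of what a daily ELT DAG could look like, assuming Airflow 2.4+; the DAG id, task names, and callables are invented for illustration, not an actual Genpact pipeline.

```python
# Hypothetical Airflow 2.4+ DAG sketching the extract -> transform -> validate
# orchestration pattern the posting describes; all identifiers are invented.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_to_stage():
    print("pull source files into the staging area")

def transform_in_snowflake():
    print("run set-based SQL transformations in Snowflake")

def validate_results():
    print("check row counts and freshness before publishing")

with DAG(
    dag_id="snowflake_elt_example",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_stage", python_callable=extract_to_stage)
    transform = PythonOperator(task_id="transform_in_snowflake", python_callable=transform_in_snowflake)
    validate = PythonOperator(task_id="validate_results", python_callable=validate_results)

    # Linear dependency chain: each step runs only after the previous succeeds.
    extract >> transform >> validate
```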
Why join Genpact?
- Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation
- Make an impact – Drive change for global enterprises and solve business challenges that matter
- Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture – Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Lead Consultant
Primary Location: India-Kolkata
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jul 8, 2025, 11:22:19 PM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
Posted 3 weeks ago
3.0 - 7.0 years
18 - 34 Lacs
Hyderabad
Work from Office
Hiring Data Engineers (4+ yrs) in Bengaluru/Hyderabad. Design ETL pipelines, build data lakes/warehouses, and ensure data quality. Skills: Python, SQL, Airflow, Kafka, BigQuery, Spark, AWS/GCP. Work with analysts, PMs, and ML teams. Benefits: health insurance, provident fund.
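As a hedged illustration of the pipeline work this posting lists, here is a minimal PySpark batch job of the read-clean-write shape typical of data lake builds; the paths and column names are placeholders, not this employer's environment.

```python
# Illustrative PySpark batch ETL: read raw CSVs from a (hypothetical) landing
# zone, clean them, and write partitioned Parquet to the curated layer.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl_example").getOrCreate()

# Read raw CSV files; header row supplies column names.
raw = spark.read.option("header", True).csv("s3a://landing/orders/*.csv")

# Basic cleaning: drop rows missing the key, parse the timestamp,
# derive a partition column, and deduplicate on the key.
clean = (
    raw.dropna(subset=["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .dropDuplicates(["order_id"])
)

# Write to the curated layer of the data lake as date-partitioned Parquet.
clean.write.mode("overwrite").partitionBy("order_date").parquet("s3a://curated/orders/")
spark.stop()
```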
Posted 3 weeks ago
5.0 years
0 Lacs
India
Remote
Job Summary
We are looking for a highly skilled Data Engineer to join our team on a 6-month remote full-time contract basis. We are seeking an expert in Python data engineering who can build and maintain our data infrastructure and also develop user-friendly web interfaces for our existing Python codebase. The ideal candidate will be a proactive problem-solver, adept at bridging the gap between raw data and actionable insights through both custom applications and powerful Power BI dashboards.

Main Responsibilities:
• Design, build, and optimize scalable ETL pipelines using Python and SQL Server Integration Services (SSIS)
• Develop and integrate functional and intuitive web-based user interfaces for our existing Python applications using frameworks like Django, FastAPI, or similar
• Create, deploy, and manage insightful and interactive dashboards and reports using Power BI to meet diverse business needs
• Ensure performance, reliability, and data integrity across systems and dashboards
• Document data workflows and monitor ETL pipelines

Key Requirements:
• Proven experience as a Data Engineer with a strong portfolio of successful data projects
• Expertise in Python for data engineering, including deep knowledge of libraries such as Pandas, NumPy, and SQLAlchemy
• Demonstrable experience developing web user interfaces with Python frameworks like Django or FastAPI
• Strong proficiency in Power BI dashboard development, including data modeling, DAX queries, and visualization best practices
• Solid experience with Microsoft SQL Server and SQL Server Integration Services
• Excellent communicator; must be able to explain technical jargon in layman's terms
• Self-starter with a flexible, can-do attitude; capable of multitasking, structured, and works well within a team environment
• Ability to work with minimal guidance or supervision in a time-critical environment

Preferred Skills – Not Mandatory:
• Exposure to Azure Synapse Analytics, Azure Data Factory, or other Azure data services
• Experience with cloud-based data architectures, data lakes, or big data platforms
• A strong background in data warehousing concepts and best practices (e.g., dimensional modeling, ETL/ELT patterns) is highly welcome
• Understanding of OLAP cube development using SQL Server Analysis Services (SSAS)

Contract Details:
• Type: Contract (6 months, with potential for extension)
• Remote: 100% remote
• Working Hours: Must be willing to work the standard shift: 8 AM – 5 PM, GMT+3 (Qatar time)

Qualifications
• Bachelor's degree in IT, Computer Science, or a related field is desired but not mandatory
• Minimum 5 years' experience in data warehousing, reporting, and analytics-related fields
• Professional certificates in analytics, data engineering, or data warehousing will be a plus

Other Requirements
• The candidate must adhere to our Ethics and Code of Conduct
• Work timings may change as per our business requirements, and the candidate is expected to adhere to the same
• The candidate will be bound by our Information Security policies

Work Location: Online
Project Type: One Time
Expected Start Date: 17 Aug, 2025
Expected End Date: 28 Feb, 2026
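To make the "web interface over an existing Python codebase" requirement concrete, here is a minimal sketch combining FastAPI, SQLAlchemy, and pandas; the connection string, table, and endpoint are hypothetical, not this employer's systems.

```python
# Minimal sketch of a web interface over an existing Python data codebase,
# using FastAPI and SQLAlchemy; connection string and query are placeholders.
import pandas as pd
from fastapi import FastAPI
from sqlalchemy import create_engine

app = FastAPI(title="Reporting API (example)")

# Hypothetical SQL Server connection via the pyodbc driver.
engine = create_engine(
    "mssql+pyodbc://user:pass@server/Reporting?driver=ODBC+Driver+17+for+SQL+Server"
)

@app.get("/sales/summary")
def sales_summary():
    # Run the aggregation in the database, shape the result with pandas,
    # and return JSON-serializable records.
    df = pd.read_sql(
        "SELECT region, SUM(amount) AS total FROM dbo.sales GROUP BY region",
        engine,
    )
    return df.to_dict(orient="records")
```

Assuming the file is saved as main.py, it could be served locally with `uvicorn main:app --reload`.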
Posted 3 weeks ago
4.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Roles & Responsibilities
- At least 4 years of professional database development (SQL Server or Azure SQL Database) and ETL using Talend and SSIS in OLAP/OLTP environments
- Bachelor's or master's degree in computer science or equivalent preferred
- Experience in continuous delivery environments
- Experience with Agile (Scrum/Kanban) software development methodologies; automated testing and deployment implementation a plus
- Experience deploying and managing in-house/cloud-hosted data solutions
- Experience with large-scale systems involving reporting, transactional systems, and integration with other enterprise systems
- Experience with source/version control systems
- Successful history of working with high-performing technology teams

Technical Skills
- Proficiency with multiple ETL tools, including Talend and SSIS (Azure Data Factory is a bonus)
- Proficiency in SQL development in Microsoft SQL Server (Azure SQL Database is a bonus)
- Experience with SQL query performance optimization
- Experience with industry development standards and their implementation
- Proficiency in system analysis and design
- Analysis and verification of technical requirements for completeness, consistency, feasibility, and testability
- Identification and resolution of technical issues through unit testing, debugging, and investigation
- Version control, including branching and merging, using services like GitHub or Azure DevOps

Experience
4.5-6 Years

Skills
Primary Skill: SQL Development
Sub Skill(s): SQL Development
Additional Skill(s): SQL Development, Oracle PL/SQL Development, PostgreSQL Development, ETL, SQL, Talend

About The Company
Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley. We engineer business outcomes for Fortune 500 companies and digital natives in the technology, healthcare, insurance, travel, telecom, and retail & CPG industries using technologies such as cloud, microservices, automation, IoT, and artificial intelligence. We accelerate experience-led transformation in the delivery of digital platforms. Infogain is also a Microsoft (NASDAQ: MSFT) Gold Partner and Azure Expert Managed Services Provider (MSP). Infogain, an Apax Funds portfolio company, has offices in California, Washington, Texas, the UK, the UAE, and Singapore, with delivery centers in Seattle, Houston, Austin, Kraków, Noida, Gurgaon, Mumbai, Pune, and Bengaluru.
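On the "SQL query performance optimization" skill above: a small, runnable demonstration of predicate sargability, using SQLite's EXPLAIN QUERY PLAN so it runs anywhere; the same principle (don't wrap indexed columns in functions) applies to SQL Server, though the exact plan output differs.

```python
# Demonstrates why predicate shape matters: a function-wrapped column forces
# a scan, while an equivalent range predicate allows an index seek.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, order_date TEXT)")
con.execute("CREATE INDEX ix_orders_date ON orders (order_date)")

# Wrapping the indexed column in a function defeats index seeks (SCAN)...
non_sargable = "SELECT COUNT(*) FROM orders WHERE strftime('%Y', order_date) = '2024'"
# ...while an equivalent range predicate lets the optimizer SEARCH the index.
sargable = (
    "SELECT COUNT(*) FROM orders "
    "WHERE order_date >= '2024-01-01' AND order_date < '2025-01-01'"
)

for label, sql in [("non-sargable", non_sargable), ("sargable", sargable)]:
    plan = con.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    print(label, "->", plan)
```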
Posted 3 weeks ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Join us as a "Platform Engineer – BI" at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. You may be assessed on the key critical skills relevant for success in the role, such as risk and control, change and transformations, business acumen, strategic thinking, and digital technology, as well as job-specific skillsets.

To be successful as a "Platform Engineer – BI", you should have experience with:

Basic/Essential Qualifications
- Strong understanding of SAP BO architecture and deployment strategies for high availability
- Install, configure, and maintain the SAP BusinessObjects platform (e.g., BI Platform, Web Intelligence, Crystal Reports, Analysis for OLAP/Office)
- Develop custom Java applications and scripts to extend SAP BO functionality
- Integrate SAP BO with other enterprise systems using Java APIs and SDKs
- Create and maintain custom web applications for reporting and data analysis
- Provide technical support to end-users and resolve SAP BO related issues
- Writing efficient, reusable, and testable code in Python
- Collaborating with cross-functional teams, including front-end developers, to design and implement software features

Desirable Skillsets/Good To Have
- At least 4+ years of work experience in SAP BO administration, including Java, and in any Python web framework, such as Django, Flask, or Pyramid
- Experience with Power BI administration will be an added advantage
- Deep understanding of multi-process (define, design, and create) cloud architecture projects and the threading limitations of Python
- Experience in Agile software development
- Familiarity with ORM libraries
- Knowledge of popular Python libraries and frameworks
- Ability to test and debug tools
- Experience in OOP/functional coding
- Professional certifications in a similar area will be an added advantage but are not mandatory

This role will be based out of Pune.

Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities
- Build and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data
- Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures
- Development of processing and analysis algorithms fit for the intended data complexity and volumes
- Collaboration with data scientists to build and deploy machine learning models

Analyst Expectations
- Will have an impact on the work of related teams within the area
- Partner with other functions and business areas
- Take responsibility for the end results of a team's operational processing and activities
- Escalate breaches of policies/procedures appropriately
- Take responsibility for embedding new policies/procedures adopted due to risk mitigation
- Advise and influence decision-making within own area of expertise
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to
- Deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct
- Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services, and processes within the function
- Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function
- Make evaluative judgements based on the analysis of factual information, paying attention to detail
- Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents
- Guide and persuade team members and communicate complex/sensitive information
- Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
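As a minimal example of the "efficient, reusable, and testable code in Python" expectation in a web-framework context, here is a sketch in Flask; the endpoint and payload are illustrative only, not a Barclays system.

```python
# Tiny Flask service: a single side-effect-free endpoint that is trivially
# unit-testable via Flask's built-in test client.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/health")
def health():
    # Fixed payload, no external dependencies: easy to assert on in tests.
    return jsonify(status="ok")

if __name__ == "__main__":
    # Quick smoke test with the test client before serving for real.
    with app.test_client() as client:
        assert client.get("/health").get_json() == {"status": "ok"}
    app.run(port=8080)
```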
Posted 3 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Data Model Specialist
Location: Work From Office – Chennai
Employment Type: Contract

About The Role
As a Data Model Specialist, you will focus on designing, validating, and maintaining efficient data models that align with enterprise data strategies. You'll work closely with engineers, analysts, and business teams to ensure the accuracy and usability of our data assets.

Responsibilities
- Design scalable and consistent data models for transactional and analytical systems
- Review and enhance existing models for performance and accuracy
- Maintain documentation and enforce data modeling standards
- Support integration of data across platforms and tools
- Provide guidance on best practices for modeling complex business domains

Requirements
- 5+ years of experience in data modeling, data architecture, or database design
- Advanced skills in SQL and data modeling software
- Understanding of OLTP, OLAP, and data lake environments
- Strong problem-solving and analytical skills
- Familiarity with metadata and data lineage tools

Nice To Have
- Experience in cross-functional data projects
- Exposure to data governance and data quality frameworks

Skills: data modeling, data models, data modeling software, data architecture, database design, SQL, OLTP, OLAP, data lake environments, metadata tools, data lineage tools, problem-solving, analytical skills
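To make the modeling vocabulary this role uses (transactional vs. analytical, fact and dimension tables) concrete, here is a self-contained sketch of a tiny star schema using SQLite; the schema and data are invented for illustration.

```python
# Sketch of a minimal star schema (one fact table, two dimensions) plus an
# OLAP-style rollup query; all names and values are illustrative.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20250101
    full_date TEXT NOT NULL,
    year INTEGER NOT NULL
);
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_name TEXT NOT NULL,
    region TEXT NOT NULL
);
CREATE TABLE fact_sales (
    date_key INTEGER NOT NULL REFERENCES dim_date (date_key),
    customer_key INTEGER NOT NULL REFERENCES dim_customer (customer_key),
    amount REAL NOT NULL
);
""")

con.execute("INSERT INTO dim_date VALUES (20250101, '2025-01-01', 2025)")
con.execute("INSERT INTO dim_customer VALUES (1, 'Acme', 'South')")
con.execute("INSERT INTO fact_sales VALUES (20250101, 1, 1250.0)")

# Aggregate the fact over dimension attributes: the canonical star-join rollup.
rows = con.execute("""
    SELECT d.year, c.region, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_customer c ON c.customer_key = f.customer_key
    GROUP BY d.year, c.region
""").fetchall()
print(rows)
```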
Posted 3 weeks ago