25068 ETL Jobs - Page 45

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 years

10 Lacs

Noida

On-site

We at Beesolver Technologies are looking for an experienced Senior Laravel/PHP Developer with a strong foundation in backend development, MySQL, and e-commerce systems. The ideal candidate will be responsible for designing scalable applications, integrating data pipelines, and driving feature delivery in a collaborative Agile team setup.

Job Title: PHP Laravel Developer – Ecommerce & Data Integration
Job Type: Full-Time
Experience: 5+ Years
Location: Noida/Chd.-Mohali
Work Timing: 10 AM to 7 PM
Industry: IT Services / E-commerce / SaaS
Functional Area: Software Development

Roles and Responsibilities:
Develop secure and scalable backend applications using Laravel (PHP)
Design, optimize, and manage MySQL schemas and performance tuning
Build and maintain ETL/ESB pipelines for data synchronization across systems
Work with Laravel queues, events, jobs, and scheduler
Develop REST APIs and manage third-party integrations (shipping, CRM, payments)
Collaborate with cross-functional Agile teams: developers, testers, product owners
Implement and follow best practices in code quality, testing, and CI/CD

Technical Skills:
5+ years of PHP development experience (3+ years in Laravel)
Strong experience with MySQL (joins, indexes, stored procedures, tuning)
ETL/ESB knowledge using Laravel Jobs, Talend, or Apache NiFi
Skilled in REST API design, integration, and OAuth/webhooks

Ecommerce Domain Knowledge:
Hands-on experience with major ecommerce platforms
Strong understanding of ecommerce business processes, including:
Product catalogs, variants, and SKU management
Order lifecycle management (cart, checkout, order placement, payment, fulfilment, return/refund)
Inventory management, stock sync, and warehouse integration
Shipping and logistics API integrations
ERP and CRM system integration for unified data flow
Knowledge of ecommerce KPIs and data reporting (RFM, CLTV, conversion rate)

Preferred Skills (Good to Have):
Experience with RabbitMQ, Kafka, or any messaging system
Exposure to Talend, Apache NiFi, or Pentaho
Familiarity with DDD and clean/hexagonal architecture patterns
Basic experience on cloud platforms: AWS, Azure, or GCP

Education:
UG: B.Tech/B.E. in Computer Science, IT or BCA
PG: MCA, M.Tech (preferred)

Job Types: Full-time, Permanent
Pay: Up to ₹89,785.33 per month
Benefits: Paid time off, Provident Fund
Schedule: Day shift
Supplemental Pay: Yearly bonus
Work Location: In person
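To make the data-synchronization responsibility above concrete, here is a minimal sketch of an extract-transform-load order sync. It is written in Python purely for illustration (in the role itself this would be a Laravel queued job), and the endpoint, field, and table names are hypothetical:

```python
# Minimal, stack-agnostic sketch of the order-sync ETL idea described above.
# The endpoint, payload fields, and table names are hypothetical.
import requests
import sqlite3  # stand-in for MySQL so the sketch runs anywhere


def extract_orders(api_url: str, token: str) -> list[dict]:
    """Pull recent orders from a (hypothetical) ecommerce REST API."""
    resp = requests.get(api_url, headers={"Authorization": f"Bearer {token}"}, timeout=30)
    resp.raise_for_status()
    return resp.json()["orders"]


def transform(order: dict) -> tuple:
    """Flatten the nested order payload into a row for the sync table."""
    return (order["id"], order["status"], float(order["total"]), order["placed_at"])


def load(rows: list[tuple], db_path: str = "orders.db") -> None:
    """Upsert rows so re-runs of the sync are idempotent."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS orders "
                "(id TEXT PRIMARY KEY, status TEXT, total REAL, placed_at TEXT)")
    con.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?, ?)", rows)
    con.commit()
    con.close()


if __name__ == "__main__":
    orders = extract_orders("https://shop.example.com/api/orders", token="...")
    load([transform(o) for o in orders])
```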

Posted 1 week ago

Apply

5.0 years

1 - 10 Lacs

Noida

On-site

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

We are seeking a talented and motivated Data Engineer to join our growing data team. You will play a key role in building scalable data pipelines, optimizing data infrastructure, and enabling data-driven solutions.

Primary Responsibilities:
Design, develop, and maintain scalable ETL/ELT pipelines for batch and real-time data processing
Build and optimize data models and data warehouses to support analytics and reporting
Collaborate with analysts and software engineers to deliver high-quality data solutions
Ensure data quality, integrity, and security across all systems
Monitor and troubleshoot data pipelines and infrastructure for performance and reliability
Contribute to internal tools and frameworks to improve data engineering workflows
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
5+ years of experience working on commercially available software and/or healthcare platforms as a Data Engineer
3+ years of solid experience designing and building enterprise data solutions on cloud
1+ years of experience developing solutions hosted within public cloud providers such as Azure or AWS, or private cloud/container-based systems using Kubernetes/OpenShift
Experience with modern relational databases
Experience with data warehousing services, preferably Snowflake
Experience using modern software engineering and product development tools, including Agile/SAFe, Continuous Integration, Continuous Delivery, DevOps, etc.
Solid experience operating in a quickly changing environment and driving technological innovation to meet business requirements
Skilled at optimizing SQL statements
Subject matter expertise in cloud technologies (preferably Azure) and the Big Data ecosystem

Preferred Qualifications:
Experience with real-time data streaming and event-driven architectures
Experience building Big Data solutions on public cloud (Azure)
Experience building data pipelines on Azure using Databricks Spark, Scala, Azure Data Factory, Kafka and Kafka Streams, App Services, and Azure Functions
Experience developing RESTful services in .NET, Java or any other language
Experience with DevOps in data engineering
Experience with microservices architecture
Exposure to DevOps practices and infrastructure-as-code (e.g., Terraform, Docker)
Knowledge of data governance and data lineage tools
Ability to establish repeatable processes and best practices and to implement version control software in a cloud team environment

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
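As an illustration of the batch and real-time ETL work described above, here is a hedged PySpark Structured Streaming sketch; the Kafka topic, message schema, and output paths are assumptions for the example, not anything specified by the posting (the Kafka source also requires the spark-sql-kafka package on the classpath):

```python
# Minimal PySpark Structured Streaming sketch of a real-time ETL step.
# Topic, schema, and path names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("claims-stream-etl").getOrCreate()

# Expected shape of each Kafka message (an assumption for the example).
schema = StructType([
    StructField("claim_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "claims")
       .load())

# Transform: parse the JSON payload and keep only valid rows.
parsed = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("c"))
          .select("c.*")
          .where(col("claim_id").isNotNull()))

# Load: append to storage with checkpointing so restarts resume safely.
query = (parsed.writeStream
         .format("parquet")
         .option("path", "/data/claims")
         .option("checkpointLocation", "/chk/claims")
         .start())
query.awaitTermination()
```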

Posted 1 week ago

Apply

3.0 years

4 - 6 Lacs

Noida

On-site

Date: Aug 1, 2025
Company: Ingenico
Location: Noida, IN, 201301

Customer Support – BI & Reporting Analyst

Job Summary:
We are looking for a skilled and analytical BI & Reporting Analyst to support our Customer Support team through advanced reporting, visualization, and data insights. This role is critical in translating ITIL-aligned support data into actionable dashboards and reports to drive performance, SLA adherence, and continuous improvement.

Key Responsibilities:
Design, build, and maintain interactive dashboards in Power BI that visualize KPIs across Incident, Problem, Change, and Request Management processes.
Collaborate with ITSM and Customer Support teams to identify reporting needs aligned with ITIL practices and service goals.
Translate raw data from ITSM tools (e.g., Jira Service Management, ServiceNow, BMC Remedy) into clean, structured datasets suitable for reporting.
Provide insights into support performance, SLA compliance, ticket volumes, resolution times, backlog trends, and user satisfaction metrics.
Develop data models, queries, and metrics that support operational and strategic decision-making.
Ensure accuracy, consistency, and availability of real-time and historical data for dashboards and reports.
Document and maintain data definitions, report logic, and dashboard usage guidelines.
Support audits, compliance tracking, and executive reporting with on-demand and scheduled data visualizations.
Continuously identify opportunities to automate reporting and improve data accessibility and storytelling.

Required Qualifications:
3+ years of hands-on experience designing Power BI dashboards and reports, preferably in an IT or customer support-focused organization.
Strong knowledge of ITIL frameworks and ITSM processes (especially Incident, Problem, and Change Management).
Experience working with ITSM platforms such as Jira Service Management, ServiceNow, or BMC Remedy.
Understanding of support operations, service metrics (SLAs, KPIs), and reporting requirements in customer support or service desk environments.
Strong analytical thinking and attention to detail.
Familiarity with Excel, SQL, and other data tools.

Preferred Qualifications:
ITIL Foundation Certification (v3 or v4).
Experience with automated data pipelines or ETL tools.
Experience integrating data from multiple systems (CRM, ITSM, HR systems, etc.).
Familiarity with tools like Tableau or Excel VBA as secondary platforms.

Our Culture & Values
At Ingenico, we thrive on innovation, collaboration, and delivering customer value. Our values of Trust, Innovation, and Care define how we work and grow together. We challenge the status quo, push boundaries, and deliver results as a team.

Diversity & Inclusion
Ingenico is proud to be an equal opportunity employer. We are committed to fostering an inclusive environment where every employee feels respected and empowered.

Ready to Make an Impact?
Join us and help shape the future of payments across Asia. Apply now.

Learn more about Ingenico:
Ingenico Global Website: https://www.ingenico.com
Ingenico LinkedIn: https://www.linkedin.com/company/ingenico/
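To illustrate the kind of SLA-adherence and resolution-time metrics this role would surface in Power BI, here is a small pandas sketch; the ticket columns and SLA targets are assumed values for the example, not Ingenico's actual definitions:

```python
# Illustrative sketch (assumed column names) of SLA-compliance and
# resolution-time metrics computed from an ITSM ticket export.
import pandas as pd

# Hypothetical ticket export from an ITSM tool (Jira SM / ServiceNow style).
tickets = pd.DataFrame({
    "ticket_id": ["T1", "T2", "T3"],
    "opened_at": pd.to_datetime(["2025-08-01 09:00", "2025-08-01 10:00", "2025-08-02 08:00"]),
    "resolved_at": pd.to_datetime(["2025-08-01 12:30", "2025-08-02 11:00", "2025-08-02 09:15"]),
    "priority": ["P1", "P2", "P1"],
})

# SLA targets in hours per priority (assumed values for the example).
sla_hours = {"P1": 4, "P2": 24}

tickets["resolution_hours"] = (
    (tickets["resolved_at"] - tickets["opened_at"]).dt.total_seconds() / 3600
)
tickets["met_sla"] = tickets["resolution_hours"] <= tickets["priority"].map(sla_hours)

# Metrics a dashboard would show: SLA compliance rate and mean time to resolve.
summary = tickets.groupby("priority").agg(
    tickets=("ticket_id", "count"),
    sla_compliance=("met_sla", "mean"),
    avg_resolution_hours=("resolution_hours", "mean"),
)
print(summary)
```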

Posted 1 week ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Noida

On-site

The Position
We are seeking a skilled Data Engineer to join our dynamic team. In this role, you will play a pivotal part in designing and implementing custom solutions that support complex financial and IP calculations, reporting, and data transformations. Your work will directly contribute to improving our clients' operational efficiency and decision-making capabilities.

What you will do:
Problem-Solving: Develop innovative solutions to complex challenges in financial calculations, rights management, and process optimization.
Data Engineering Solutions: Design, build, and maintain scalable data pipelines for migration, cleansing, transformation, and integration tasks, ensuring high-quality data outcomes.
Database Development & Maintenance: Configure, implement, and refine stored procedures and queries to ensure optimal performance, scalability, and maintainability of database systems.
ETL & Data Migration: Develop robust ETL (Extract, Transform, Load) processes that integrate data from diverse sources, ensuring seamless migration and transformation for downstream analytics and reporting.
Automation & Scripting: Create and implement automated scripts and tools to streamline routine database tasks, reduce manual intervention, and improve overall operational efficiency.
Collaboration: Partner with cross-functional teams to align data engineering efforts with broader business objectives and deliver seamless solutions that drive value across the organization.
IP Commerce Data Expertise: Leverage deep knowledge of financial and rights data to develop creative solutions that address client needs and advance business goals.
Process Improvement: Continuously identify opportunities to optimize workflows, automate repetitive tasks, and enhance efficiency in data processing and delivery.

What you will bring to the role:
Must-Have:
Minimum 3-5 years of experience in a database developer or analyst position.
Bachelor's degree in Computer Science, Engineering, or equivalent work experience.
Exceptional analytical thinking and problem-solving capabilities.
Strong verbal and written communication skills with the ability to articulate technical concepts clearly.
Proficiency in analyzing complex financial or IP data sets.
Hands-on experience with engineering principles, including designing and implementing scalable solutions.
Strong attention to detail and commitment to ensuring data accuracy and integrity.

Preferred:
Experience working with SQL and/or Python for data manipulation and analysis.
Experience working in finance or IP-related industries, with an understanding of their unique challenges and requirements.
Familiarity with handling large-scale datasets and cloud-based platforms (e.g., AWS, Azure, Google Cloud).
Knowledge of DevOps practices and CI/CD pipelines to streamline database management and deployment.
Understanding of data warehousing architectures and business intelligence tools for advanced analytics.
Certifications in relevant database technologies (e.g., Microsoft Certified: Azure Database Administrator Associate or Oracle Certified Professional) are a bonus.

Shift: Flexible (US & UK shifts)

Equal Employment Opportunity
Rightsline is an equal opportunity workplace. All candidates will be afforded equal opportunity through the recruiting process. We do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, disability, gender identity and/or expression. We are dedicated to growing a diverse team of highly talented individuals and creating an inclusive environment where everyone feels empowered to bring their authentic selves to work.

Apply Today
If you want to join a company that strives for a mission, purpose and making an impact, we encourage you to apply today.

Posted 1 week ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Overview
PepsiCo Data BI & Integration Platforms is seeking an experienced Cloud Platform Databricks SME, responsible for overseeing platform administration, security, new NPI tools integration, migrations, platform maintenance and other platform administration activities on Azure/AWS. The ideal candidate will have hands-on experience with Azure/AWS services: Infrastructure as Code (IaC), platform provisioning and administration, cloud network design, cloud security principles and automation.

Responsibilities
The Databricks Subject Matter Expert (SME) plays a pivotal role in administration, security best practices, platform sustain support, new tools adoption, cost optimization, and supporting new patterns and design solutions using the Databricks platform. Here's a breakdown of typical responsibilities:

Core Technical Responsibilities
Architect and optimize big data pipelines using Apache Spark, Delta Lake, and Databricks-native tools.
Design scalable data ingestion and transformation workflows, including batch and streaming (e.g., Kafka, Spark Structured Streaming).
Create integration guidelines to configure and integrate Databricks with existing security tools relevant to data access control.
Implement data security and governance using Unity Catalog, access controls, and data classification techniques.
Support migration of legacy systems to Databricks on cloud platforms like Azure, AWS, or GCP.
Manage cloud platform operations with a focus on FinOps support, optimizing resource utilization, cost visibility, and governance across multi-cloud environments.

Collaboration & Advisory
Act as a technical advisor to data engineering and analytics teams, guiding best practices and performance tuning.
Partner with architects and business stakeholders to align Databricks solutions with enterprise goals.
Lead proof-of-concept (PoC) initiatives to demonstrate Databricks capabilities for specific use cases.

Strategic & Leadership Contributions
Mentor junior engineers and promote knowledge sharing across teams.
Contribute to platform adoption strategies, including training, documentation, and internal evangelism.
Stay current with Databricks innovations and recommend enhancements to existing architectures.

Specialized Expertise (Optional but Valuable)
Machine learning and AI integration using MLflow, AutoML, or custom models.
Cost optimization and workload sizing for large-scale data processing.
Compliance and audit readiness for regulated industries.

Qualifications
Bachelor's degree in computer science.
At least 12 years of experience in IT cloud infrastructure, architecture and operations, including security, with at least 5 years in a platform admin role.
Strong understanding of data security principles and best practices.
Expertise in the Databricks platform, its security features, Unity Catalog, and data access control mechanisms.
Experience with data classification and masking techniques.
Strong understanding of cloud cost management, with hands-on experience in usage analytics, budgeting, and cost optimization strategies across multi-cloud platforms.
Strong knowledge of cloud architecture, design, and deployment principles and practices, including microservices, serverless, containers, and DevOps.
Deep expertise in Azure/AWS big data and analytics technologies, including Databricks, real-time data ingestion, data warehouses, serverless ETL, NoSQL databases, DevOps, Kubernetes, virtual machines, web/function apps, and monitoring and security tools.
Deep expertise in Azure/AWS networking and security fundamentals, including network endpoints and network security groups, firewalls, external/internal DNS, load balancers, virtual networks and subnets.
Proficient in scripting and automation tools, such as PowerShell, Python, Terraform, and Ansible.
Excellent problem-solving, analytical, and communication skills, with the ability to explain complex technical concepts to non-technical audiences.
Certifications in Azure/AWS/Databricks platform administration, networking and security are preferred.
Strong self-organization, time management and prioritization skills.
A high level of attention to detail, excellent follow-through, and reliability.
Strong collaboration, teamwork and relationship-building skills across multiple levels and functions in the organization.
Ability to listen, establish rapport, and build credibility as a strategic partner vertically within the business unit or function, as well as with leadership and functional teams.
Strategic thinker focused on business value results that utilize technical solutions.
Strong communication skills in writing, speaking, and presenting.
Capable of working effectively in a multi-tasking environment.
Fluent in English.
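As a hedged illustration of the Unity Catalog access-control work listed above, the sketch below grants least-privilege access from a Databricks notebook or job; the catalog, schema, table, and group names are hypothetical, while the GRANT statements follow standard Unity Catalog SQL:

```python
# Hedged sketch: expressing Unity Catalog access control from Databricks.
# Catalog, schema, table, and group names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # on Databricks, spark is provided

# Classify data by isolating it in a dedicated schema, then grant
# least-privilege access to a single group.
spark.sql("CREATE SCHEMA IF NOT EXISTS main.finance_restricted")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.finance_restricted TO `finance-analysts`")
spark.sql("GRANT SELECT ON TABLE main.finance_restricted.payments TO `finance-analysts`")

# Audit current grants when reviewing the security posture.
spark.sql("SHOW GRANTS ON SCHEMA main.finance_restricted").show(truncate=False)
```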

Posted 1 week ago

Apply

6.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

We are seeking a Data Modeler with 6+ years of experience to join our Data & Analytics (D&A) Service Line team. The ideal candidate will possess strong expertise in data modeling techniques (conceptual, logical, and physical) and have hands-on experience with Persistent Data Layer (PDL) and Logical Data Layer (LDL) design. A solid functional understanding of the payments domain, covering payment gateways, cards, cashflows, and related processes, is essential for this role. The Data Modeler will collaborate closely with business stakeholders, data architects, and technical teams to design robust and scalable data models that drive analytics, reporting, and operational efficiencies within the payments ecosystem.

Responsibilities:
Design, develop, and maintain conceptual, logical, and physical data models aligned with business requirements in the payments domain.
Develop Persistent Data Layer (PDL) and Logical Data Layer (LDL) schemas ensuring data integrity, consistency, and optimal performance.
Leverage deep functional knowledge of payments systems, including payment gateways, card processing, cashflows, transaction flows, settlement, and reconciliation processes.
Work with business analysts, data architects, and developers to translate business needs into data model designs that support analytics, BI, and reporting solutions.
Ensure adherence to data governance policies, standards, and best practices, including data security and compliance requirements relevant to payments.
Maintain comprehensive documentation of data models, data dictionaries, metadata, and technical specifications.
Support ETL teams and data engineers in data mapping, lineage, and integration tasks related to payments data.
Analyze and troubleshoot data-related issues and work towards continuous improvement of data models and related processes.
Engage with cross-functional teams and external vendors to ensure alignment and understanding of payments data requirements.
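To make the logical-to-physical modeling step concrete, here is a minimal sketch of how a payments transaction entity might be realized as a physical schema using SQLAlchemy; the entities, columns, and types are illustrative assumptions, not any client's actual model:

```python
# Hedged sketch: a logical payments model realized as physical DDL via
# SQLAlchemy. Entity and column names are hypothetical.
from sqlalchemy import Column, DateTime, ForeignKey, Numeric, String, create_engine
from sqlalchemy.orm import DeclarativeBase, relationship


class Base(DeclarativeBase):
    pass


class Merchant(Base):
    __tablename__ = "merchant"
    merchant_id = Column(String(36), primary_key=True)
    name = Column(String(200), nullable=False)


class PaymentTransaction(Base):
    __tablename__ = "payment_transaction"
    txn_id = Column(String(36), primary_key=True)
    merchant_id = Column(String(36), ForeignKey("merchant.merchant_id"), nullable=False)
    card_token = Column(String(64))              # tokenized PAN, never the raw card number
    amount = Column(Numeric(18, 4), nullable=False)
    currency = Column(String(3), nullable=False)
    status = Column(String(20), nullable=False)  # e.g., AUTHORIZED, SETTLED, REFUNDED
    created_at = Column(DateTime, nullable=False)
    merchant = relationship("Merchant")


# Emit the physical DDL against a local database to inspect it.
engine = create_engine("sqlite:///payments.db")
Base.metadata.create_all(engine)
```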

Posted 1 week ago

Apply

5.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Please find the job description for this role below: the candidate should have a mix of business analytics and data engineering experience, with at least 5-7 years of experience; a strong understanding of ETL concepts; experience with Spark; and a good grasp of data modeling and data warehousing concepts.

Posted 1 week ago

Apply

56.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Join our transformation within the RMG Data Engineering team in Hyderabad and you will have the opportunity to work with a collaborative and dynamic network of technologists. Our teams play a pivotal role in implementing data products, creating impactful visualizations, and delivering seamless data solutions to downstream systems. At Macquarie, our advantage is bringing together diverse people and empowering them to shape all kinds of possibilities. We are a global financial services group operating in 31 markets and with 56 years of unbroken profitability. You'll be part of a friendly and supportive team where everyone, no matter what role, contributes ideas and drives outcomes.

What role will you play?
In this role, you will apply your expertise in big data technologies and DevOps practices to design, develop, deploy, and support data assets throughout their lifecycle. You'll establish templates, methods, and standards while managing deadlines, solving technical challenges, and improving processes. A growth mindset, passion for learning, and adaptability to innovative technologies will be essential to your success.

What You Offer
Hands-on experience building, implementing, and enhancing enterprise-scale data platforms.
Proficiency in big data, with expertise in Spark, Python, Hive, SQL, and Presto, storage formats like Parquet, and orchestration tools such as Apache Airflow.
Knowledge of cloud environments (preferably AWS), with an understanding of EC2, S3, Linux, Docker, and Kubernetes.
ETL tools: proficiency in Talend, Apache Airflow, dbt, Informatica, and AWS Glue.
Data warehousing: experience with Amazon Redshift and Athena.
Kafka engineering: experience developing and managing streaming data pipelines using Apache Kafka.

We love hearing from anyone inspired to build a better future with us. If you're excited about the role or about working at Macquarie, we encourage you to apply.

What We Offer
Benefits
At Macquarie, you're empowered to shape a career that's rewarding in all the ways that matter most to you. Macquarie employees can access a wide range of benefits which, depending on eligibility criteria, include:
1 wellbeing leave day per year
26 weeks' paid maternity leave or 20 weeks' paid parental leave for primary caregivers, along with 12 days of paid transition leave upon return to work and 6 weeks' paid leave for secondary caregivers
Company-subsidised childcare services
2 days of paid volunteer leave and donation matching
Benefits to support your physical, mental and financial wellbeing, including comprehensive medical and life insurance cover, the option to join a parental medical insurance plan, and virtual medical consultations extended to family members
Access to our Employee Assistance Program, a robust behavioural health network with counselling and coaching services
Access to a wide range of learning and development opportunities, including reimbursement for professional membership or subscription
Hybrid and flexible working arrangements, dependent on role
Reimbursement for work-from-home equipment

About Technology
Technology enables every aspect of Macquarie, for our people, our customers and our communities. We're a global team that is passionate about accelerating the digital enterprise, connecting people and data, building platforms and applications and designing tomorrow's technology solutions.

Our commitment to diversity, equity and inclusion
We are committed to fostering a diverse, equitable and inclusive workplace. We encourage people from all backgrounds to apply and welcome all identities, including race, ethnicity, cultural identity, nationality, gender (including gender identity or expression), age, sexual orientation, marital or partnership status, parental, caregiving or family status, neurodiversity, religion or belief, disability, or socio-economic background. We welcome further discussions on how you can feel included and belong at Macquarie as you progress through our recruitment process. Our aim is to provide reasonable adjustments to individuals who may need support during the recruitment process and through working arrangements. If you require additional assistance, please let us know in the application process.
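As a small illustration of the orchestration skills listed above (Apache Airflow coordinating ETL steps), here is a hedged Airflow sketch; the DAG id, schedule, and task bodies are hypothetical placeholders:

```python
# Hedged Apache Airflow sketch of a daily ETL orchestration.
# DAG id, task names, and the step logic are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull source data to staging, e.g., from S3")


def transform():
    print("run Spark/dbt transformations on the staged data")


def load():
    print("publish curated tables to the warehouse")


with DAG(
    dag_id="rmg_daily_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # extract, then transform, then load
```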

Posted 1 week ago

Apply

6.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Overview
PepsiCo Data BI & Integration Platforms is seeking a mid-level TIBCO Messaging (EMS, BusinessWorks) platform technology leader, responsible for overseeing the deployment and maintenance of on-premises and cloud infrastructure (AWS/Azure) for its North America PepsiCo Foods/Beverages business. The ideal candidate will have hands-on experience in managing and maintaining TIBCO EMS (Enterprise Message Service) and TIBCO BusinessWorks (BW) platforms, ensuring system stability, security, and optimal performance, including Infrastructure as Code (IaC), platform provisioning and administration, network design, security principles and automation.

Responsibilities
TIBCO platform administration:
Install, configure, upgrade, and maintain TIBCO EMS servers and BW environments.
Deploy and manage TIBCO applications, including BW projects and integrations.
Monitor system health and performance, identify and resolve issues, and ensure smooth operation of business processes.
Tune system parameters, optimize resource utilization, and ensure the efficient operation of applications.
Collaborate with development, QA, and other teams to resolve technical issues and ensure seamless integration of applications.
Develop scripts and automate tasks for administration and maintenance purposes.
Configure and manage adapters for seamless integration with various systems.
Develop and manage Hawk rulebases for monitoring BW engines, adapters, and log files.

Cloud Infrastructure & Automation:
Implement and support TIBCO application migration, modernization, and transformation projects, leveraging cloud-native technologies and methodologies.
Implement TIBCO cloud infrastructure policies, standards, and best practices, ensuring cloud environment adherence to security and regulatory requirements.
Design, deploy and optimize cloud-based TIBCO infrastructure using Azure/AWS services that meet the performance, availability, scalability, and reliability needs of our applications and services.
Drive troubleshooting of TIBCO cloud infrastructure issues, ensuring timely resolution and root cause analysis by partnering with the global cloud center of excellence, enterprise application teams, and PepsiCo premium cloud partners (Microsoft, AWS, TIBCO).
Establish and maintain effective communication and collaboration with internal and external stakeholders, including business leaders, developers, customers, and vendors.
Develop Infrastructure as Code (IaC) to automate provisioning and management of cloud resources.
Write and maintain scripts for automation and deployment using PowerShell, Python, or the Azure/AWS CLI.
Work with stakeholders to document architectures, configurations, and best practices.
Apply knowledge of cloud security principles around data protection, identity and access management (IAM), compliance and regulatory requirements, threat detection and prevention, and disaster recovery and business continuity.

Qualifications
Bachelor's degree in computer science.
At least 6 to 8 years of experience in IT cloud infrastructure, architecture and operations, including security, with at least 4 years in a technical leadership role.
Thorough knowledge of TIBCO EMS, BW, and related components (e.g., Adapters, Hawk).
Strong understanding of Unix/Linux operating systems, as TIBCO products often run on these platforms.
Proficiency in enterprise messaging concepts, including queues, topics, and messages.
Strong knowledge of cloud architecture, design, and deployment principles and practices, including microservices, serverless, containers, and DevOps.
Strong expertise in Azure/AWS messaging technologies, real-time data ingestion, data warehouses, serverless ETL, DevOps, Kubernetes, virtual machines, and monitoring and security tools.
Strong expertise in Azure/AWS networking and security fundamentals, including network endpoints and network security groups, firewalls, external/internal DNS, load balancers, virtual networks and subnets.
Proficient in scripting and automation tools, such as PowerShell, Python, Terraform, and Ansible.
Excellent problem-solving, analytical, and communication skills, with the ability to explain complex technical concepts to non-technical audiences.
Certifications in Azure/AWS platform administration, networking and security are preferred.
TIBCO Certified Professional certifications (e.g., TIBCO EMS Administrator) are desirable.
Strong self-organization, time management and prioritization skills.
A high level of attention to detail, excellent follow-through, and reliability.
Strong collaboration, teamwork and relationship-building skills across multiple levels and functions in the organization.
Ability to listen, establish rapport, and build credibility as a strategic partner vertically within the business unit or function, as well as with leadership and functional teams.
Strategic thinker focused on business value results that utilize technical solutions.
Strong communication skills in writing, speaking, and presenting.
Capable of working effectively in a multi-tasking environment.
Fluent in English.

Posted 1 week ago

Apply

6.0 years

0 Lacs

India

On-site

About YipitData:
YipitData is the leading market research and analytics firm for the disruptive economy and recently raised up to $475M from The Carlyle Group at a valuation over $1B. We analyze billions of alternative data points every day to provide accurate, detailed insights on ridesharing, e-commerce marketplaces, payments, and more. Our on-demand insights team uses proprietary technology to identify, license, clean, and analyze the data many of the world's largest investment funds and corporations depend on. For three years and counting, we have been recognized as one of Inc's Best Workplaces. We are a fast-growing technology company backed by The Carlyle Group and Norwest Venture Partners. Our offices are located in NYC, Austin, Miami, Denver, Mountain View, Seattle, Hong Kong, Shanghai, Beijing, Guangzhou, and Singapore. We cultivate a people-centric culture focused on mastery, ownership, and transparency.

Why You Should Apply NOW:
You'll be working with many strategic engineering leaders within the company.
You'll report directly to the Director of Data Engineering.
You will help build out our Data Engineering team presence in India.
You will work with a global team.
You'll be challenged with a lot of big data problems.

About The Role:
We are seeking a highly skilled Senior Data Engineer to join our dynamic Data Engineering team. The ideal candidate possesses 6-8 years of data engineering experience, a solid understanding of Spark and SQL, and data pipeline experience. Hired individuals will play a crucial role in building out our data engineering team to support our strategic pipelines and optimize them for reliability, efficiency, and performance. Additionally, Data Engineering serves as the gold standard for all other YipitData analyst teams, building and maintaining the core pipelines and tooling that power our products. This high-impact, high-visibility team is instrumental to the success of our rapidly growing business. This is a unique opportunity to be the first hire in this team, with the potential to build and lead the team as its responsibilities expand. This is a hybrid opportunity based in India. During training and onboarding, we will expect several hours of overlap with US working hours. Afterward, standard IST working hours are permitted, with the exception of 1-2 days per week when you will join meetings with the US team.

As Our Senior Data Engineer You Will:
Report directly to the Senior Manager of Data Engineering, who will provide significant, hands-on training on cutting-edge data tools and techniques.
Build and maintain end-to-end data pipelines.
Help set best practices for our data modeling and pipeline builds.
Create documentation, architecture diagrams, and other training materials.
Become an expert at solving complex data pipeline issues using PySpark and SQL.
Collaborate with stakeholders to incorporate business logic into our central pipelines.
Deeply learn Databricks, Spark, and other ETL tooling developed internally.

You Are Likely To Succeed If:
You hold a Bachelor's or Master's degree in Computer Science, STEM, or a related technical discipline.
You have 6+ years of experience as a Data Engineer or in other technical functions.
You are excited about solving data challenges and learning new skills.
You have a great understanding of working with data and building data pipelines.
You are comfortable working with large-scale datasets using PySpark, Delta, and Databricks.
You understand business needs and the rationale behind data transformations to ensure alignment with organizational goals and data strategy.
You are eager to constantly learn new technologies.
You are a self-starter who enjoys working collaboratively with stakeholders.
You have exceptional verbal and written communication skills.
Nice to have: experience with Airflow, dbt, Snowflake, or equivalent.

What We Offer:
Our compensation package includes comprehensive benefits, perks, and a competitive salary. We care about your personal life and we mean it: we offer vacation time, parental leave, team events, learning reimbursement, and more. Your growth at YipitData is determined by the impact that you are making, not by tenure, unnecessary facetime, or office politics. Everyone at YipitData is empowered to learn, self-improve, and master their skills in an environment focused on ownership, respect, and trust.

We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, marital status, disability, gender, gender identity or expression, or veteran status. We are proud to be an equal-opportunity employer.

Job Applicant Privacy Notice
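To illustrate the PySpark/Delta pipeline work described above, here is a minimal, hedged sketch of an idempotent upsert step; the paths, table, and join key are assumptions for the example, and it presumes a Spark session with Delta Lake configured (as on Databricks):

```python
# Hedged PySpark + Delta Lake sketch of an idempotent pipeline step.
# Paths, table layout, and the join key are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

# On Databricks, `spark` is already provided with Delta preconfigured.
spark = SparkSession.builder.appName("orders-upsert").getOrCreate()

# Extract: the day's landing-zone batch.
updates = spark.read.parquet("/landing/orders/2025-08-01")

# Load: merge into the curated Delta table so reruns are safe (no duplicates).
target = DeltaTable.forPath(spark, "/curated/orders")
(target.alias("t")
 .merge(updates.alias("u"), "t.order_id = u.order_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```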

Posted 1 week ago

Apply

0 years

0 Lacs

Vishakhapatnam, Andhra Pradesh, India

On-site

Company Description
RASAPOORNA FOODS PRIVATE LIMITED is a company based in Visakhapatnam, Andhra Pradesh, India. We specialize in providing high-quality food products and services to our clients. Our company operates from its headquarters at HARI PRIYA HEAVEN, KRM COLONY, Seethammadhara in Vishakhapatnam. We are dedicated to maintaining a high standard of excellence in our offerings and operations.

Role Description
We are seeking a full-time Power BI Consultant to join our team in Vishakhapatnam. This on-site role involves designing, developing, and maintaining Power BI dashboards and reports. You will be responsible for data modeling, creating ETL processes, and supporting data warehousing initiatives. The role includes analyzing business requirements, creating data visualizations, and providing insights to support decision-making.

Qualifications
Strong analytical skills
Experience with Extract, Transform, Load (ETL) processes
Proficiency in creating dashboards using Power BI
Expertise in data modeling and data warehousing
Excellent problem-solving and communication skills
Ability to work independently and collaborate with cross-functional teams
Bachelor's degree in Computer Science, Information Technology, or a related field
Experience in the food industry is a plus

Posted 1 week ago

Apply

3.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About SpurQLabs:
SpurQLabs is an independent software testing and test automation company with a mission to help our clients build exceptional quality products at speed. We specialize in test automation, performance testing, API testing, and CI/CD enablement across industries including life sciences, pharmaceuticals, and regulated environments.

Job Summary:
We are seeking a detail-oriented ETL Test Engineer with strong Python, SQL, and AWS skills to validate data pipelines in compliance with GxP and FDA regulations. This role involves test planning, test execution, automated validation, and ensuring high-quality, audit-ready data workflows.

Key Responsibilities:
Develop and execute test plans, test cases, and test scripts to validate ETL processes, data migrations, and transformations according to GxP and industry standards.
Conduct functional, integration, and regression testing across various data sources and targets to ensure accurate extraction, transformation, and loading.
Collaborate with data engineers, business analysts, and stakeholders to understand data mappings, business logic, and compliance needs.
Build and maintain automated ETL test suites using Python and testing frameworks for continuous validation of data pipelines.
Perform data profiling and quality assessments, identify discrepancies, and work with stakeholders to resolve integrity issues.
Document and report test outcomes, validation findings, and defects using defined templates and issue-tracking tools.
Participate in validation planning, execution, and documentation aligned with regulatory guidelines, GxP, FDA, and company SOPs.
Ensure traceability, auditability, and data integrity across all validation activities.
Stay current on industry trends, compliance updates, and best practices in ETL testing and data validation.
Contribute to process improvement and knowledge sharing within the team.

Technical Skills:
Mandatory:
Python: for automation of ETL validation
SQL: strong skills for data querying and validation
AWS cloud services: especially S3 and Databricks
Snowflake: hands-on experience with a cloud data warehouse

Nice to Have:
Experience with automated ETL testing frameworks
Familiarity with data compliance frameworks (GxP, FDA, Part 11)
Exposure to validation documentation tools and issue-tracking systems

Qualifications:
Bachelor's degree in Computer Science, Engineering, Life Sciences, or a related field
3 to 6 years of hands-on experience in ETL testing in a regulated or data-intensive environment
Experience in GxP-compliant environments is strongly preferred
Strong communication, analytical, and problem-solving skills
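As an illustration of the automated ETL validation this role centers on, here is a hedged pytest sketch; it uses an in-memory SQLite database as a stand-in for the real warehouse, and the tables, counts, and quality rules are hypothetical (in a real GxP setting each check would be tied to a documented requirement ID):

```python
# Hedged sketch of automated ETL validation with pytest.
# Connection details, table names, and rules are hypothetical.
import sqlite3

import pytest


@pytest.fixture
def warehouse():
    """Stand-in warehouse connection (the real role targets Snowflake/Databricks)."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE target_orders (order_id TEXT PRIMARY KEY, amount REAL)")
    con.executemany("INSERT INTO target_orders VALUES (?, ?)",
                    [("A1", 10.0), ("A2", 25.5)])
    yield con
    con.close()


def test_row_count_matches_source(warehouse):
    """Reconciliation: loaded row count must equal the source extract count."""
    source_count = 2  # would come from the source system in practice
    target_count = warehouse.execute("SELECT COUNT(*) FROM target_orders").fetchone()[0]
    assert target_count == source_count


def test_no_negative_amounts(warehouse):
    """Data-quality rule: the transformation must never produce negative amounts."""
    bad = warehouse.execute(
        "SELECT COUNT(*) FROM target_orders WHERE amount < 0").fetchone()[0]
    assert bad == 0
```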

Posted 1 week ago

Apply

8.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

You are as unique as your background, experience and point of view. Here, you'll be encouraged, empowered and challenged to be your best self. You'll work with dynamic colleagues - experts in their fields - who are eager to share their knowledge with you. Your leaders will inspire and help you reach your potential and soar to new heights. Every day, you'll have new and exciting opportunities to make life brighter for our Clients - who are at the heart of everything we do. Discover how you can make a difference in the lives of individuals, families and communities around the world.

Job Description: Sun Life

Job Description Of The Data Modeler Role
The Data Modeler will work on the design and implementation of new data structures to support the project teams delivering on ETL, data warehouse design, management of the enterprise data model, maintenance of the data, and enterprise data integration approaches.

Technical Responsibilities
Build and maintain data models to report disparate data sets in a reliable, consistent and interpretable manner.
Gather, distil and harmonize data requirements, and design coherent conceptual, logical and physical data models and associated physical feed formats to support these data flows.
Articulate business requirements and build source-to-target mappings with complex ETL transformations.
Write complex SQL statements and profile source data to validate data transformations.
Contribute to requirement analysis and database design, covering both transactional and dimensional data modelling.
Normalize/de-normalize data structures, and introduce hierarchies and inheritance wherever required in existing or new data models.
Develop and implement data warehouse projects independently.
Work with data consumers and data suppliers to understand detailed requirements and to propose standardized data models.
Contribute to improving the Data Management data models.
Be an influencer: present and facilitate discussions to understand business requirements, and develop dimensional data models based on these capabilities and industry best practices.

Requirements
Extensive practical experience in Information Technology and software development projects, with at least 8 years of experience in designing operational data stores and data warehouses.
Extensive experience with data modelling tools such as Erwin or SAP PowerDesigner.
Strong understanding of ETL and data warehouse concepts, processes and best practices.
Proficient in data modelling, including conceptual, logical and physical data modelling for both OLTP and OLAP.
Ability to write complex SQL for data transformations and data profiling in source and target systems.
Basic understanding of SQL vs NoSQL databases.
A combination of solid business knowledge and technical expertise, with strong communication skills.
Excellent analytical and logical thinking.
Good verbal and written communication skills, and the ability to work independently as well as in a team environment, providing structure in ambiguous situations.

Good to have
Understanding of the insurance domain.
Basic understanding of AWS cloud.
Good understanding of Master Data Management, Data Quality and Data Governance.
Basic understanding of data visualization tools like SAS VA and Tableau.
Good understanding of implementing and architecting data solutions using Informatica and SQL Server/Oracle.

Job Category: Advanced Analytics
Posting End Date: 16/09/2025
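To make the source-data profiling step concrete, here is a small, hedged Python sketch computing null rates and distinct counts per column, the kind of check used to sanity-test a source-to-target mapping before modeling; the table and columns are invented for the example:

```python
# Hedged sketch of source-data profiling: null rates and distinct counts,
# used to spot primary-key candidates and optional attributes. All names
# and sample data are hypothetical.
import sqlite3

import pandas as pd

# Stand-in source system with a tiny sample table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE policy (policy_id TEXT, holder TEXT, premium REAL)")
con.executemany("INSERT INTO policy VALUES (?, ?, ?)",
                [("P1", "A. Rao", 1200.0),
                 ("P2", "B. Shah", None),    # missing premium
                 ("P3", None, 950.0)])       # missing holder

source = pd.read_sql("SELECT * FROM policy", con)

profile = pd.DataFrame({
    "null_rate": source.isna().mean(),
    "distinct_values": source.nunique(),
    "dtype": source.dtypes.astype(str),
})
print(profile.sort_values("null_rate", ascending=False))

# Columns with zero nulls and full uniqueness are primary-key candidates;
# high-null columns may need to be optional attributes in the logical model.
```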

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Hello! You've landed on this page, which means you're interested in working with us. Let's take a sneak peek at what it's like to work at Innovaccer.

Growth Strategy Team at Innovaccer
Innovaccer is forming a new strategic advisory team that will support healthcare organizations to better understand their opportunities and levers for maximizing outcomes, particularly in, but not limited to, value-based care arrangements and population health initiatives. This role requires a "full stack" approach to analytics, covering all parts of the analytics value chain, including data ETL and manipulation, analysis, reporting, visualizations, insights, and final deliverable creation. The ideal candidate will possess a player/coach mentality as this team matures, with the willingness and ability to roll up their sleeves and contribute in the early days and transition to growing in responsibility as we scale. This candidate will be comfortable diving into both structured and unstructured data, creating robust financial models and business cases, producing compelling visualizations and collateral, and leading the narrative on data-driven storytelling.

About The Role
We are looking for a Senior Manager - Advisory Services, a key role within the Advisory Services team at Innovaccer. This individual will be responsible for delivering key customer analytics (e.g., ROI models), performance analytics and slide presentations to support multiple client pursuits and engagements. The ideal candidate has a strong desire to learn about the US healthcare system, is organized and structured, has excellent written and verbal communication skills, and is a fast learner. The role requires both analytical skills and creativity to articulate and communicate complex messages about healthcare and technology to a wide-ranging audience. You will be aligned with a Managing Director/Director in the US who will provide you direction on day-to-day work and help you learn about the company and the industry.

A Day in the Life
Under the direction of Advisory Services leaders, engage with prospect organizations on intended business outcomes and request data assets to model potential scenarios.
Own, digest, and interpret data in a variety of forms, from aggregated metrics in spreadsheets to unstructured formats to raw, transactional forms like medical claims.
Own and execute the entire analytics lifecycle, leveraging data in all its available forms to produce cogent and compelling business cases, financial models, presentations, and other executive-ready final deliverables.
Synthesize insights to inform strategic direction, roadmap creation, and opportunities.
Couple Innovaccer's technology platform, including data, software and workflow applications, analytics, and AI, with identified insights and opportunities to create prescriptive recommendations that maximize value creation and outcomes.
Develop findings and insights for senior leadership of prospects and clients and Innovaccer stakeholders in a clear and compelling manner.
Stay up to date with the latest analytics technologies and methodologies to enhance capabilities.
Build compelling presentations, including client sales and engagement delivery decks, case studies, talk tracks, and visuals.
Research and analyze high-priority strategic clients, industry best practices and market intelligence, including industry mapping, customer profiling, competitive insights and deep dives into select solution opportunities.
Co-develop and maintain a standardized value-lever framework, segment-based pitch decks and customer case studies for use across multiple advisory pursuits and engagements.
Provide analytics thought partnership and data support on the design, execution, and measurement of impactful advisory services strategy initiatives.
Collaborate across Advisory Services, Growth Strategy, Marketing, Sales, Product, and Customer Success teams and business leaders to address business questions that can be answered effectively through data-driven modeling and insights.
Develop slide presentations for quarterly and annual reporting presentations.
Structure, manage, and write responses to RFPs.

What You Need
Degree from a Tier 1 college, with a relevant degree in Finance, Economics, Statistics, Business, or Marketing.
3-5 years of professional experience, including experience in management consulting and/or go-to-market in a technology/software/SaaS company.
Strong technical aptitude and fantastic storytelling skills, with a great track record of working across sales, marketing, and technology teams.
Ability to identify, source, and include data elements to drive analytical models and outputs.
Experience creating Excel models (identifying inputs, key considerations/variables, and relevant outputs) and PowerPoint presentations.
Familiarity with leveraging AI tools (e.g., generative AI, AI-enhanced research tools, AI-based data analysis platforms) to enhance productivity, accelerate research, generate insights, and support creative problem-solving.
Proactive, decisive, independent thinker, good at problem solving and conducting industry research.
Experience making slide presentations for internal and external audiences that articulate key takeaways.
Creative problem solver with the ability to back up ideas with requisite fact-based arguments.
Comfortable working with multiple data sources in both structured and unstructured formats to frame a business opportunity and develop a structured path forward.
Strong proficiency in Excel and PowerPoint or G-Suite.
Willing to work in a fast-paced environment under tight deadlines.
Strong written and verbal communication skills, as well as the ability to manage cross-functional stakeholders.
Experience with analytics and financial modeling.
US healthcare experience and/or a strong willingness and interest to learn this space. Specific areas of interest include:
Understanding of payer/provider/patient dynamics
Provider data strategy and architecture
Provider advanced analytics, AI, NLP
Patient experience and engagement
Population health and care management
Utilization and cost management
Risk and quality management
Risk models
Value-based care
Social determinants of health

We offer competitive benefits to set you up for success in and outside of work.

Here's What We Offer
Generous Leave Benefits: Enjoy generous leave benefits of up to 40 days.
Parental Leave: Experience one of the industry's best parental leave policies to spend time with your new addition.
Sabbatical Leave Policy: Want to focus on skill development, pursue an academic career, or just take a break? We've got you covered.
Health Insurance: We offer health benefits and insurance to you and your family for medically related expenses related to illness, disease, or injury.
Pet-Friendly Office*: Spend more time with your treasured friends, even when you're away from home. Bring your furry friends with you to the office and let your colleagues become their friends, too. *Noida office only
Creche Facility for Children*: Say goodbye to worries and hello to a convenient and reliable creche facility that puts your child's well-being first. *India offices

Where And How We Work
Our Noida office is situated in a posh tech space, equipped with various amenities to support our work environment. Here, we follow a five-day work schedule, allowing us to efficiently carry out our tasks and collaborate effectively within our team.

Innovaccer is an equal-opportunity employer. We celebrate diversity, and we are committed to fostering an inclusive and diverse workplace where all employees, regardless of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, or veteran status, feel valued and empowered.

Disclaimer: Innovaccer does not charge fees or require payment from individuals or agencies for securing employment with us. We do not guarantee job spots or engage in any financial transactions related to employment. If you encounter any posts or requests asking for payment or personal information, we strongly advise you to report them immediately to our HR department at px@innovaccer.com. Additionally, please exercise caution and verify the authenticity of any requests before disclosing personal and confidential information, including bank account details.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Business Intelligence - Manager
Location: Mumbai

About Us
StayVista is India's largest villa hospitality brand and has redefined group getaways. Our handpicked luxury villas are present in every famous holiday destination across the country. We curate unique experiences paired with top-notch hospitality, creating unforgettable stays. Here, you will be a part of our passionate team, dedicated to crafting exceptional getaways and curating one-of-a-kind homes. We are a close-knit tribe, united by a shared love for travel and on a mission to become the most loved hospitality brand in India.

Why Work With Us?
At StayVista, you're part of a community where your ideas and growth matter. We're a fast-growing team that values continuous improvement. With our skill upgrade programs, you'll keep learning and evolving, just like we do. And hey, when you're ready for a break, our villa discounts make it easy to enjoy the luxury you help create.

Your Role
As a Manager - Business Intelligence, you will lead data-driven decision-making by transforming complex datasets into strategic insights. You will optimize data pipelines, automate workflows, and integrate AI-powered solutions to enhance efficiency. Your expertise in database management, statistical analysis, and visualization will support business growth, while collaboration with leadership and cross-functional teams will drive impactful analytics strategies.

About You
8+ years of experience in Business Intelligence, Revenue Management, or Data Analytics, with a strong ability to turn data into actionable insights.
Bachelor's or Master's degree in Business Analytics, Data Science, Computer Science, or a related field.
Skilled in designing, developing, and implementing end-to-end BI solutions to improve decision-making.
Proficient in ETL processes using SQL, Python, and R, ensuring accurate and efficient data handling.
Experienced in Google Looker Studio, Apache Superset, Power BI, and Tableau to create clear, real-time dashboards and reports.
Develop, document and support ETL mappings, database structures and BI reports.
Develop ETL using tools such as Pentaho or Talend, or as per project requirements.
Participate in the UAT process and ensure quick resolution of any UAT or data issue.
Manage different environments and be responsible for proper deployment of reports and ETLs in all client environments.
Interact with Business and Product teams to understand and finalize the functional requirements.
Responsible for timely deliverables and quality.
Skilled at analyzing industry trends and competitor data to develop effective pricing and revenue strategies.
Demonstrated understanding of data warehouse concepts, ETL concepts, ETL loading strategy, data archiving, data reconciliation, ETL error handling, error-logging mechanisms, standards and best practices.

Cross-functional Collaboration
Partner with Product, Marketing, Finance, and Operations to translate business requirements into analytical solutions.

Key Metrics: what you will drive and achieve
Data-driven decision-making and business impact.
Revenue growth and cost optimization.
Cross-functional collaboration and leadership impact.
BI and analytics efficiency, and AI automation integration.
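As an illustration of the ETL error-handling and error-logging practice listed above, here is a minimal, hedged Python sketch that quarantines bad records to an error table instead of failing the whole load; all names and the sample data are hypothetical:

```python
# Hedged sketch of ETL error handling: bad rows are logged and quarantined
# to an error table with context, so the load completes and the failures
# remain auditable. All names and sample data are hypothetical.
import logging
import sqlite3
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE bookings (booking_id TEXT, nights INTEGER)")
con.execute("CREATE TABLE etl_errors (run_at TEXT, record TEXT, error TEXT)")

raw_records = [{"booking_id": "B1", "nights": "3"},
               {"booking_id": "B2", "nights": "three"}]  # malformed row

for rec in raw_records:
    try:
        con.execute("INSERT INTO bookings VALUES (?, ?)",
                    (rec["booking_id"], int(rec["nights"])))
    except (ValueError, KeyError) as exc:
        log.warning("quarantining record %s: %s", rec, exc)
        con.execute("INSERT INTO etl_errors VALUES (?, ?, ?)",
                    (datetime.now(timezone.utc).isoformat(), str(rec), str(exc)))

con.commit()
print(con.execute("SELECT COUNT(*) FROM etl_errors").fetchone()[0], "row(s) quarantined")
```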

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Experience: 7-9 years
Experience with AWS services is a must, including S3, Lambda, Airflow, Glue, Athena, Lake Formation, Step Functions, etc.
Experience programming in Java and Python.
Experience performing data analysis (NOT data science) on AWS platforms.

Nice To Have
Experience with Big Data technologies (Teradata, Snowflake, Spark, Redshift, Kafka, etc.).
Experience with data management processes on AWS is a huge plus.
Experience implementing complex ETL transformations on AWS using Glue.
Familiarity with relational database environments (Oracle, Teradata, etc.), leveraging databases, tables/views, stored procedures, agent jobs, etc.
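To illustrate the Glue-based ETL item above, here is a hedged skeleton of an AWS Glue PySpark job; the catalog database, table, and S3 paths are hypothetical, and the awsglue module is only available inside the Glue runtime:

```python
# Hedged skeleton of an AWS Glue ETL job. Database, table, and S3 path
# names are hypothetical; this only runs inside the AWS Glue runtime.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read a table registered in the Glue Data Catalog.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders")

# Transform: drop malformed rows and rename a column.
clean = (dyf.filter(lambda r: r["order_id"] is not None)
            .rename_field("order_total", "amount"))

# Load: write back to S3 as partitioned Parquet.
glue_context.write_dynamic_frame.from_options(
    frame=clean,
    connection_type="s3",
    connection_options={"path": "s3://bucket/curated/orders/",
                        "partitionKeys": ["order_date"]},
    format="parquet",
)
job.commit()
```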

Posted 1 week ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Where Data Does More. Join the Snowflake team. We are looking for people who have a strong background in data science and cloud architecture to join our AI/ML Workload Services team to create exciting new offerings and capabilities for our customers! This team within the Professional Services group will be working with customers using Snowflake to expand their use of the Data Cloud to bring data science pipelines from ideation to deployment, and beyond using Snowflake's features and its extensive partner ecosystem. The role will be highly technical and hands-on, where you will be designing solutions based on requirements and coordinating with customer teams, and where needed Systems Integrators. AS A SOLUTIONS ARCHITECT - AI/ML AT SNOWFLAKE, YOU WILL: Be a technical expert on all aspects of Snowflake in relation to the AI/ML workload Build, deploy and ML pipelines using Snowflake features and/or Snowflake ecosystem partner tools based on customer requirements Work hands-on where needed using SQL, Python, Java and/or Scala to build POCs that demonstrate implementation techniques and best practices on Snowflake technology within the Data Science workload Follow best practices, including ensuring knowledge transfer so that customers are properly enabled and are able to extend the capabilities of Snowflake on their own Maintain deep understanding of competitive and complementary technologies and vendors within the AI/ML space, and how to position Snowflake in relation to them Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments Provide guidance on how to resolve customer-specific technical challenges Support other members of the Professional Services team develop their expertise Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing OUR IDEAL SOLUTION ARCHITECT - AI/ML WILL HAVE: Minimum 10 years experience working with customers in a pre-sales or post-sales technical role Skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos Thorough understanding of the complete Data Science life-cycle including feature engineering, model development, model deployment and model management. Strong understanding of MLOps, coupled with technologies and methodologies for deploying and monitoring models Experience and understanding of at least one public cloud platform (AWS, Azure or GCP) Experience with at least one Data Science tool such as AWS Sagemaker, AzureML, Dataiku, Datarobot, H2O, and Jupyter Notebooks Hands-on scripting experience with SQL and at least one of the following; Python, Java or Scala. Experience with libraries such as Pandas, PyTorch, TensorFlow, SciKit-Learn or similar University degree in computer science, engineering, mathematics or related fields, or equivalent experience BONUS POINTS FOR HAVING: Experience with Databricks/Apache Spark Experience implementing data pipelines using ETL tools Experience working in a Data Science role Proven success at enterprise software Vertical expertise in a core vertical such as FSI, Retail, Manufacturing etc Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact? 
For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
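For a flavor of the hands-on side of such a role, here is a minimal sketch of pulling a Snowflake table into pandas and fitting a baseline scikit-learn model, using the snowflake-connector-python package. The account credentials, table, and column names are hypothetical placeholders, not anything from the posting.

import snowflake.connector
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Connect to Snowflake; every credential below is a placeholder.
conn = snowflake.connector.connect(
    user="ANALYST", password="***", account="xy12345",
    warehouse="ML_WH", database="DEMO", schema="PUBLIC",
)

# Pull a hypothetical feature table into pandas. fetch_pandas_all
# requires the connector's pandas extras (pyarrow installed).
cur = conn.cursor()
cur.execute("SELECT age, tenure_months, churned FROM customer_features")
df = cur.fetch_pandas_all()
conn.close()

# Unquoted Snowflake identifiers come back uppercase in the DataFrame.
X, y = df[["AGE", "TENURE_MONTHS"]], df["CHURNED"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))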

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Syniverse is the world’s most connected company. Whether we’re developing the technology that enables intelligent cars to safely react to traffic changes or freeing travelers to explore by keeping their devices online wherever they go, we believe in leading the world forward. That is why we work with some of the world’s most recognized brands: eight of the top 10 banks, four of the top 5 global technology companies, and over 900 communications providers. It is also how we’re able to provide our incredible talent with an innovative culture and great benefits.

Who We're Looking For
The Data Engineer I is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems or building new solutions from the ground up. This role will work with developers, architects, product managers, and data analysts on data initiatives and ensure optimal data delivery with good performance and uptime metrics.

Some Of What You'll Do
Scope of the Role:
Direct Reports: This is an individual contributor role with no direct reports.

Key Responsibilities:
Create, enhance, and maintain optimal data pipeline architecture and implementations.
Analyze data sets to meet functional / non-functional business requirements.
Identify, design, and implement data process improvements: automating manual processes, optimizing data delivery, etc.
Build infrastructure and tools to increase data ETL velocity.
Work with data and analytics experts to implement and enhance analytic product features.
Provide life cycle support to the Operations team for existing products, services, and functionality assigned to the Data Engineering team.

Experience, Education, And Certifications:
Bachelor’s degree in Computer Science, Statistics, Informatics or a related field, or equivalent work experience.
Software development experience desired.
Experience in data engineering is desired.
Experience in building and optimizing big data pipelines, architectures, and data sets:
Experience with big data tools: Hadoop, Spark, Kafka, etc.
Experience with relational SQL databases, such as PostgreSQL, MySQL, etc.
Experience with stream-processing systems: Flink, KSQL, Spark Streaming, etc.
Experience with programming languages, such as Java, Scala, Python, etc.
Experience with cloud data engineering and development, such as AWS, etc.

Additional Requirements:
Familiarity with Agile software design processes and methodologies.
Good analytic skills related to working with structured and unstructured datasets.
Knowledge of message queuing, stream processing, and scalable big data stores.
Ownership/accountability for tasks/projects with on-time, quality deliveries.
Good verbal and written communication skills.
Teamwork with independent design and development habits.
Work with a sense of urgency and a positive attitude.

Why You Should Join Us
Join us as we write a new chapter, guided by world-class leadership. Come be a part of an exciting and growing organization where we offer competitive total compensation, flexible/remote work, and a leadership team committed to fostering an inclusive, collaborative, and transparent organizational culture.

At Syniverse connectedness is at the core of our business. We believe diversity, equity, and inclusion among our employees is crucial to our success as a global company as we seek to recruit, develop, and retain the most talented people who want to help us connect the world.

Know someone at Syniverse?
Be sure to have them submit you as a referral prior to applying for this position.
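As a hedged illustration of the stream-processing experience this posting names, here is a minimal Kafka consumer sketch using the kafka-python package; the broker address, topic name, and event fields are hypothetical.

from kafka import KafkaConsumer
import json

# Subscribe to a hypothetical events topic; the broker is a placeholder.
consumer = KafkaConsumer(
    "cdr-events",
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
)

# Consume records and apply a trivial transformation -- the kind of step
# a streaming pipeline chains before loading into a data store.
for record in consumer:
    event = record.value
    event["duration_min"] = event.get("duration_sec", 0) / 60
    print(record.topic, record.partition, record.offset, event)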

Posted 1 week ago

Apply

2.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Proxsoft Technologies LLC is a US-registered tech consultancy delivering cutting-edge solutions in Power BI, Power Platform, AI automation, and custom ERP reporting. We specialize in serving construction, infrastructure, and enterprise clients with deep expertise in systems like Viewpoint Vista, Spectrum, Procore, Acumatica, and Microsoft Dynamics.

Role Overview:
We are looking for a talented Power BI Developer to join our fast-growing team. You'll work closely with data engineers, ERP analysts, and business users to build interactive, insightful, and scalable dashboards that drive decisions for Fortune 500 clients and fast-scaling businesses.

Key Responsibilities:
Design, develop, and deploy Power BI dashboards, paginated reports, and embedded analytics using best practices.
Connect, model, and transform data from SQL Server, Excel, SharePoint, and cloud data sources.
Collaborate with clients to gather business requirements and translate them into visualizations.
Build optimized DAX measures, KPIs, bookmarks, drill-throughs, and dynamic visuals.
Work on data modeling, relationship architecture, and performance tuning.
Integrate Power BI with Power Automate workflows and Power Apps where needed.
Document technical requirements, data dictionaries, and end-user guides.

Required Skills:
Strong data modeling (star schema, snowflake), ETL, and relational data concepts.
2+ years of hands-on experience with Power BI Desktop, Power BI Service, and DAX.
Proficiency in T-SQL, views, stored procedures, and performance optimization.
Experience working with ERP datasets (Viewpoint, Acumatica, Procore, etc.) is a huge plus.
Understanding of row-level security (RLS) and workspace governance.
Exposure to Power Automate, Power Apps, or SSRS / paginated reports is a bonus.

Nice to Have:
Familiarity with Azure Synapse, Dataflows, Power Query (M).
Knowledge of embedding Power BI in web apps or portals.
Microsoft certification in DA-100 / PL-300.
Experience with construction / engineering clients or financial dashboards.

What We Offer:
Exposure to real enterprise-grade datasets and ERP integrations.
Flexible work hours (client projects follow US time zones).
Opportunity to work on cutting-edge projects using Power Platform + AI.
Rapid career growth with direct mentorship from senior architects and the CTO.
Paid tools, learning access, and certifications.
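To make the star-schema modeling requirement concrete, here is a minimal pandas sketch that splits a flat extract into a dimension table and a fact table; the column names and data are invented for illustration, not drawn from any ERP named above.

import pandas as pd

# A flat, hypothetical ERP extract mixing measures and descriptive attributes.
flat = pd.DataFrame({
    "order_id": [1, 2, 3],
    "customer": ["Acme", "Beta", "Acme"],
    "region": ["South", "North", "South"],
    "amount": [120.0, 80.0, 200.0],
})

# Dimension table: one row per distinct customer, with a surrogate key.
dim_customer = (
    flat[["customer", "region"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .rename_axis("customer_key")
    .reset_index()
)

# Fact table: measures plus a foreign key into the dimension.
fact_sales = flat.merge(dim_customer, on=["customer", "region"])[
    ["order_id", "customer_key", "amount"]
]

print(dim_customer)
print(fact_sales)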

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-Have Skills: Microsoft Azure Data Services
Good-to-Have Skills: NA
Minimum Experience Required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process.

Roles & Responsibilities:
Expected to be an SME.
Collaborate with and manage the team to perform.
Responsible for team decisions.
Engage with multiple teams and contribute to key decisions.
Provide solutions to problems for their immediate team and across multiple teams.
Facilitate training and development opportunities for team members to enhance their skills.
Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
Must-have: Proficiency in Microsoft Azure Data Services.
Strong understanding of cloud computing principles and architecture.
Experience with application lifecycle management and deployment strategies.
Familiarity with data integration and ETL processes.
Knowledge of security best practices in cloud environments.

Additional Information:
The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services.
This position is based at our Pune office.
15 years of full-time education is required.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-Have Skills: Microsoft Azure Data Services
Good-to-Have Skills: NA
Minimum Experience Required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that project goals are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also be responsible for maintaining communication with stakeholders to provide updates and gather feedback, ensuring that the applications meet the required specifications and quality standards. Your role will be pivotal in driving the success of the projects you oversee, fostering a collaborative environment, and mentoring team members to enhance their skills and performance.

Roles & Responsibilities:
Expected to be an SME.
Collaborate with and manage the team to perform.
Responsible for team decisions.
Engage with multiple teams and contribute to key decisions.
Provide solutions to problems for their immediate team and across multiple teams.
Facilitate training sessions to enhance team capabilities and knowledge sharing.
Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
Must-have: Proficiency in Microsoft Azure Data Services.
Strong understanding of cloud computing principles and architecture.
Experience with data integration and ETL processes.
Familiarity with application development frameworks and methodologies.
Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services.
This position is based in Pune.
15 years of full-time education is required.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-Have Skills: Microsoft SQL Server, Firewall, EPO
Good-to-Have Skills: NA
Minimum Experience Required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to team members to foster a productive work environment. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that the applications developed meet the needs of stakeholders effectively.

Roles & Responsibilities:
Expected to be an SME.
Collaborate with and manage the team to perform.
Responsible for team decisions.
Engage with multiple teams and contribute to key decisions.
Provide solutions to problems for their immediate team and across multiple teams.
Facilitate knowledge-sharing sessions to enhance team capabilities.
Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
Must-have: Proficiency in Microsoft SQL Server.
Strong understanding of database design and management.
Experience with performance tuning and optimization of SQL queries.
Familiarity with data integration and ETL processes.
Ability to troubleshoot and resolve database-related issues.

Additional Information:
The candidate should have a minimum of 5 years of experience in Microsoft SQL Server.
This position is based in Pune.
15 years of full-time education is required.
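As a small, hedged illustration of the query-tuning skill this posting asks for, the sketch below compares a query plan before and after adding an index. It uses Python's built-in sqlite3 module rather than SQL Server so it stays self-contained; the table, column, and index names are made up.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(10_000)],
)

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

# Without an index the planner falls back to a full table scan.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# Indexing the filter column lets the planner use an index seek instead.
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())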

Posted 1 week ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

🔍 Job Title: Treasure Data Engineer (6+ Years Experience)
📍 Location: PAN India (Hybrid as per project needs)
🕒 Experience Required: 6+ Years

📢 We’re Hiring!
We are looking for an experienced Treasure Data Engineer with 6+ years of relevant experience in building, maintaining, and optimizing large-scale customer data platforms (CDPs) using Treasure Data. This is an exciting opportunity to work with a leading organization on cutting-edge data engineering and marketing tech solutions.

✅ Required Skills:
6+ years of experience in data engineering, with at least 2+ years of hands-on experience with Treasure Data/CDPs
Strong knowledge of SQL, Python/JavaScript, and data integration best practices
Experience working with Treasure Workflow, Data Connectors, Segmentations, and Audience Building
Experience integrating data from various sources like Salesforce, Google Analytics, Adobe, etc.
Knowledge of ETL pipelines, data quality, and customer data activation
Familiarity with cloud platforms (AWS/GCP) and marketing automation tools is a plus

🎯 Responsibilities:
Design and implement workflows, pipelines, and data transformations in Treasure Data
Collaborate with cross-functional teams to integrate customer data sources
Optimize performance of existing workflows and queries
Support end-users in audience building, data analysis, and campaign execution
Ensure data accuracy, security, and compliance
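For context on the day-to-day of such a role, here is a minimal sketch of running a segmentation-style query from Python, assuming the pytd client library for Treasure Data; the API key, database, table, and column names are placeholders.

import pytd

# Connect to Treasure Data via pytd; every credential below is a placeholder.
client = pytd.Client(
    apikey="YOUR_TD_API_KEY",
    endpoint="https://api.treasuredata.com/",
    database="customer_cdp",
    default_engine="presto",
)

# A simple aggregation over a hypothetical events table -- the kind of
# query that feeds audience building.
result = client.query(
    "SELECT channel, COUNT(1) AS events FROM web_events GROUP BY channel"
)

# pytd returns a dict with 'columns' and 'data' keys.
for row in result["data"]:
    print(dict(zip(result["columns"], row)))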

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Work Level: Individual
Core: Responsible
Leadership: Team Alignment
Industry Type: Information Technology
Function: Database Administrator
Key Skills: mSQL, SQL Writing, PLSQL
Education: Graduate

Note: This is a requirement for one of Workassist's hiring partners.

Primary Responsibility:
Collect, clean, and analyze data from various sources.
Assist in creating dashboards, reports, and visualizations.

We are looking for a SQL Developer Intern to join our team remotely. As an intern, you will work with our database team to design, optimize, and maintain databases while gaining hands-on experience in SQL development. This is a great opportunity for someone eager to build a strong foundation in database management and data analysis.

Responsibilities:
Write, optimize, and maintain SQL queries, stored procedures, and functions.
Assist in designing and managing relational databases.
Perform data extraction, transformation, and loading (ETL) tasks.
Ensure database integrity, security, and performance.
Work with developers to integrate databases into applications.
Support data analysis and reporting by writing complex queries.
Document database structures, processes, and best practices.

Requirements:
Currently pursuing or recently completed a degree in Computer Science, Information Technology, or a related field.
Strong understanding of SQL and relational database concepts.
Experience with databases such as MySQL, PostgreSQL, SQL Server, or Oracle.
Ability to write efficient and optimized SQL queries.
Basic knowledge of indexing, stored procedures, and triggers.
Understanding of database normalization and design principles.
Good analytical and problem-solving skills.
Ability to work independently and in a team in a remote setting.

Company Description:
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers.

For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities on the portal; depending on your skills, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
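To ground the ETL responsibility above, here is a minimal, self-contained extract-transform-load sketch using Python's built-in sqlite3 module; the tables and records are made up for illustration.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_users (name TEXT, signup_date TEXT);
    CREATE TABLE clean_users (name TEXT NOT NULL, signup_date TEXT NOT NULL);
    INSERT INTO raw_users VALUES
        ('  Alice ', '2024-01-05'), (NULL, '2024-02-10'), ('Bob', NULL);
""")

# Extract raw rows, transform them (trim whitespace, drop incomplete
# records), and load the cleaned rows with a parameterized insert.
rows = conn.execute("SELECT name, signup_date FROM raw_users").fetchall()
cleaned = [(name.strip(), date) for name, date in rows if name and date]
conn.executemany("INSERT INTO clean_users VALUES (?, ?)", cleaned)

print(conn.execute("SELECT * FROM clean_users").fetchall())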

Posted 1 week ago

Apply

0.0 - 5.0 years

0 - 0 Lacs

Chennai, Tamil Nadu

On-site

Job Description – ODI Developer
Location: Equitas Office, Backside Vikatan Office, 757, Vasan Ave, Anna Salai, Thousand Lights, Chennai, Tamil Nadu 600002
Job Type: Full-Time
Experience: 5+ years

Job Summary:
We are hiring a Lead Data Engineer to architect and lead enterprise data integration initiatives. This role requires deep technical expertise in data engineering and leadership experience. Experience with Oracle Data Integrator (ODI) is required, especially in environments using the Oracle stack.

Key Responsibilities:
Architect and oversee the implementation of scalable, reliable data pipelines.
Define standards and best practices for data integration and ETL development.
Lead a team of data engineers and mentor junior staff.
Collaborate with stakeholders to understand business data needs and translate them into technical solutions.
Ensure adherence to data governance, security, and compliance requirements.

Requirements:
5+ years of experience in data engineering, including team leadership roles.
Deep knowledge of ETL architecture and data integration frameworks.
Experience with any ETL tool (ODI is mandatory).
Strong SQL, data modeling, and performance tuning skills.
Experience with cloud data platforms and modern data architectures.
Excellent leadership, communication, and stakeholder management skills.
Knowledge of real-time or near-real-time data streaming (e.g., Kafka).

Job Type: Full-time
Pay: ₹12,817.62 - ₹60,073.88 per month
Benefits:
Health insurance
Provident Fund
Experience: 5 years (Preferred)
Location: Chennai, Tamil Nadu (Required)
Work Location: In person
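As a generic, hedged sketch of the pipeline reliability standards such a lead role defines (this is not ODI itself), the snippet below wraps a load step with validation and bounded retries; the function names, rules, and thresholds are illustrative only.

import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def extract():
    # Stand-in for a source query; returns hypothetical rows.
    return [{"id": 1, "amount": 10.0}, {"id": 2, "amount": -5.0}]

def validate(rows):
    # Reject obviously bad records before loading; the rule is illustrative.
    return [r for r in rows if r["amount"] >= 0]

def load(rows, attempts=3, delay_sec=2.0):
    # Retry a bounded number of times so transient failures do not
    # require manual intervention.
    for attempt in range(1, attempts + 1):
        try:
            log.info("loading %d rows (attempt %d)", len(rows), attempt)
            # A real implementation would write to the target system here.
            return len(rows)
        except Exception:
            log.exception("load failed on attempt %d", attempt)
            time.sleep(delay_sec)
    raise RuntimeError("load failed after all retries")

if __name__ == "__main__":
    load(validate(extract()))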

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
