3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
We are looking for a highly skilled and detail-oriented QA Engineer with expertise in Snowflake, API testing, and automation using Java-based frameworks. As a QA Engineer, you will play a vital role in ensuring the quality and reliability of our data-driven and cloud-based applications.

Your responsibilities will include designing, developing, and executing test strategies for Snowflake data pipelines and integrations, and performing end-to-end REST API testing using tools such as Postman, Rest Assured, or similar. You will also develop and maintain automation test suites using Java and popular testing frameworks such as TestNG, JUnit, Cucumber, and Selenium. Collaboration with developers, product managers, and data engineers to understand requirements and deliver high-quality releases will be essential. You will implement and maintain CI/CD pipelines for automated testing using Jenkins or Harness; monitor, analyze, and report test results; and work with the team to troubleshoot and resolve issues. Ensuring test coverage for functional, integration, regression, and performance testing will be a key focus area.

The ideal candidate has hands-on experience with Snowflake, including writing queries, validating data, and working with Snowflake integrations and pipelines. A strong background in API testing and automation is necessary, including an understanding of API architecture, authentication, and error handling. Proficiency in UI and API automation and experience with frameworks such as Selenium, Cucumber, TestNG, and JUnit are highly desirable. Experience with Azure cloud services, such as Azure data validations, storage accounts, or equivalents, is required, along with working knowledge of SQL and data validation in cloud data services and familiarity with version control systems like Git. Experience in Agile/Scrum environments and with JIRA for test case management, sprint planning, and defect tracking is crucial. Nice-to-have skills include exposure to performance testing tools like JMeter and knowledge of BDD frameworks.
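As an illustration of the two validation layers this role combines, here is a minimal sketch in Python (the posting's own automation stack is Java-based, but the pattern is the same): it exercises a REST endpoint, then confirms the same record landed in Snowflake. The endpoint, table, and credentials are hypothetical placeholders, not part of the posting.

```python
import requests
import snowflake.connector

API_BASE = "https://api.example.com"  # hypothetical endpoint under test
ORDER_ID = "ORD-1001"                 # hypothetical test record

# Step 1: exercise the REST API and assert on the response contract.
resp = requests.get(f"{API_BASE}/orders/{ORDER_ID}", timeout=10)
assert resp.status_code == 200, f"unexpected status: {resp.status_code}"
payload = resp.json()
assert payload["order_id"] == ORDER_ID

# Step 2: confirm the same record landed correctly in Snowflake.
conn = snowflake.connector.connect(
    user="qa_user", password="***", account="my_account",  # placeholders
    warehouse="QA_WH", database="SALES", schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT total_amount FROM orders WHERE order_id = %s", (ORDER_ID,))
    row = cur.fetchone()
    assert row is not None, "record missing from Snowflake"
    assert float(row[0]) == float(payload["total_amount"]), "API/warehouse mismatch"
finally:
    conn.close()
```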
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
kolkata, west bengal
On-site
You will play a critical role in designing, testing, and maintaining software programs for operating systems or applications deployed at the client end, ensuring they meet 100% of quality assurance parameters. Your responsibilities include understanding requirements and designing the software, developing solutions, investigating problem areas throughout the software development life cycle, and facilitating root cause analysis of system issues. You will also collaborate with functional teams, project managers, and systems analysts to ensure the software meets client requirements.

Coding is a crucial part of the role: you will determine operational feasibility, develop and automate processes for software validation, modify software to fix errors and improve performance, and prepare reports on programming project specifications. Error-free code, timely documentation, and coordination with the team are essential for successful project delivery. Your focus will also be on status reporting and customer feedback to ensure smooth, on-time delivery. Capturing client requirements, taking regular feedback, and providing timely responses to customer requests are key aspects of this role, and continuous education, training, and quality interactions with customers are vital for your success.

Key performance parameters include continuous integration, deployment, and monitoring of software; quality and CSAT; and MIS and reporting. Mandatory skills for this role include experience with Snowflake and a minimum of 5-8 years of relevant experience.

Join Wipro to be part of a modern digital transformation partner with bold ambitions. Embrace reinvention, continuous evolution, and empowerment to design your own career path. Your journey at Wipro will be powered by purpose, ambition, and inclusivity; applications from individuals with disabilities are welcome.
Posted 6 days ago
10.0 - 14.0 years
0 Lacs
delhi
On-site
We are seeking an experienced Cloud & DevOps Architect with over 10 years of experience architecting, designing, and implementing scalable solutions using AWS, Azure, and Snowflake. The ideal candidate has strong expertise in CI/CD pipeline setup, quality gate configuration, and DevOps automation, with a strategic focus on enabling reliable, efficient delivery pipelines and supporting ongoing operations of existing products.

As a Cloud & DevOps Architect, your main responsibilities will include architecting and designing scalable, secure, and high-performance cloud solutions across AWS, Azure, and Snowflake. You will lead the setup and optimization of CI/CD pipelines, including configuration of build and deployment processes and quality gates. Additionally, you will define and enforce DevOps best practices across engineering and delivery teams, collaborate with client service and delivery teams to ensure architecture aligns with business goals, and support and enhance the deployment lifecycle of existing products.

Key responsibilities also include providing architectural oversight for troubleshooting, root cause analysis, and resolution of complex deployment issues; ensuring governance, compliance, and security standards are maintained throughout the DevOps lifecycle; and leveraging tools like JIRA to manage architecture tasks, progress, and collaboration in Agile environments.

Joining Material, a renowned global digital engineering firm, means being part of an Awesome Tribe that solves complex technology problems using deep technology expertise and strategic partnerships with top-tier technology partners. In addition to fulfilling, high-impact work, Material offers a professional development and mentorship program, a hybrid, remote-friendly work mode, health and family insurance, 40 leaves per year along with maternity and paternity leave, and wellness, meditation, and counseling sessions.

If you are passionate about architecting scalable cloud solutions, optimizing CI/CD pipelines, and enforcing DevOps best practices in a dynamic and innovative environment, this opportunity at Material may be the perfect fit for you. Apply now and be part of a team that turns customer challenges into growth opportunities!
Posted 6 days ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
Do you want to work on complex and pressing challenges, the kind that bring together curious, ambitious, and determined leaders who strive to become better every day? If this sounds like you, you've come to the right place.

You will be a core member of Periscope's technology team, with responsibilities that range from developing and implementing our core enterprise products to ensuring that McKinsey's craft stays on the leading edge of technology. In this role, you will lead software development projects in a hands-on manner, spending about 70% of your time writing and reviewing code and creating software designs. Over time, your expertise will expand into database design, core middle-tier modules, performance tuning, cloud technologies, DevOps, and continuous delivery. You will be an active learner, tinkering with new open-source libraries, using unfamiliar technologies without supervision, and picking up new frameworks and approaches. You will have a strong understanding of key agile engineering practices, enabling you to guide teams on improvement opportunities, and you will provide ongoing coaching and mentoring to developers to improve our organizational capability.

You will be based in our Bengaluru or Gurugram office as part of our Growth, Marketing & Sales team, aligned primarily with Periscope's technology team. Periscope by McKinsey enables better commercial decisions by uncovering actionable insights. The Periscope platform combines world-leading intellectual property, prescriptive analytics, and cloud-based tools to provide more than 25 solutions focused on insights and marketing, with expert support and training; it is a unique combination that drives revenue growth both now and in the future. Customer experience, performance, pricing, category, and sales optimization are all powered by the Periscope platform. Periscope has a presence in 26 locations across 16 countries, with a team of 1,000+ business and IT professionals and a network of 300+ experts.

Driving lasting impact and building long-term capabilities with our clients is not easy work. You are the kind of person who thrives in a high-performance, high-reward culture: doing hard things, picking yourself up when you stumble, and having the resilience to try another way forward. In return for your drive, determination, and curiosity, we'll provide the resources, mentorship, and opportunities you need to become a stronger leader faster than you ever thought possible. Your colleagues at all levels will invest deeply in your development, just as much as they invest in delivering exceptional results for clients. Every day, you'll receive apprenticeship, coaching, and exposure that will accelerate your growth in ways you won't find anywhere else. When you join us, you will have continuous learning opportunities, a voice that matters, membership in a global community, and world-class benefits.

Your qualifications and skills should include a degree in computer science or a related field; 6+ years' experience in software development; proficiency in Scala, React.js, relational and NoSQL databases, cloud infrastructure, container technologies, modern engineering practices, Agile methodology, and performance optimization tools; excellent analytical and problem-solving skills; a customer service focus; and the ability to work effectively under pressure and in diverse team settings. Prior experience leading a small team is advantageous.
Posted 6 days ago
2.0 - 6.0 years
0 Lacs
telangana
On-site
You will provide analytics support to Novartis internal customers (CPOs and regional marketing and sales teams) on various low-to-medium complexity analytical reports, supporting and facilitating data-enabled decision-making by providing and communicating qualitative and quantitative analytics. Additionally, you will support the GBS-GCO business in building the practice through initiatives such as knowledge sharing, onboarding and training support, supporting the team lead in business-related tasks and activities, and building process documentation and knowledge repositories. You will also be an integral part of a comprehensive design team responsible for designing promotional marketing materials.

As an Analyst at Novartis, your key responsibilities will include creating and delivering field excellence insights per agreed SLAs; designing, developing, and/or maintaining ETL-based solutions that optimize field excellence activities; delivering services through an Agile project management approach; maintaining standard operating procedures (SOPs) and quality checklists; and developing and maintaining knowledge repositories that collect qualitative and quantitative data on field excellence trends across Novartis operating markets.

Essential requirements for this role include 2 years of experience in SQL and Excel, learning agility, the ability to manage multiple stakeholders, experience with pharma datasets, and experience in Python or another scripting language. Desirable requirements include a university or advanced degree, ideally a Master's degree or equivalent experience in fields such as business administration, finance, computer science, or a technical field. At least 3 years of experience using ETL tools (Alteryx, DataIKU, Matillion, etc.) and hands-on experience with cloud-based platforms like Snowflake are mandatory.

Novartis's purpose is to reimagine medicine to improve and extend people's lives, with a vision to become the most valued and trusted medicines company in the world. By joining Novartis, you will be part of a mission-driven organization where associates drive the company to reach its ambitions. If you are passionate about making a difference in patients' lives and want to be part of a community of smart and dedicated individuals, consider joining Novartis. For more information about benefits and rewards at Novartis, refer to the Novartis Life Handbook at https://www.novartis.com/careers/benefits-rewards. To stay connected with Novartis and learn about future career opportunities, join the Novartis Network here: https://talentnetwork.novartis.com/network.
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
ahmedabad, gujarat
On-site
We are seeking a Data Modeler with expertise in mortgage banking data to support a large-scale data modernization program. As a Data Modeler, your primary responsibilities will include designing and developing enterprise-grade data models (3NF, dimensional, and semantic) to serve both analytics and operational use cases. You will collaborate closely with business and engineering teams to define data products aligned with specific business domains.

Your role will involve translating complex mortgage banking concepts into scalable, extensible models that meet the organization's requirements. It is crucial that the data models align with modern data architecture principles and are compatible with cloud platforms such as Snowflake and dbt. Additionally, you will contribute to the creation of canonical models and reusable patterns for enterprise-wide use.

To be successful in this role, you should possess the following qualifications:
- A minimum of 5 years of experience in data modeling, with a strong emphasis on mortgage or financial services.
- Hands-on experience developing 3NF, dimensional, and semantic models.
- A strong understanding of data-as-a-product and domain-driven design principles.
- Familiarity with modern data ecosystems and tools such as Snowflake, dbt, and BI tools is advantageous.
- Excellent communication skills to collaborate effectively with both business and technical teams.

This position requires the candidate to work onsite in either Hyderabad or Ahmedabad.
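As an illustration of the dimensional modeling this role calls for, here is a minimal sketch of a mortgage-flavored star schema deployed to Snowflake from Python; every table and column name is a hypothetical example, not a prescribed design.

```python
import snowflake.connector

# Hypothetical mortgage-servicing star schema: one fact table keyed to
# conformed dimensions, the shape a semantic/BI layer sits on top of.
DDL = [
    """CREATE TABLE IF NOT EXISTS dim_loan (
        loan_key         INTEGER IDENTITY PRIMARY KEY,
        loan_number      VARCHAR(32),
        product_type     VARCHAR(64),
        origination_date DATE
    )""",
    """CREATE TABLE IF NOT EXISTS dim_date (
        date_key       INTEGER PRIMARY KEY,  -- e.g. 20240131
        full_date      DATE,
        fiscal_quarter VARCHAR(8)
    )""",
    """CREATE TABLE IF NOT EXISTS fact_loan_payment (
        loan_key         INTEGER REFERENCES dim_loan (loan_key),
        date_key         INTEGER REFERENCES dim_date (date_key),
        principal_amount NUMBER(18, 2),
        interest_amount  NUMBER(18, 2)
    )""",
]

conn = snowflake.connector.connect(
    user="modeler", password="***", account="my_account",  # placeholders
    warehouse="DEV_WH", database="MORTGAGE", schema="ANALYTICS",
)
try:
    for stmt in DDL:
        conn.cursor().execute(stmt)  # note: Snowflake does not enforce FK constraints
finally:
    conn.close()
```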
Posted 6 days ago
8.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
As an experienced Data & Analytics Project Manager, you will play a crucial role in leading the end-to-end execution of data and analytics projects. Your expertise in data integration, analytics, and cloud platforms such as AWS and Azure will be essential to ensuring seamless delivery. Collaborating with cross-functional teams, driving innovation, and optimizing data-driven decision-making are key responsibilities of this role.

Our projects use a variety of technologies, including internal custom-built solutions, packaged software, ERP solutions, data warehouses, software-as-a-service and cloud-based solutions, and BI tools. You will lead project teams from initiation to close, delivering effective solutions that meet approved customer and business needs, and you will be accountable for determining and delivering solutions within budget and schedule commitments while maintaining required quality and compliance standards.

Your main focus will be leading end-to-end project management for data engineering and analytics initiatives. This will involve understanding and managing data pipeline development, DWH design, and BI reporting needs at a high level; collaborating with technical teams on Snowflake-based solutions, ETL pipelines, and data modeling concepts; and overseeing project timelines, risks, and dependencies using Agile/Scrum methodologies. Facilitating communication between stakeholders to ensure alignment on data engineering, data analytics, and Power BI initiatives will be a key aspect of your role. You will also work with DevOps and engineering teams to streamline CI/CD pipelines and deployment processes; support metadata management and data mesh concepts to ensure an efficient data ecosystem; work closely with data engineers, BI analysts, and business teams to define project scope, objectives, and success criteria; and ensure data governance, security, and compliance best practices are followed.

Key responsibilities include overseeing the full lifecycle of data and analytics projects to ensure scope, quality, and timelines are met; acting as the primary liaison with customers, architects, and internal teams to align on execution strategies; managing ETL pipelines, data warehousing, visualization tools (Tableau, Power BI), and cloud-based big data solutions; identifying potential risks, scope changes, and mitigation strategies to keep execution smooth; guiding workstream leads, supporting PMO updates, and maintaining transparent communication with all stakeholders; and driving innovation and process enhancements in data engineering, BI, and analytics workflows.

To excel in this role, you should have at least 8 years of experience leading data and analytics projects; strong expertise in data integration tools, ETL processes, and big data technologies; and hands-on experience with cloud platforms and visualization tools. A proven ability to mentor teams, manage stakeholders, and drive project success is crucial, as are excellent communication skills with the ability to engage both business and IT executives. Certifications such as PMP, Agile, or Data Analytics are advantageous.
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
The ideal candidate for this Big Data Engineer role has 3-6 years of experience, is located in Hyderabad, and has strong skills in Spark, Python/Scala, AWS/Azure, Snowflake, Databricks, and SQL Server/NoSQL.

As a Big Data Engineer, your main responsibilities will include designing and implementing data pipelines for both batch and real-time processing, optimizing data storage solutions for efficiency and scalability, collaborating with analysts and business teams to meet data requirements, and monitoring data pipeline performance and troubleshooting any issues that arise. Ensuring compliance with data security and privacy policies is crucial.

The required skills for this role include proficiency in Python, SQL, and ETL frameworks; experience with big data tools such as Spark and Hadoop; strong knowledge of cloud services and databases; and familiarity with data modeling and warehousing concepts.
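As an illustration of the batch side of such a pipeline, here is a minimal PySpark sketch that reads raw events, aggregates them, and writes a partitioned output; paths, column names, and the job name are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal batch pipeline: read raw events from cloud storage, clean and
# aggregate them, and write a partitioned, query-friendly output.
spark = SparkSession.builder.appName("daily-order-batch").getOrCreate()

raw = spark.read.json("s3a://raw-bucket/orders/2024-01-31/")  # hypothetical path

daily = (
    raw.filter(F.col("status") == "COMPLETED")
       .withColumn("order_date", F.to_date("created_at"))
       .groupBy("order_date", "region")
       .agg(
           F.count("*").alias("order_count"),
           F.sum("total_amount").alias("revenue"),
       )
)

# Partitioned Parquet keeps downstream scans cheap as volumes grow.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://curated-bucket/daily_orders/"
)
```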
Posted 6 days ago
5.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
The Manager/Sr. Manager, Business Analysis & Delivery is responsible for overseeing the structure, execution, and success of the delivery organization encompassing Business Systems Analysts (BSAs), Project Analysts, and Delivery Managers. This role ensures seamless client-facing solution delivery by optimizing project execution, enhancing task clarity, upholding documentation standards, and aligning team efforts with business outcomes. The position partners closely with cross-functional leaders across technical services and client success to uphold delivery excellence, support workforce planning, and maintain operational consistency. The emphasis is on tactical leadership in project coordination, client engagement, and execution support, without direct oversight of development or QA functions.

Team Leadership and Development:
- Lead and mentor a high-performing team of BSAs, Project Analysts, and Delivery Managers, fostering a culture of strong execution and career growth.
- Define clear team roles, responsibilities, and performance expectations to ensure consistent success across delivery pods.
- Support hiring efforts, conduct interviews, and contribute to onboarding processes for new team members.
- Promote a collaborative and knowledge-sharing environment across global teams.

Project & Delivery Oversight:
- Monitor delivery execution across pods/accounts, ensuring visibility and coordination.
- Guide BSA and Analyst task execution to ensure clarity, timeliness, and quality of deliverables.
- Align Delivery Managers on sprint planning, issue management, and stakeholder communication.
- Track timelines, manage resource allocation, and report on delivery health metrics.

BSA and Delivery Operations Management:
- Implement and maintain frameworks for BSA deliverables, documentation consistency, and requirement traceability.
- Support Project Analysts in handling documentation, scope tracking, and handoff coordination.
- Ensure Delivery Managers maintain proper reporting structures, accountability models, and KPIs.
- Drive consistency through standard operating procedures and execution best practices.

Process and Performance Optimization:
- Monitor throughput, documentation standards, and cadence to identify trends and performance gaps.
- Standardize scalable processes and tools to increase delivery effectiveness across verticals.
- Address inefficiencies and resolve team collaboration gaps proactively.
- Partner with Client Services, Product, and Technical leadership to align capabilities and staffing with evolving demand.

Strategic Execution and Leadership:
- Collaborate with leadership to define strategic priorities, staffing plans, and performance goals.
- Provide accurate team availability and delivery capacity projections.
- Deliver presentations and reports showcasing team impact, delivery health, and process improvements.
- Champion delivery best practices to maintain trust and alignment with stakeholders.

Qualifications

Required:
- 10+ years of experience in technical project or solution delivery, including leadership in client services or consulting.
- 5+ years managing cross-functional roles such as BSAs, Delivery Managers, or Project Analysts.
- Solid understanding of delivery governance, client collaboration, sprint methodologies, and team operations.
- Strong communication skills with a track record of driving operational outcomes in matrixed environments.
- Experience in martech, data integration, or platform-based delivery environments.

Preferred:
- Familiarity with tools and platforms such as Snowflake, SQL, Jira, and Confluence.
- Exposure to cloud platforms (AWS, Azure) in client delivery use cases.
- Experience with delivery analytics, reporting, and scalable systems.
- Proven success in fostering cross-functional collaboration and delivery system evolution.
Posted 6 days ago
2.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer at our company, you will play a crucial role in designing, developing, and optimizing data pipelines and workflows in a cloud-based environment, leveraging your expertise in PySpark, Snowflake, and AWS for data processing and analytics.

Your responsibilities will include designing and implementing scalable ETL pipelines using PySpark on AWS, developing and optimizing data workflows for Snowflake integration, and managing and configuring AWS services such as S3, Lambda, Glue, EMR, and Redshift. Collaboration with data analysts and business teams to understand requirements and deliver solutions will be essential, as will ensuring data security and compliance with best practices in AWS and Snowflake environments. You will also monitor and troubleshoot data pipelines and workflows for performance and reliability, and write efficient, reusable, and maintainable code for data processing and transformation.

To excel in this position, you should have strong experience with AWS services such as S3, Lambda, Glue, and MSK; proficiency in PySpark for large-scale data processing; hands-on experience with Snowflake for data warehousing and analytics; and a solid understanding of SQL and database optimization techniques. Knowledge of data lake and data warehouse architectures, familiarity with CI/CD pipelines and version control systems like Git, and strong problem-solving and debugging skills are also required. Experience with Terraform or CloudFormation for infrastructure as code, knowledge of Python for scripting and automation, familiarity with Apache Airflow for workflow orchestration, and an understanding of data governance and security best practices are beneficial. Certification in AWS or Snowflake is a plus.

You should hold a Bachelor's degree in Computer Science, Engineering, or a related field, with 6 to 10 years of experience, including 5+ years in AWS cloud engineering and 2+ years with PySpark and Snowflake. Join our Technology team as a valued member of the Digital Software Engineering job family, working full time and contributing your most relevant skills while continuously growing your expertise.
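To ground the PySpark-to-Snowflake workflow this role centers on, here is a minimal sketch, assuming the Snowflake Spark connector is available on the cluster; the bucket path, table name, and connection values are placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("s3-to-snowflake").getOrCreate()

# Hypothetical curated dataset sitting in S3.
events = spark.read.parquet("s3a://curated-bucket/events/")

# Connection options for the Snowflake Spark connector (all placeholders).
sf_options = {
    "sfURL": "my_account.snowflakecomputing.com",
    "sfUser": "etl_user",
    "sfPassword": "***",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "LOAD_WH",
}

(
    events.write.format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "EVENTS")
    .mode("append")
    .save()
)
```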
Posted 6 days ago
5.0 - 10.0 years
15 - 20 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Primary skills: Snowflake. Notice period (NP): immediate to 60 days.
Posted 6 days ago
5.0 - 10.0 years
20 - 30 Lacs
Bengaluru
Work from Office
Snowflake, Serverless, API, end-to-end data engineering, Azure, Lambda, Python
Posted 6 days ago
5.0 - 9.0 years
5 - 12 Lacs
Chennai
Work from Office
We are seeking a skilled and motivated Data Engineer with proven experience in Data Vault 2.0, dbt (data build tool), and Snowflake to design, build, and optimize our enterprise data pipelines and data architecture. This role focuses on implementing scalable, auditable, and modular data models that support business intelligence, advanced analytics, and data governance initiatives. The ideal candidate is passionate about modern data engineering practices and about enabling trusted data products in a cloud-native environment.

Key Responsibilities:
- Design and implement data pipelines and data models using Data Vault 2.0 methodology in Snowflake.
- Build and maintain modular dbt models to transform raw data into curated layers (Raw Vault, Business Vault, Information Marts).
- Develop scalable, secure, and performant ELT/ETL solutions using tools like dbt, Fivetran, and/or custom Python code.
- Automate data integration workflows and implement CI/CD practices for data pipelines using dbt Cloud or similar tools.
- Collaborate with data architects, analysts, and domain experts to translate business needs into effective data models and pipelines.
- Ensure high data quality, lineage, and traceability using metadata and documentation best practices.
- Participate in data platform modernization efforts, including migration and optimization within Snowflake.
- Monitor and optimize the performance of data transformations and Snowflake compute/storage resources.
- Adhere to data governance standards, including security, access controls, and regulatory compliance.
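To make the Data Vault 2.0 responsibility concrete, here is a minimal hub-load sketch run against Snowflake from Python. In a dbt project this SELECT would live in a model file; the table and column names are hypothetical.

```python
import snowflake.connector

# Data Vault 2.0 hub load: a hub holds only the business key, its hash key,
# a load timestamp, and the record source.
HUB_CUSTOMER_LOAD = """
INSERT INTO raw_vault.hub_customer (hub_customer_hk, customer_bk, load_dts, record_source)
SELECT
    MD5(UPPER(TRIM(src.customer_id))) AS hub_customer_hk,
    src.customer_id                   AS customer_bk,
    CURRENT_TIMESTAMP()               AS load_dts,
    'CRM_EXPORT'                      AS record_source
FROM staging.customers src
WHERE NOT EXISTS (
    SELECT 1 FROM raw_vault.hub_customer h
    WHERE h.hub_customer_hk = MD5(UPPER(TRIM(src.customer_id)))
)
"""

conn = snowflake.connector.connect(
    user="etl_user", password="***", account="my_account",  # placeholders
    warehouse="LOAD_WH", database="DW",
)
try:
    conn.cursor().execute(HUB_CUSTOMER_LOAD)
finally:
    conn.close()
```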
Posted 6 days ago
6.0 - 11.0 years
15 - 22 Lacs
Bengaluru
Work from Office
Dear Candidate,

Hope you are doing well. Greetings from NAM Info Inc.

NAM Info Inc. is a technology-forward talent management organization dedicated to bridging the gap between industry leaders and exceptional human resources. They pride themselves on delivering quality candidates, deep industry coverage, and knowledge-based training for consultants. Their commitment to long-term partnerships, rooted in ethical practices and trust, positions them as a preferred partner for many industries. Learn more about their vision, achievements, and services at www.nam-it.com.

We have an open Data Engineer position for our Bangalore, Pune, and Mumbai locations.

Job Description
Position: Sr/Lead Data Engineer
Location: Bangalore, Pune, and Mumbai
Experience: 5+ years
Required skills: Azure, data warehousing, Python, Spark, PySpark, Snowflake/Databricks, any RDBMS, any ETL tool, SQL, Unix scripting, GitHub; strong experience in Azure/AWS/GCP
Employment: permanent with NAM Info Pvt Ltd
Working time: 12 PM to 9 PM or 2 PM to 11 PM
Work mode: 5 days work from office, Monday to Friday
Interviews: L1 virtual, L2 face-to-face at the Banashankari office (for Bangalore candidates)

If you are fine with the above job details, please share your resume to ananya.das@nam-it.com.

Regards,
Recruitment Team
NAM Info Inc.
Posted 6 days ago
6.0 - 11.0 years
22 - 27 Lacs
Pune, Bengaluru
Work from Office
Build ETL jobs using Fivetran and dbt for our internal projects and for customers on platforms such as Azure, Salesforce, and AWS. Build out data lineage artifacts to ensure all current and future systems are properly documented.

Required candidate profile: strong proficiency in SQL query writing and development; experience developing ETL routines that manipulate and transfer large volumes of data and perform quality checks; experience in the healthcare industry with PHI/PII.
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
In the modern banking age, financial institutions need to bring classical data drivers and evolving business drivers together on a single platform. Traditional data platforms, however, are limited in how they can serve evolving business drivers due to technological constraints. A Modern Data Platform is essential to bridge this gap and elevate businesses to the next level through data-driven approaches enabled by recent technology transformations.

As a technology leader with an academic background in Computer Science / Information Technology / Data Technologies [BE/BTech/MCA], you will have the opportunity to lead the Modern Data Platform Practice. The role involves providing solutions to customers on traditional data warehouses across on-prem and cloud platforms. You will be responsible for architecting data platforms, defining data engineering designs, selecting appropriate technologies and tools, and enhancing the organization's Modern Data Platform capabilities. Additionally, you will lead pre-sales discussions, provide technology architecture in RFP responses, and spearhead technology POC/MVP initiatives.

To excel in this role, you are expected to possess the following qualifications and experience:
- 12-16 years of data engineering and analytics experience, including hands-on experience with big data systems across on-prem and cloud environments
- Leadership of data platform architecture and design projects for mid-to-large-size firms
- Implementation experience with batch data and streaming/online data integrations using third-party tools and custom programs
- Proficiency in SQL and one of the following programming languages: Core Java, Scala, or Python
- Hands-on experience with Kafka for enabling event-driven data pipes and processing
- Knowledge of leading data services offered by AWS, Azure, Snowflake, and Confluent
- Strong understanding of distributed computing and related data structures
- Implementation of data governance and quality capabilities for data platforms
- Analytical and presentation skills, along with the ability to build and lead teams
- Exposure to leading RDBMS technologies and data visualization platforms
- Demonstrated AI/ML models for data processing and generating insights
- A team player able to work independently with minimal direction

This role is at Oracle Career Level IC4. Oracle values diversity and inclusion to foster innovation and excellence, and offers a competitive suite of employee benefits emphasizing parity, consistency, and affordability, including medical, life insurance, and retirement planning. The company encourages employees to contribute to the communities where they live and work.

Oracle believes that innovation stems from diversity and inclusion and is committed to creating a workforce where all individuals can thrive and contribute their best work. The company supports individuals with disabilities by providing reasonable accommodations throughout the job application and interview process, and in potential roles, to ensure successful participation in crucial job functions. As a global leader in cloud solutions, Oracle is dedicated to leveraging tomorrow's technology to address today's challenges, and Oracle careers offer opportunities for global engagement, work-life balance, and competitive benefits.

If you require accessibility assistance or accommodation for a disability at any point during the employment process at Oracle, reach out by emailing accommodation-request_mb@oracle.com or calling +1 888 404 2494 in the United States.
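The posting above singles out Kafka for event-driven data pipes. Here is a minimal consumer sketch in Python using the kafka-python client; the broker address, topic, and field names are hypothetical placeholders, not part of the posting.

```python
import json
from kafka import KafkaConsumer  # kafka-python client

# Broker, topic, group id, and event fields are hypothetical placeholders.
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers=["broker1:9092"],
    group_id="dw-loader",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # A real pipe would validate each event and micro-batch it into the
    # warehouse; this loop only shows the consumption pattern.
    print(event["order_id"], event["status"])
```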
Posted 1 week ago
10.0 - 15.0 years
0 Lacs
pune, maharashtra
On-site
PharmaACE is a growing global healthcare consulting firm headquartered in Princeton, New Jersey. Our expert teams of business analysts, based across the US, Canada, Europe, and India, provide analytics and business solutions using our worldwide delivery models for a wide range of clients. Our clients include established multinational BioPharma leaders and innovators as well as entrepreneurial firms on the cutting edge of science. We have deep expertise in forecasting, business analytics, competitive intelligence, sales analytics, and the Analytics Centre of Excellence model. Our wealth of therapeutic-area experience cuts across Oncology, Immuno-science, CNS, CV-Met, and Rare Diseases. We support our clients' needs in primary care, specialty care, and hospital business units, and we have managed portfolios in the biologics space, branded pharmaceuticals, generics, APIs, diagnostics, and packaging and delivery systems.

Role Summary: This is a client-facing leadership role responsible for managing strategic relationships, designing and delivering advanced analytics and AI/GenAI solutions, and driving measurable impact across global life sciences organizations. The ideal candidate blends deep pharma domain knowledge, modern AI/ML expertise, and strong consulting acumen to shape and deliver differentiated solutions to commercial and brand teams at top pharmaceutical and biotech companies.

Client Engagement & Advisory (40%):
- Act as a strategic advisor to brand, commercial, and analytics leaders in global pharma clients.
- Lead consultative discussions on AI/analytics roadmaps, omnichannel strategy, forecasting, and performance management.
- Present insights, influence stakeholders, and translate analytics into compelling narratives that drive decisions.

AI-Enhanced Analytics Delivery (30%):
- Oversee delivery of AI-enabled solutions across domains such as dynamic segmentation and targeting, next-best-action engines, predictive HCP behavior models, LLM-based insight generation (GenAI), and brand performance diagnostics.
- Ensure robust quality, timeliness, and strategic relevance of deliverables.

Team & Capability Development (20%):
- Lead and grow a high-performing team of consultants, data scientists, and domain experts.
- Define and scale reusable AI/GenAI assets, frameworks, and accelerators.
- Partner with internal technology, data, and innovation teams to align on platform evolution.

Practice Growth & Innovation (10%):
- Contribute to RFPs, proposals, and solutioning for new business opportunities.
- Develop PoVs, whitepapers, and use cases to position the firm as a leader in commercial AI/GenAI.
- Continuously scan the market for new pharma AI applications, platforms, and data partnerships.

Required Qualifications:
- Master's degree or higher in Data Science, Life Sciences, Business Analytics, or a related field.
- 10-15 years of experience in commercial pharma analytics, including at least 3-5 years in a client-facing vendor/consulting role.
- Strong understanding of commercial pharma data: IQVIA (NRx/TRx), patient claims, CRM, HCP engagement, and formulary/market access data.
- Experience delivering AI/ML/GenAI-driven solutions in real-world commercial use cases.
- Proven ability to engage with senior pharma stakeholders (Directors, VPs, GMs) and lead large programs.

Preferred Skills:
- Familiarity with GenAI tools: OpenAI APIs, Claude, LLM summarization tools, insight copilots.
- Experience with tools like Python/R, SQL, Tableau/Power BI, and Databricks/Snowflake.
- Hands-on understanding of pharma GTM processes (launch excellence, sales ops, brand planning).

Core Competencies:
- Strategic thinking and business storytelling
- Strong client presence and relationship building
- AI fluency: the ability to evangelize and translate technology into value
- Project management and operational excellence
- Thought leadership and an innovation mindset
Posted 1 week ago
3.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
You should have 8+ years of development/design experience overall, with at least 3 years' experience in big data technologies on-prem and in the cloud. Proficiency in Snowflake and strong SQL programming skills are required, along with strong experience in data modeling and schema design and extensive experience with data warehousing tools such as Snowflake, BigQuery, or Redshift. Experience with at least one BI tool such as Tableau, QuickSight, or Power BI is a must. You should also have strong experience implementing ETL/ELT processes and building data pipelines, including workflow management, job scheduling, and monitoring, as well as a good understanding of data governance, security and compliance, data quality, metadata management, master data management, and data catalogs.

Moreover, you should have a strong understanding of cloud services (AWS or Azure), including IAM and log analytics. Excellent interpersonal and teamwork skills are necessary, as is experience leading and mentoring other team members. Good knowledge of Agile Scrum and strong communication skills are also required. Your day-to-day responsibilities will align with the skills and experience described above.

At GlobalLogic, we prioritize a culture of caring. From day one, you will experience an inclusive culture of acceptance and belonging, where you will have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. We are committed to your continuous learning and development: you will learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career.

GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you will work on projects that matter and engage your curiosity and creative problem-solving skills. We believe in balance and flexibility: with many functional career areas, roles, and work arrangements, you can explore ways of achieving the right balance between work and life. GlobalLogic is a high-trust organization where integrity is key; by joining us, you are placing your trust in a safe, reliable, and ethical global company.

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we have been at the forefront of the digital revolution, helping create innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
You are invited to join our team as a mid-level Data Engineer Technical Consultant with 4+ years of experience. As part of our diverse and inclusive organization, you will be based in Bangalore, KA, working full time in a permanent position during the general shift, Monday to Friday.

In this role, you are expected to have strong written and oral communication skills, particularly in email correspondence. Your experience working with application development teams will be invaluable, along with your ability to analyze and solve problems effectively. Proficiency in Microsoft tools such as Outlook, Excel, and Word is essential.

As a Data Engineer Technical Consultant, you must have at least 4 years of hands-on development experience. Your expertise should include working with Snowflake and PySpark, writing SQL queries, using Airflow, and developing in Python. Experience with dbt and integration programs is advantageous, as is familiarity with Excel for data analysis and Unix scripting.

Your responsibilities will require a good understanding of data warehousing and practical work experience in the field, spanning requirements analysis, coding, unit testing, integration testing, performance testing, UAT, and Hypercare support. Collaboration with cross-functional teams across different geographies will be a key aspect of this role.

If you are action-oriented, independent, and possess the required technical skills, submit your resume to pallavi@she-jobs.com to explore this exciting opportunity further.
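Since the role pairs Airflow with Snowflake and Python, here is a minimal illustrative DAG sketch; the DAG id, schedule, and task bodies are hypothetical stand-ins for the real pipeline steps.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Task bodies are hypothetical placeholders for real pipeline steps.
def extract():
    print("pull raw files from the source system")

def transform():
    print("run PySpark/dbt transformations")

def load():
    print("load curated tables into Snowflake")

with DAG(
    dag_id="daily_warehouse_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # 'schedule_interval' on Airflow versions before 2.4
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_transform >> t_load
```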
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
kolkata, west bengal
On-site
You are a Data Engineer with 3+ years of experience, proficient in SQL and Python development. You will be responsible for designing, developing, and maintaining scalable data pipelines to support ETL processes, using tools like Apache Airflow, AWS Glue, or similar. Your role involves optimizing and managing relational and NoSQL databases such as MySQL, PostgreSQL, MongoDB, or Cassandra for high performance and scalability. You will write advanced SQL queries, stored procedures, and functions to efficiently extract, transform, and analyze large datasets. Additionally, you will implement and manage data solutions on cloud platforms like AWS, Azure, or Google Cloud, utilizing services such as Redshift, BigQuery, or Snowflake, and contribute to designing and maintaining data warehouses and data lakes that support analytics and BI requirements. Automating data processing tasks through scripts and applications in Python or other programming languages is also part of your responsibilities.

As a Data Engineer, you will implement data quality checks, monitoring, and governance policies to ensure data accuracy, consistency, and security (see the sketch after this posting). Collaboration with data scientists, analysts, and business stakeholders to understand data needs and translate them into technical solutions is essential, as is identifying and resolving performance bottlenecks in data systems and optimizing data storage and retrieval. Maintaining comprehensive documentation for data processes, pipelines, and infrastructure is crucial, and you are expected to stay up to date with the latest trends in data engineering, big data technologies, and cloud services.

You should hold a Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field. Proficiency in SQL, relational and NoSQL databases, and Python programming, plus experience with data pipeline tools and cloud platforms, is required; knowledge of big data tools like Apache Spark, Hadoop, or Kafka is a plus. Strong analytical and problem-solving skills with a focus on performance optimization and scalability are essential, as are excellent verbal and written communication skills for conveying technical concepts to non-technical stakeholders and the ability to work collaboratively in cross-functional teams. Preferred certifications include AWS Certified Data Analytics, Google Professional Data Engineer, or similar. An eagerness to learn new technologies and adapt quickly in a fast-paced environment will be valuable in this role.
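As a concrete illustration of the data quality checks mentioned above, here is a minimal sketch, assuming a PostgreSQL source (one of the databases the posting lists); the table, columns, and DSN are hypothetical.

```python
import psycopg2  # assuming a PostgreSQL source, one of the databases listed above

# Lightweight post-load gate: each query returns a count of offending rows,
# and any non-zero result fails the pipeline.
CHECKS = {
    "no_null_ids": "SELECT COUNT(*) FROM orders WHERE order_id IS NULL",
    "no_duplicates": """
        SELECT COUNT(*) FROM (
            SELECT order_id FROM orders GROUP BY order_id HAVING COUNT(*) > 1
        ) d
    """,
    "fresh_load": """
        SELECT CASE WHEN MAX(loaded_at) < NOW() - INTERVAL '1 day'
                    THEN 1 ELSE 0 END
        FROM orders
    """,
}

conn = psycopg2.connect("dbname=analytics user=etl")  # placeholder DSN
failures = []
with conn, conn.cursor() as cur:
    for name, sql in CHECKS.items():
        cur.execute(sql)
        offending = cur.fetchone()[0]
        if offending:
            failures.append(f"{name}: {offending} offending rows")

if failures:
    raise RuntimeError("data quality checks failed: " + "; ".join(failures))
```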
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
kolkata, west bengal
On-site
We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic data team. The ideal candidate has deep expertise in Snowflake, dbt (data build tool), and Python, with a strong understanding of data architecture, transformation pipelines, and data quality principles. You will play a crucial role in building and maintaining scalable data pipelines and facilitating data-driven decision-making across the organization.

Your responsibilities will include designing, developing, and maintaining scalable and efficient ETL/ELT pipelines using dbt, Snowflake, and Python, and optimizing data models and warehouse performance in Snowflake. You will collaborate with data analysts, scientists, and business teams to understand data requirements and deliver high-quality datasets. Ensuring data quality, governance, and compliance across pipelines, automating data workflows, and monitoring production jobs for accuracy and reliability will be key aspects of your role. Additionally, you will participate in architectural decisions, promote best practices in data engineering, maintain documentation of data pipelines, transformations, and data models, and mentor junior engineers while contributing to team knowledge sharing.

The ideal candidate has at least 5 years of professional experience in data engineering; strong hands-on experience with Snowflake (data modeling, performance tuning, security features); proven experience using dbt for data transformation and modeling; proficiency in Python for data engineering tasks and scripting; and a solid understanding of SQL, including building and maintaining complex queries. Experience with orchestration tools like Airflow or Prefect, familiarity with version control systems like Git, strong problem-solving skills, attention to detail, and excellent communication and teamwork abilities are required.

Preferred qualifications include experience with cloud platforms such as AWS, Azure, or GCP; knowledge of data lake architecture and real-time streaming technologies; exposure to CI/CD pipelines for data deployment; and experience with agile development methodologies. Join us and be part of a team that values expertise, innovation, and collaboration in driving impactful data solutions across the organization.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
punjab
On-site
You are an experienced and results-driven ETL & DWH Engineer / Data Analyst with over 8 years of expertise in data integration, warehousing, and analytics. The role requires deep technical knowledge of ETL tools, strong data modeling skills, and the capability to lead intricate data engineering projects from inception to implementation.

Your key skills include:
- More than 4 years using ETL tools such as SSIS, Informatica, DataStage, or Talend.
- Proficiency in relational databases like SQL Server and MySQL.
- Comprehensive understanding of Data Mart/EDW methodologies.
- Designing star schemas, snowflake schemas, and fact and dimension tables.
- Experience with Snowflake or BigQuery.
- Familiarity with reporting and analytics tools like Tableau and Power BI.
- Proficiency in scripting and programming with Python.
- Knowledge of cloud platforms like AWS or Azure.
- Leading recruitment, estimation, and project execution.
- Exposure to Sales and Marketing data domains.
- Working with cross-functional and geographically distributed teams.
- Translating complex data issues into actionable insights.
- Strong communication and client management abilities.
- An initiative-driven, collaborative approach and a problem-solving mindset.

Your roles and responsibilities will include:
- Creating high-level and low-level design documents for middleware and ETL architecture.
- Designing and reviewing data integration components while ensuring compliance with standards and best practices.
- Ensuring delivery quality and timeliness for one or more complex projects.
- Providing functional and non-functional assessments for global data implementations.
- Offering technical guidance and support to junior team members for problem-solving.
- Leading QA processes for deliverables and validating progress against project timelines.
- Managing issue escalation, status tracking, and continuous improvement initiatives.
- Supporting planning, estimation, and resourcing for data engineering efforts.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
gandhinagar, gujarat
On-site
As a Data Analyst specializing in Tableau and Snowflake, you will be responsible for creating and maintaining interactive Tableau dashboards and reports for key business stakeholders, and for developing, optimizing, and managing Snowflake data warehouse solutions to support analytics and reporting needs. Collaboration with data analysts, business users, and development teams to gather requirements and deliver effective data solutions will be essential.

Your expertise in writing and maintaining complex SQL queries for data extraction, transformation, and analysis will be crucial to ensuring data accuracy, quality, and performance across all reporting and visualization platforms. Applying data governance and security best practices to safeguard sensitive information will also be part of your responsibilities, as will active participation in Agile development processes, including sprint planning, daily stand-ups, and reviews.

To excel in this role, you need a minimum of 2-4 years of hands-on experience with the Snowflake cloud data warehouse and with Tableau for dashboard creation and report publishing. Strong proficiency in SQL for data querying, transformation, and analysis is essential, along with a solid understanding of data modeling, warehousing concepts, and performance tuning. Knowledge of data governance, security, and compliance standards and a bachelor's degree in Computer Science, Information Technology, or a related field are required. Experience with cloud platforms such as AWS, Azure, or GCP, a basic understanding of Python or other scripting languages, and familiarity with Agile/Scrum methodologies and development practices are advantageous.

This Data Analyst (Tableau and Snowflake) position is based in Gandhinagar, with a flexible schedule and shift timings of either 3:30 PM - 12:30 AM or 4:30 PM - 1:30 AM, per business needs.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
Experience: You have 3 to 5 years of experience in data engineering and analytics, and you are ready to experiment, learn, and implement in a new adventure. Your expertise in data engineering and analytics can transform the dynamics of our organization and bring about a paradigm shift. We are eagerly waiting for you to join us, as we believe in selection over rejection.

We are your destination: OptiSol offers you a stress-free and balanced lifestyle, a home away from home where your career is nurtured. As a GREAT PLACE TO WORK certified organization for four consecutive years, we value our culture, open communication, and accessible leadership. We celebrate diversity, promote work-life balance with flexible policies, and provide an environment where you can thrive both personally and professionally. We are the face of the future of AI and innovation, ready to live and learn together.

What we like to see in you: Your core competencies include expertise in data engineering, cloud services, pre-sales, and client handling. You excel at handling, transforming, and optimizing data pipelines for seamless workflows, with a focus on making data work smarter, not harder. You have experience with cloud services such as AWS, Azure, or GCP, developing scalable, efficient, and secure data solutions that drive business growth, and you are adept at bridging the gap between technical capabilities and business needs, crafting compelling solutions that add real value to clients.

What we expect: 3 to 5 years of data engineering experience, preferably with presales or client-facing exposure; proficiency in Python and SQL, with hands-on experience in Snowflake or Databricks; comfort working with cloud platforms like AWS, Azure, or GCP to build scalable solutions; familiarity with AI/ML frameworks; strong knowledge of data engineering, warehousing, and analytics; and excellent communication skills with the ability to manage clients and stakeholders effectively. Cloud or data engineering certifications are a bonus.

What you'll bring to the table: In this role, you will collaborate with sales and business teams to understand client needs and deliver tailored technical solutions. You will conduct demos and proofs of concept to showcase the impact of services and solutions, craft high-quality technical proposals, and provide expert guidance on data strategies, analytics, and AI solutions to clients. Your responsibilities will also include designing and optimizing data pipelines, staying updated on industry trends, and collaborating with product teams to refine offerings based on client needs and market shifts.

Core benefits you'll gain: Working at OptiSol will keep you ahead in the field through engagement with the latest in AI, cloud computing, and data engineering. You will gain hands-on experience in presales, consulting, and stakeholder management; lead workshops; contribute to industry discussions; and build expertise. By collaborating with sales, business, and product teams, you will design impactful solutions, take ownership of projects, and build strong relationships with key stakeholders.

Apply now and join us at OptiSol to embark on this exciting journey together!
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You are a Sr. Data Engineer with over 7 years of experience, specializing in data engineering, Python, and SQL. You will be part of the Data Engineering team in the Enterprise Data Insights organization, responsible for building data solutions, designing ETL/ELT processes, and managing the data platform to support various stakeholders across the organization. Your role is crucial in driving technology- and data-led solutions that foster growth and innovation at scale.

Your responsibilities as a Senior Data Engineer include collaborating with cross-functional stakeholders to prioritize requests, identify areas for improvement, and provide recommendations. You will lead the analysis, design, and implementation of data solutions, including constructing data models and ETL processes; foster collaboration with corporate engineering, product teams, and other engineering groups; and lead and mentor engineering discussions while advocating for best practices.

To excel in this role, you should possess a degree in Computer Science or a related technical field and a proven track record of over 5 years in data engineering, including designing and constructing ETL/ELT processes, managing data solutions in an SLA-driven environment, and developing data products and APIs. Proficiency in SQL/NoSQL databases, particularly Snowflake, Redshift, or MongoDB, along with strong programming skills in Python, is essential. Experience with columnar OLAP databases and data modeling, and with tools like dbt, Airflow, Fivetran, GitHub, and Tableau reporting, is beneficial. Good communication and interpersonal skills are crucial for collaborating effectively with business stakeholders and translating requirements into actionable insights.

A good understanding of Salesforce and NetSuite systems, experience in SaaS environments, experience designing and deploying ML models, and familiarity with event and streaming data are added advantages. Join us in driving data-driven solutions and experiences that shape the future of technology and innovation.
Posted 1 week ago