5 - 8 years
12 - 17 Lacs
Bengaluru
Work from Office
The Core AI & Data Platforms Team has been established to create, operate, and run the Enterprise AI, BI & Data platforms that reduce time to market for reporting, analytics, and data science teams to run experiments, train models, and generate insights, as well as to evolve and run the CoCounsel application and its shared capability, the CoCounsel AI Assistant. Join us and be part of a team committed to excellence, innovation, and the continuous improvement of our data visualization capabilities. We look forward to your contributions towards our strategic goals and your proactive approach to embracing emerging technologies in the field of data analytics.
About the Role: In this opportunity as a Senior Software Engineer, you will:
- Administer and support the Tableau platform, ensuring its stability, reliability, and performance.
- Collaborate with business teams and data analysts to enhance reporting and visualization capabilities.
- Implement and manage security protocols for the Tableau platform.
- Monitor system performance, conduct regular system audits, and provide performance tuning.
- Manage user access and maintain documentation for system configurations, processes, and service records.
- Facilitate the migration and integration of data sources with Tableau to ensure seamless data flow and accurate reporting.
- Support production and non-production build activities.
- Support ad-hoc activities and execute assigned tasks.
- Troubleshoot Tableau Server incidents related to upgrades, patches, data source connectivity, and access.
- Drive the adoption of best practices in data visualization and reporting within the organization.
- Stay abreast of the latest industry trends and updates in Tableau solutions.
About You: You're a fit for the role of Senior Software Engineer if your background includes:
- 5 to 8 years of experience as a Tableau Administrator.
- Ability to analyze application and server logs and interpret errors.
- Excellent written and verbal communication skills and strong collaboration skills.
- Knowledge of SQL and experience with Python scripting.
- Familiarity with AWS services and Snowflake is a plus.
- Certification in Tableau Server Administration.
- A growth mindset with a willingness to adapt and continuously learn.
- Excellent collaboration skills to work with various stakeholders and team members.
- Ability to effectively articulate complex problems and solutions to diverse teams.
- Proven track record of driving improvements and maintaining high standards in data management and reporting.
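For routine administration like the user-access reviews described above, Tableau's official Python client (tableauserverclient) is a common choice. Below is a minimal sketch, assuming a placeholder server URL and personal access token, that signs in and lists each user's site role and last login for an access audit:

```python
import tableauserverclient as TSC

# Placeholder server URL and personal access token (real values come from a vault).
auth = TSC.PersonalAccessTokenAuth("admin-token", "token-secret-value", site_id="")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Page through every user on the site and report their role for the audit.
    for user in TSC.Pager(server.users):
        print(f"{user.name:30s} {user.site_role:20s} last login: {user.last_login}")
```

The same client exposes workbooks, data sources, schedules, and jobs, so the pattern extends naturally to refresh monitoring and content cleanup.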
Posted 1 month ago
5 - 7 years
5 - 18 Lacs
Bengaluru
Work from Office
Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale data pipelines using Snowflake as the primary database engine.
- Collaborate with cross-functional teams to gather requirements and design solutions that meet business needs.
- Develop complex SQL queries to extract insights from massive datasets stored in Snowflake tables.
- Troubleshoot issues related to data quality, performance tuning, and security.
Job Requirements:
- 5-7 years of experience in a similar role as an Azure Data Engineer, or equivalent experience working with AWS services.
- Strong proficiency in the Python programming language for scripting tasks such as ETL processes.
- Experience working with Snowflake as a cloud-based relational database management system (DBMS).
- Knowledge of best practices for designing scalable architectures for big-data workloads.
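As an illustration of the pipeline work described above, here is a minimal sketch using the snowflake-connector-python package to apply an incremental MERGE from a staging table into a target table. The connection parameters, table names, and columns are all hypothetical:

```python
import snowflake.connector

# Hypothetical connection parameters; in practice these come from a secrets manager.
conn = snowflake.connector.connect(
    account="myorg-myaccount",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

# Incremental upsert: fold new and changed rows from staging into the target table.
MERGE_SQL = """
MERGE INTO orders AS t
USING orders_staging AS s
  ON t.order_id = s.order_id
WHEN MATCHED THEN UPDATE SET t.status = s.status, t.updated_at = s.updated_at
WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
  VALUES (s.order_id, s.status, s.updated_at)
"""

cur = conn.cursor()
try:
    cur.execute(MERGE_SQL)
    print(f"Rows affected: {cur.rowcount}")
finally:
    cur.close()
    conn.close()
```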
Posted 1 month ago
5 - 10 years
13 - 23 Lacs
Bengaluru, Delhi / NCR, Mumbai (All Areas)
Work from Office
Role & responsibilities: Urgent hiring for one of the reputed MNCs. Experience: 5 or 5+ years. Immediate joiners only. Location: Pan India except Hyderabad. Mandate: Snowflake Development + Snowflake Administration (Snowflake DBA).
Posted 1 month ago
6 - 10 years
8 - 18 Lacs
Kolhapur, Hyderabad, Chennai
Work from Office
Relevant Exp: 5+ yrs. Mandatory skills: Snowflake architecture, Matillion, SQL, Python, SnowSQL, and experience with any cloud. Night shift (6 PM to 3 AM). Complete WFO - 5 days. Email: anusha@akshayaitsolutions.com. Locations: Hyderabad / Bangalore / Chennai / Kolhapur.
Posted 1 month ago
9 - 14 years
19 - 32 Lacs
Gurugram
Remote
ONLY Immediate Joiners. Requirement: Data Architect & Business Intelligence. Experience: 9+ years. Location: Remote.
Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.
Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.
Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.
Posted 1 month ago
8 - 13 years
25 - 35 Lacs
Gurugram
Hybrid
Requirement: Senior Business Analyst (Data Application & Integration). Experience: 8+ years. Location: Gurgaon (Hybrid).
Job Summary: We are seeking an experienced Senior Business Analyst (Data Application & Integration) to drive key data and integration initiatives. The ideal candidate will have a strong business analysis background and a deep understanding of data applications, API integrations, and cloud-based platforms like Salesforce, Informatica, DBT, IICS, and Snowflake.
Key Responsibilities:
- Gather, document, and analyze business requirements for data application and integration projects.
- Work closely with business stakeholders to translate business needs into technical solutions.
- Design and oversee API integrations to ensure seamless data flow across platforms.
- Collaborate with cross-functional teams including developers, data engineers, and architects.
- Define and maintain data integration strategies, ensuring high availability and security.
- Work on Salesforce, Informatica, and Snowflake to streamline data management and analytics.
- Develop use cases, process flows, and documentation to support business and technical teams.
- Ensure compliance with data governance and security best practices.
- Act as a liaison between business teams and technical teams, providing insights and recommendations.
Key Skills & Requirements:
- Strong expertise in business analysis methodologies and data-driven decision-making.
- Hands-on experience with API integration and data application management.
- Proficiency in Salesforce, Informatica, DBT, IICS, and Snowflake.
- Strong analytical and problem-solving skills.
- Ability to work in an Agile environment and collaborate with multi-functional teams.
- Excellent communication and stakeholder management skills.
Posted 1 month ago
12 - 15 years
25 - 40 Lacs
Bengaluru
Work from Office
Required skillset:
- Data Modeling (Conceptual, Logical, Physical) - minimum 5 years
- Database Technologies (SQL Server, Oracle, PostgreSQL, NoSQL) - minimum 5 years
- Cloud Platforms (AWS, Azure, GCP) - minimum 3 years
- ETL Tools (Informatica, Talend, Apache NiFi) - minimum 3 years
- Big Data Technologies (Hadoop, Spark, Kafka) - minimum 5 years
- Data Governance & Compliance (GDPR, HIPAA) - minimum 3 years
- Master Data Management (MDM) - minimum 3 years
- Data Warehousing (Snowflake, Redshift, BigQuery) - minimum 3 years
- API Integration & Data Pipelines - good to have
- Performance Tuning & Optimization - minimum 3 years
- Business Intelligence (Power BI, Tableau) - minimum 3 years
Job Description: We are seeking experienced Data Architects to design and implement enterprise data solutions, ensuring data governance, quality, and advanced analytics capabilities. The ideal candidate will have expertise in defining data policies, managing metadata, and leading data migrations from legacy systems to Microsoft Fabric/Databricks/Snowflake. Experience and deep knowledge of at least one of these three platforms is critical. Additionally, they will play a key role in identifying use cases for advanced analytics and developing machine learning models to drive business insights.
Key Responsibilities:
1. Data Governance & Management
- Establish and maintain a Data Usage Hierarchy to ensure structured data access.
- Define data policies, standards, and governance frameworks to ensure consistency and compliance.
- Implement Data Quality Management practices to improve accuracy, completeness, and reliability (a minimal quality-check sketch follows this posting).
- Oversee Metadata and Master Data Management (MDM) to enable seamless data integration across platforms.
2. Data Architecture & Migration
- Lead the migration of data systems from legacy infrastructure to Microsoft Fabric.
- Design scalable, high-performance data architectures that support business intelligence and analytics.
- Collaborate with IT and engineering teams to ensure efficient data pipeline development.
3. Advanced Analytics & Machine Learning
- Identify and define use cases for advanced analytics that align with business objectives.
- Design and develop machine learning models to drive data-driven decision-making.
- Work with data scientists to operationalize ML models and ensure real-world applicability.
Required Qualifications:
- Proven experience as a Data Architect or similar role in data management and analytics.
- Strong knowledge of data governance frameworks, data quality management, and metadata management.
- Hands-on experience with Microsoft Fabric and data migration from legacy systems.
- Expertise in advanced analytics, machine learning models, and AI-driven insights.
- Familiarity with data modelling, ETL processes, and cloud-based data solutions (Azure, AWS, or GCP).
- Strong communication skills with the ability to translate complex data concepts into business insights.
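To make the Data Quality Management responsibility concrete, here is a minimal, illustrative sketch of automated quality checks on a table extract using pandas; the file name, column names, and thresholds are all hypothetical:

```python
import pandas as pd

# Hypothetical extract of a customer master table.
df = pd.read_csv("customers_extract.csv")

checks = {
    # Completeness: key business columns should rarely be null.
    "email_null_rate_ok": df["email"].isna().mean() < 0.02,
    # Uniqueness: the business key must not contain duplicates.
    "customer_id_unique": df["customer_id"].is_unique,
    # Validity: country codes should come from a controlled vocabulary.
    "country_code_valid": df["country_code"].isin(["IN", "US", "GB", "DE"]).all(),
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
print("All data quality checks passed.")
```

In a governance framework, checks like these would run on a schedule and feed a data-quality scorecard rather than a simple print statement.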
Posted 1 month ago
3 - 7 years
3 - 7 Lacs
Bengaluru
Hybrid
Hello everyone, we are hiring for a Specialist Software Engineer - Big Data: Snowflake (Snowpark), Scala, Python, Linux; 3 to 7 years of experience. If anyone is interested, please share your CV with tjagadishwarachari@primusglobal.com. Thanks & Regards, Thanu Shree J, Associate - TA, PRIMUS Global Technologies Pvt. Ltd.
Posted 1 month ago
3 - 6 years
4 - 6 Lacs
Pune
Work from Office
Job Description: We are seeking a highly motivated and skilled Data Engineer with a strong background in building and maintaining data pipelines. The ideal candidate will have experience working with Python, ETL processes, SQL, and Databricks to support our client's data infrastructure and analytics needs.
Key Responsibilities:
- Design, build, and maintain scalable and efficient ETL pipelines
- Develop robust data workflows using Databricks and cloud-based data platforms
- Write complex SQL queries for data extraction, transformation, and reporting
- Work with structured and semi-structured data, ensuring data quality and integrity
- Collaborate with data analysts, scientists, and other engineering teams to deliver high-quality data solutions
- Optimize data processes for performance, reliability, and scalability
Required Skills:
- Minimum 3 years of experience in Data Engineering
- Proficiency in Python for data processing and automation
- Strong hands-on experience in ETL development
- Advanced SQL skills for querying and managing relational databases
- Experience working with Databricks (Spark/PySpark) for big data processing
- Familiarity with version control tools like Git
- Experience with cloud platforms (e.g., AWS, Azure, or GCP) is a plus
Preferred Skills:
- Knowledge of data modeling, data warehousing, and performance tuning
- Exposure to tools like Airflow, Kafka, or other orchestration frameworks
- Understanding of data governance and security best practices
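For a sense of the day-to-day work, the following is a minimal PySpark sketch of the kind of ETL step described above, as it might run on Databricks: read raw CSV, deduplicate and type-cast, then write a curated Delta table. Paths and column names are placeholders:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingest raw, semi-structured data (placeholder path).
raw = spark.read.option("header", True).csv("/mnt/raw/orders/")

# Basic quality and transformation step: dedupe, cast types, drop bad rows.
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("amount") > 0)
)

# Write a curated Delta table for downstream analytics (Databricks' default format).
curated.write.mode("overwrite").format("delta").save("/mnt/curated/orders/")
```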
Posted 1 month ago
5 - 10 years
9 - 19 Lacs
Bangalore Rural, Bengaluru
Work from Office
Job Summary: We are seeking an experienced Data Engineer with expertise in Snowflake and PL/SQL to design, develop, and optimize scalable data solutions. The ideal candidate will be responsible for building robust data pipelines, managing integrations, and ensuring efficient data processing within the Snowflake environment. This role requires a strong background in SQL, data modeling, and ETL processes, along with the ability to troubleshoot performance issues and collaborate with cross-functional teams.
Responsibilities:
- Design, develop, and maintain data pipelines in Snowflake to support business analytics and reporting.
- Write optimized PL/SQL queries, stored procedures, and scripts for efficient data processing and transformation.
- Integrate and manage data from various structured and unstructured sources into the Snowflake data platform.
- Optimize Snowflake performance by tuning queries, managing workloads, and implementing best practices.
- Collaborate with data architects, analysts, and business teams to develop scalable and high-performing data solutions.
- Ensure data security, integrity, and governance while handling large-scale datasets.
- Automate and streamline ETL/ELT workflows for improved efficiency and data consistency.
- Monitor, troubleshoot, and resolve data quality issues, performance bottlenecks, and system failures.
- Stay updated on Snowflake advancements, best practices, and industry trends to enhance data engineering capabilities.
Required Skills:
- Bachelor's degree in Engineering, Computer Science, Information Technology, or a related field.
- Strong experience in Snowflake, including designing, implementing, and optimizing Snowflake-based solutions.
- Hands-on expertise in PL/SQL, including writing and optimizing complex queries, stored procedures, and functions.
- Proven ability to work with large datasets, data warehousing concepts, and cloud-based data management.
- Proficiency in SQL, data modeling, and database performance tuning.
- Experience with ETL/ELT processes and integrating data from multiple sources.
- Familiarity with cloud platforms such as AWS, Azure, or GCP is an added advantage.
- Snowflake certifications (e.g., SnowPro Core, SnowPro Advanced) are a plus.
- Strong analytical skills, problem-solving abilities, and attention to detail.
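The performance-tuning duties above typically start with finding the worst offenders. Here is a minimal sketch, assuming the snowflake-connector-python package and access to the SNOWFLAKE.ACCOUNT_USAGE share, that pulls the last day's ten slowest queries; all connection values are placeholders:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",  # placeholder credentials
    user="monitor_user",
    password="***",
    warehouse="ADMIN_WH",
)

cur = conn.cursor()
try:
    cur.execute("""
        SELECT query_id,
               total_elapsed_time / 1000 AS elapsed_seconds,
               warehouse_name,
               LEFT(query_text, 120) AS query_snippet
        FROM snowflake.account_usage.query_history
        WHERE start_time >= DATEADD('day', -1, CURRENT_TIMESTAMP())
        ORDER BY total_elapsed_time DESC
        LIMIT 10
    """)
    for query_id, seconds, warehouse, snippet in cur:
        print(f"{seconds:8.1f}s  {warehouse:15s}  {snippet}")
finally:
    cur.close()
    conn.close()
```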
Posted 1 month ago
2 - 7 years
4 - 9 Lacs
Hyderabad
Work from Office
Diverse Lynx is looking for a Snowflake Professional to join our dynamic team and embark on a rewarding career journey.
- Design and Development: Create and implement data warehouse solutions using Snowflake, including data modeling, schema design, and ETL (Extract, Transform, Load) processes.
- Performance Optimization: Optimize queries, performance-tune databases, and ensure efficient use of Snowflake resources for faster data retrieval and processing.
- Data Integration: Integrate data from various sources, ensuring compatibility, consistency, and accuracy.
- Security and Compliance: Implement security measures and ensure compliance with data governance and regulatory requirements, including access control and data encryption.
- Monitoring and Maintenance: Monitor system performance, troubleshoot issues, and perform routine maintenance tasks to ensure system health and reliability.
- Collaboration: Collaborate with other teams, such as data engineers, analysts, and business stakeholders, to understand requirements and deliver effective data solutions.
Skills and Qualifications:
- Snowflake Expertise: In-depth knowledge and hands-on experience working with Snowflake's architecture, features, and functionalities.
- SQL and Database Skills: Proficiency in SQL querying and database management, with a strong understanding of relational databases and data warehousing concepts.
- Data Modeling: Experience in designing and implementing effective data models for optimal performance and scalability.
- ETL Tools and Processes: Familiarity with ETL tools and processes to extract, transform, and load data into Snowflake.
- Performance Tuning: Ability to identify and resolve performance bottlenecks, optimize queries, and improve overall system performance.
- Data Security and Compliance: Understanding of data security best practices, encryption methods, and compliance standards (such as GDPR, HIPAA, etc.).
- Problem-Solving and Troubleshooting: Strong analytical and problem-solving skills to diagnose and resolve issues within the Snowflake environment.
- Communication and Collaboration: Good communication skills to interact with cross-functional teams and effectively translate business requirements into technical solutions.
- Scripting and Automation: Knowledge of scripting languages (like Python) and experience in automating processes within Snowflake.
Posted 1 month ago
7 - 10 years
15 - 20 Lacs
Hyderabad, Pune, Chennai
Work from Office
Hiring for a Top IT Company. Designation: ETL Tester. Skills: ETL Testing + Data Warehouse + Snowflake + Azure. Location: Bangalore/Hyderabad/Pune/Chennai. Exp: 5-10 yrs. Best CTC. Call: Nikita: 9549352329, Sonica: 9460934330, Tara: 6377522517. Thanks, Team Converse
Posted 1 month ago
6 - 10 years
8 - 12 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled Senior Software Engineer with a strong background in cloud-based ERP systems to join our dynamic technology team. The ideal candidate will bring a solid mix of technical and functional ERP expertise. A strong command of SQL/PL-SQL, experience with enterprise databases (e.g., Snowflake, Oracle, Azure, PostgreSQL), and the ability to troubleshoot and enhance performance are essential for success in this position.
Responsibilities:
- Design, develop, and maintain ERP-related solutions and customizations on platforms like Oracle Cloud ERP, EBS, Workday, SAP S/4HANA, or similar.
- Work closely with functional teams to understand business requirements and translate them into technical specifications and solutions.
- Serve as a bridge between functional stakeholders and technical teams, offering insight into ERP system capabilities and configurations.
- Write and optimize complex SQL and PL/SQL queries to ensure fast, accurate data retrieval across large datasets.
- Manage and maintain various databases, including Oracle, Snowflake, Azure, and PostgreSQL.
- Build and maintain robust ETL pipelines for seamless data movement between systems.
- Ensure data integrity and consistency across ERP modules and third-party integrations.
- Identify performance bottlenecks and resolve system issues in ERP applications and underlying databases.
- Participate in Agile/Scrum ceremonies and contribute to continuous improvement initiatives.
- Develop automation scripts for routine tasks, deployments, or testing where applicable.
- Leverage scripting to improve operational efficiency and reduce manual effort.
Qualifications:
- A bachelor's degree in computer science, or equivalent experience
- Overall 6+ years of experience in technical leadership
- Experience working with database technologies including PostgreSQL, Snowflake, Oracle, and others
- Proficient in SQL and PL/SQL, and familiar with ETL tools or frameworks
- Excellent analytical and troubleshooting skills
- Strong communication and interpersonal skills with a collaborative mindset
- Experience in programming languages like C# is a plus
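To ground the SQL/PL-SQL requirement, here is a minimal sketch, assuming the python-oracledb driver and a hypothetical PL/SQL procedure refresh_gl_balances(period, rows_out) in the ERP database, that invokes the procedure and reads an OUT parameter:

```python
import oracledb

# Placeholder connection details for an ERP database.
conn = oracledb.connect(user="erp_app", password="***", dsn="erphost/erppdb")

cur = conn.cursor()
try:
    # Bind an OUT parameter to capture how many rows the procedure touched.
    rows_out = cur.var(oracledb.DB_TYPE_NUMBER)
    # refresh_gl_balances is a hypothetical PL/SQL procedure, not a real ERP API.
    cur.callproc("refresh_gl_balances", ["2024-Q4", rows_out])
    conn.commit()
    print(f"Rows refreshed: {rows_out.getvalue()}")
finally:
    cur.close()
    conn.close()
```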
Posted 1 month ago
3 - 5 years
4 - 8 Lacs
Hyderabad
Work from Office
Sr Associate Software Engineer – Finance
What you will do: The role is responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member that assists in design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help to improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation
What we expect of you: We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
- Master’s degree and 1 to 3 years of Computer Science, IT or related field experience; OR Bachelor’s degree and 3 to 5 years of Computer Science, IT or related field experience; OR Diploma and 7 to 9 years of Computer Science, IT or related field experience
- Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience in using Databricks for building ETL pipelines and handling big data processing
- Experience with data warehousing platforms such as Amazon Redshift or Snowflake
- Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL)
- Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets (a minimal Kafka consumer sketch follows this posting)
- Experienced with software engineering best practices, including but not limited to version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GitLab, etc.), automated unit testing, and DevOps
Preferred Qualifications:
- Experience with cloud platforms such as AWS, particularly data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena)
- Experience with the Anaplan platform, including building, managing, and optimizing models and workflows, including scalable data integrations
- Understanding of machine learning pipelines and frameworks for ML/AI models
Professional Certifications:
- AWS Certified Data Engineer (preferred)
- Databricks Certified (preferred)
Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills
What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
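Since the role calls for familiarity with Kafka for large datasets, here is a minimal consumer sketch using the confluent-kafka package; the broker address, topic, and group id are placeholders:

```python
import json
from confluent_kafka import Consumer

# Placeholder broker/topic/group -- real values would come from configuration.
consumer = Consumer({
    "bootstrap.servers": "broker.example.com:9092",
    "group.id": "finance-etl",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["gl-transactions"])

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1s for a record
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        event = json.loads(msg.value())
        # Downstream step would validate and land the event in the warehouse.
        print(event.get("transaction_id"), event.get("amount"))
finally:
    consumer.close()
```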
Posted 1 month ago
5 - 10 years
25 - 27 Lacs
Hyderabad
Work from Office
What you will do: Let’s do this. Let’s change the world. In this vital role you will manage and oversee the development of robust data and analytic solutions, while mentoring and guiding a team of data engineers. You will be responsible for leading the development, implementation, and management of enterprise-level data and analytics applications that support the organization's data-driven strategic initiatives. You will continuously strive for innovation in the technologies and practices used for data engineering and develop a team of expert data engineers. This role will collaborate closely with counterparts in the US and EU. You will collaborate with cross-functional teams, including platform, functional IT, and business stakeholders, to ensure that the solutions that are built align with business goals and are scalable, secure, and efficient.
Roles & Responsibilities:
- Lead multiple data engineering teams that are responsible for product/project deliveries.
- Lead the planning, execution, and delivery of data and analytics solutions, ensuring they are completed on time, within scope, and within budget.
- Oversee the architecture, design, and implementation of scalable, high-performance data and analytic solutions (applications) that include data analysis, data ingestion, data transformation (data pipelines), and analytics.
- Develop and manage project plans, timelines, and budgets, and communicate progress to stakeholders regularly.
- Manage the RunOps for data pipelines and analytics solutions.
- Build and nurture strong relationships with stakeholders, emphasizing value-focused engagement and partnership to align data initiatives with broader business goals.
- Lead and motivate a high-performing data engineering team to deliver exceptional results.
- Provide expert guidance and mentorship to the data engineering team, fostering a culture of innovation and best practices.
- Collaborate with counterparts in the US and EU, and work with business functions, functional IT teams, and others to understand their data needs and ensure the solutions meet the requirements.
- Engage with business stakeholders to understand their needs and priorities, ensuring that the data and analytics solutions built deliver real value and meet business objectives.
- Drive adoption of the data and analytics solutions by partnering with business stakeholders and functional IT teams in rolling out change management, trainings, communications, etc.
- Stay abreast of emerging data technologies and explore opportunities for innovation.
- Talent Growth & People Leadership: Lead, mentor, and manage a high-performing team of engineers, fostering an environment that encourages learning, collaboration, and innovation. Focus on nurturing future leaders and providing growth opportunities through coaching, training, and mentorship.
- Recruitment & Team Expansion: Develop a comprehensive talent strategy that includes recruitment, retention, onboarding, and career development, and build a diverse and inclusive team that drives innovation, aligns with Amgen's culture and values, and delivers business priorities.
- Organizational Leadership: Work closely with senior leaders within the function and across the Amgen India site to align engineering goals with broader organizational objectives, and demonstrate leadership by contributing to strategic discussions.
What we expect of you: We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
- Master’s degree and 10 to 14 years of experience (computer science and engineering preferred; other engineering fields will be considered); OR Bachelor’s degree and 14 to 16 years of experience (computer science and engineering preferred; other engineering fields will be considered); OR Diploma and 18 to 20 years of experience (computer science and engineering preferred; other engineering fields will be considered)
- 5+ years of experience managing a team of data engineers.
- 5+ years of experience in leading enterprise-scale data and analytics solutions development.
- Demonstrated proficiency in leveraging cloud platforms (AWS, Azure, GCP) for data engineering solutions.
- Strong understanding of cloud architecture principles and cost optimization strategies.
- Experience using Databricks, Snowflake, Python, Power BI, Tableau.
- Proven ability to lead and develop high-performing data engineering teams.
- Strong problem-solving, analytical, and critical thinking skills to address complex data challenges.
- Excellent leadership and project management skills, with the ability to manage multiple priorities simultaneously.
Preferred Qualifications:
- Prior experience in data modeling, especially star-schema modeling concepts.
- Familiarity with ontologies, information modeling, and graph databases.
- Experience working with agile development methodologies such as Scaled Agile.
- Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
Education and Professional Certifications:
- SAFe for Teams certification (preferred)
Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills
What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 month ago
5 - 10 years
11 - 15 Lacs
Hyderabad
Work from Office
Role Name: BI Platform Administrator. Job Posting Title: BI Platform Administrator. Workday Job Profile: BI Platform Administrator. Department Name: Digital, Technology & Innovation. Role GCF: 4. Location: Hyderabad, India. Job Type: Full-time.
ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 45 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today.
ABOUT THE ROLE
Role Description: The role is responsible for performance monitoring, maintenance, and reliable operation of BI platforms, BI servers, and databases. This role involves managing BI servers and user admin management for different environments, ensuring data is stored and retrieved efficiently, safeguarding sensitive information, and ensuring the uptime, performance, and security of IT infrastructure and software maintenance. We are seeking a skilled BI Platform Administrator to manage, maintain, and optimize our enterprise Power BI and Tableau platforms. The ideal candidate will ensure seamless performance, governance, user access, platform upgrades, troubleshooting, and best practices across our BI environments.
Roles & Responsibilities:
- Administer and maintain Power BI Service, Power BI Report Server, and Tableau Server/Online on any cloud platform (AWS, Azure, GCP); AWS Cloud experience preferred.
- Configure, monitor, and optimize performance, capacity, and availability of BI platforms.
- Set up and manage user roles, permissions, and security policies.
- Manage BI platform upgrades, patches, and migrations.
- Monitor scheduled data refreshes and troubleshoot failures.
- Implement governance frameworks to ensure compliance with data policies.
- Collaborate with BI developers, data engineers, and business users for efficient platform usage.
- Automate routine administrative tasks using scripts (PowerShell, Python, etc.).
- Create and maintain documentation of configurations and operational procedures.
- Install, configure, and maintain BI tools on different operating systems, servers, and applications to ensure their reliability and performance
- Monitor platform performance and uptime, addressing any issues that arise promptly to prevent service interruptions
- Implement and maintain security measures to protect platforms from unauthorized access, vulnerabilities, and other threats
- Manage backup procedures and ensure data is securely backed up and recoverable in case of system failures
- Provide technical support to users, troubleshooting and resolving issues related to system access, performance, and software
- Apply operating system updates, patches, and configuration changes as necessary
- Maintain detailed documentation of platform configurations, procedures, and change management
- Work closely with network administrators, database administrators, and other IT professionals to ensure that platforms are integrated and functioning optimally
- Install, configure, and maintain database management platforms (BI), ensuring services are reliable and perform optimally
- Monitor and optimize database performance, including query tuning, indexing, and resource allocation
- Maintain detailed documentation of platform configurations, procedures, and policies
- Work closely with developers, data engineers, system administrators, and other IT staff to support database-related needs and ensure optimal platform performance
Basic Qualifications and Experience:
- Overall 5+ years of experience administering BI platforms preferred.
- 3+ years of experience administering Power BI Service and/or Power BI Report Server.
- 2+ years of experience administering Tableau Server or Tableau Cloud.
- Strong knowledge of Active Directory, SSO/SAML, and Role-Based Access Control (RBAC).
- Experience with platform monitoring and troubleshooting (Power BI Gateway logs, Tableau logs, etc.).
- Scripting experience (e.g., PowerShell, DAX, or Python) for automation and monitoring.
- Strong understanding of data governance, row-level security, and compliance practices.
- Experience working with enterprise data sources (SQL Server, Snowflake, Oracle, etc.).
- Familiarity with capacity planning, load balancing, and scaling strategies for BI tools.
Functional Skills:
Should Have:
- Knowledge of Power BI Premium Capacity Management and Tableau Resource Management.
- Experience integrating BI platforms with CI/CD pipelines and DevOps tools.
- Hands-on experience in user adoption tracking, audit logging, and license management.
- Ability to conduct health checks and implement performance tuning recommendations.
- Understanding of multi-tenant environments or large-scale deployments.
Good to Have:
- Experience with the Power BI REST API or Tableau REST API for automation (see the sketch after this list).
- Familiarity with AWS services and/or AWS equivalents.
- Background in data visualization or report development for better user collaboration.
- Exposure to other BI tools (e.g., Looker, Qlik, MicroStrategy).
- Knowledge of ITIL practices or experience working in a ticket-based support environment.
- Experience in a regulated industry (finance, healthcare, etc.) with strong compliance requirements.
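For the Power BI REST API automation mentioned above, here is a minimal sketch that triggers and then checks a dataset refresh; it assumes an Azure AD access token has already been acquired (e.g., via MSAL) and uses placeholder workspace and dataset IDs:

```python
import requests

# Assumptions: a valid Azure AD token with Power BI API permissions,
# plus placeholder workspace (group) and dataset IDs.
TOKEN = "<azure-ad-access-token>"
GROUP_ID = "00000000-0000-0000-0000-000000000000"
DATASET_ID = "11111111-1111-1111-1111-111111111111"
BASE = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Trigger an on-demand dataset refresh.
resp = requests.post(f"{BASE}/refreshes", headers=HEADERS)
resp.raise_for_status()

# Check the most recent refresh attempts and their status.
history = requests.get(f"{BASE}/refreshes?$top=5", headers=HEADERS)
history.raise_for_status()
for refresh in history.json()["value"]:
    print(refresh["status"], refresh.get("startTime"), refresh.get("endTime"))
```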
Education & Experience: Master’s degree with 2-3+ years of experience in Business, Engineering, IT or related field; OR Bachelor’s degree with 5-6+ years of experience in Business, Engineering, IT or related field; OR Diploma with 8+ years of experience in Business, Engineering, IT or related field.
Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Posted 1 month ago
5 - 10 years
7 - 11 Lacs
Hyderabad
Work from Office
Role Name: BI Platform Administrator. Job Posting Title: BI Platform Administrator. Workday Job Profile: BI Platform Administrator. Department Name: Digital, Technology & Innovation. Role GCF: 4. Location: Hyderabad, India. Job Type: Full-time.
ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 45 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today.
ABOUT THE ROLE
Role Description: The role is responsible for performance monitoring, maintenance, and reliable operation of BI platforms, BI servers, and databases. This role involves managing BI servers and user admin management for different environments, ensuring data is stored and retrieved efficiently, safeguarding sensitive information, and ensuring the uptime, performance, and security of IT infrastructure and software maintenance. We are seeking a skilled BI Platform Administrator to manage, maintain, and optimize our enterprise Power BI and Tableau platforms. The ideal candidate will ensure seamless performance, governance, user access, platform upgrades, troubleshooting, and best practices across our BI environments.
Roles & Responsibilities:
- Administer and maintain Power BI Service, Power BI Report Server, and Tableau Server/Online on any cloud platform (AWS, Azure, GCP); AWS Cloud experience preferred.
- Configure, monitor, and optimize performance, capacity, and availability of BI platforms.
- Set up and manage user roles, permissions, and security policies.
- Manage BI platform upgrades, patches, and migrations.
- Monitor scheduled data refreshes and troubleshoot failures.
- Implement governance frameworks to ensure compliance with data policies.
- Collaborate with BI developers, data engineers, and business users for efficient platform usage.
- Automate routine administrative tasks using scripts (PowerShell, Python, etc.).
- Create and maintain documentation of configurations and operational procedures.
- Install, configure, and maintain BI tools on different operating systems, servers, and applications to ensure their reliability and performance
- Monitor platform performance and uptime, addressing any issues that arise promptly to prevent service interruptions
- Implement and maintain security measures to protect platforms from unauthorized access, vulnerabilities, and other threats
- Manage backup procedures and ensure data is securely backed up and recoverable in case of system failures
- Provide technical support to users, troubleshooting and resolving issues related to system access, performance, and software
- Apply operating system updates, patches, and configuration changes as necessary
- Maintain detailed documentation of platform configurations, procedures, and change management
- Work closely with network administrators, database administrators, and other IT professionals to ensure that platforms are integrated and functioning optimally
- Install, configure, and maintain database management platforms (BI), ensuring services are reliable and perform optimally
- Monitor and optimize database performance, including query tuning, indexing, and resource allocation
- Maintain detailed documentation of platform configurations, procedures, and policies
- Work closely with developers, data engineers, system administrators, and other IT staff to support database-related needs and ensure optimal platform performance
Basic Qualifications and Experience:
- Overall 5+ years of experience administering BI platforms preferred.
- 3+ years of experience administering Power BI Service and/or Power BI Report Server.
- 2+ years of experience administering Tableau Server or Tableau Cloud.
- Strong knowledge of Active Directory, SSO/SAML, and Role-Based Access Control (RBAC).
- Experience with platform monitoring and troubleshooting (Power BI Gateway logs, Tableau logs, etc.).
- Scripting experience (e.g., PowerShell, DAX, or Python) for automation and monitoring.
- Strong understanding of data governance, row-level security, and compliance practices.
- Experience working with enterprise data sources (SQL Server, Snowflake, Oracle, etc.).
- Familiarity with capacity planning, load balancing, and scaling strategies for BI tools.
Functional Skills:
Should Have:
- Knowledge of Power BI Premium Capacity Management and Tableau Resource Management.
- Experience integrating BI platforms with CI/CD pipelines and DevOps tools.
- Hands-on experience in user adoption tracking, audit logging, and license management.
- Ability to conduct health checks and implement performance tuning recommendations.
- Understanding of multi-tenant environments or large-scale deployments.
Good to Have:
- Experience with the Power BI REST API or Tableau REST API for automation.
- Familiarity with AWS services and/or AWS equivalents.
- Background in data visualization or report development for better user collaboration.
- Exposure to other BI tools (e.g., Looker, Qlik, MicroStrategy).
- Knowledge of ITIL practices or experience working in a ticket-based support environment.
- Experience in a regulated industry (finance, healthcare, etc.) with strong compliance requirements.
Education & Experience: Master’s degree with 1-2+ years of experience in Business, Engineering, IT or related field; OR Bachelor’s degree with 2-3+ years of experience in Business, Engineering, IT or related field; OR Diploma with 5+ years of experience in Business, Engineering, IT or related field.
Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Posted 1 month ago
2 - 3 years
0 - 0 Lacs
Thiruvananthapuram
Work from Office
Role Proficiency: Acts under very minimal guidance to develop error-free code, and to test and document applications.
Outcomes:
- Understand the application's features and component design, and develop them in accordance with user stories/requirements.
- Code, debug, test, and document, and communicate product/component/feature development stages.
- Develop optimized code with appropriate approaches and algorithms, following standards and security guidelines independently.
- Effectively interact with customers and articulate their input.
- Optimize efficiency, cost, and quality by identifying opportunities for automation/process improvements and agile delivery models.
- Mentor Developer I - Software Engineering to become more effective in their role.
- Learn the technology, business domain, and system domain, as recommended by the project/account.
- Set FAST goals and provide feedback on the FAST goals of mentees.
Measures of Outcomes:
- Adherence to engineering processes and standards (coding standards)
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of recurrence of known defects
- Quick turnaround of production bugs
- Meeting the defined productivity standards for the project
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
Outputs Expected:
- Configure: Follow the configuration process.
- Test: Create and conduct unit testing.
- Domain relevance: Develop features and components with a good understanding of the business problem being addressed for the client.
- Manage defects: Raise, fix, and retest defects.
- Estimate: Estimate time, effort, and resource dependence for one's own work.
- Mentoring: Mentor junior developers in the team; set FAST goals and provide feedback on the FAST goals of mentees.
- Document: Create documentation for one's own work.
- Manage knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities.
- Status reporting: Report the status of assigned tasks and comply with project-related reporting standards/processes.
- Release: Adhere to the release management process.
- Design: Understand the design/LLD and link it to requirements/user stories.
- Code: Develop code, with guidance, for the above.
Skill Examples:
- Explain and communicate the design/development to the customer
- Perform and evaluate test results against product specifications
- Develop user interfaces, business software components, and embedded software components
- Manage and guarantee high levels of cohesion and quality
- Use data models
- Estimate the effort and time required for own work
- Perform and evaluate tests in the customer's or target environments
- Team player
- Good written and verbal communication abilities
- Proactively ask for and offer help
Knowledge Examples:
- Appropriate software programs/modules
- Technical designing
- Programming languages
- DBMS
- Operating systems and software platforms
- Integrated development environments (IDE)
- Agile methods
- Knowledge of the customer domain and sub-domain where the problem is solved
Additional Comments:
Responsibilities and Skills:
- Manage incident response and root cause analysis, and ensure high system availability.
- Oversee support for Hadoop, Spark, Hive, PySpark, Snowflake, and AWS EMR.
- Maintain Python Flask APIs, Scala applications, and Airflow workflows.
- Optimize SQL/HQL queries and manage shell/bash scripts.
- Develop monitoring and alerting systems, and provide detailed reporting.
- 3+ years in production support/data engineering, with team leadership.
- Expertise in Hadoop, Spark, Hive, PySpark, SQL, HQL, Python, Scala, and Python Flask APIs.
- Proficiency in Unix/Linux, shell/bash scripting, Snowflake, and AWS EMR.
- Experience with Airflow and incident management (a minimal DAG sketch follows this posting).
- Strong problem-solving and communication skills.
Required Skills: Python, PySpark, Airflow
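As a small illustration of the Airflow workflow maintenance this role covers, here is a minimal DAG sketch (Airflow 2.x) that runs a nightly Hive load followed by a PySpark job; the DAG id, schedule, and commands are hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical nightly pipeline: load Hive staging, then run a PySpark aggregation.
with DAG(
    dag_id="nightly_hive_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",  # 2 AM daily
    catchup=False,
    tags=["production-support"],
) as dag:
    load_hive = BashOperator(
        task_id="load_hive_staging",
        bash_command="beeline -f /opt/etl/load_staging.hql",
    )
    run_spark = BashOperator(
        task_id="run_pyspark_aggregation",
        bash_command="spark-submit /opt/etl/aggregate_daily.py",
    )
    load_hive >> run_spark
```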
Posted 1 month ago
3 - 5 years
4 - 8 Lacs
Gurugram
Work from Office
AHEAD builds platforms for digital business. By weaving together advances in cloud infrastructure, automation and analytics, and software delivery, we help enterprises deliver on the promise of digital transformation.
At AHEAD, we prioritize creating a culture of belonging, where all perspectives and voices are represented, valued, respected, and heard. We create spaces to empower everyone to speak up, make change, and drive the culture at AHEAD. We are an equal opportunity employer, and do not discriminate based on an individual's race, national origin, color, gender, gender identity, gender expression, sexual orientation, religion, age, disability, marital status, or any other protected characteristic under applicable law, whether actual or perceived. We embrace all candidates that will contribute to the diversification and enrichment of ideas and perspectives at AHEAD.
Data Engineer (internally known as a Sr. Associate Technical Consultant)
AHEAD is looking for a Technical Consultant Data Engineer to work closely with our dynamic project teams (both on-site and remotely). This Data Engineer will be responsible for hands-on engineering of data platforms that support our clients' advanced analytics, data science, and other data engineering initiatives. This consultant will build and support modern data environments that reside in the public cloud or multi-cloud enterprise architectures. The Data Engineer will have responsibility for working on a variety of data projects. This includes orchestrating pipelines using modern data engineering tools/architectures as well as design and integration of existing transactional processing systems. As a Data Engineer, you will implement data pipelines to enable analytics and machine learning on rich datasets.
Responsibilities:
- A Data Engineer should be able to build, operationalize, and monitor data processing systems
- Create robust and automated pipelines to ingest and process structured and unstructured data from various source systems into analytical platforms, using batch and streaming mechanisms and leveraging cloud-native toolsets
- Implement custom applications using tools such as Kinesis, Lambda, and other cloud-native tools as required to address streaming use cases (see the Kinesis sketch after this posting)
- Engineer and support data structures, including but not limited to SQL and NoSQL databases
- Engineer and maintain ELT processes for loading data lakes (Snowflake, Cloud Storage, Hadoop)
- Leverage the right tools for the right job to deliver testable, maintainable, and modern data solutions
- Respond to customer/team inquiries and assist in troubleshooting and resolving challenges
- Work with other scrum team members to estimate and deliver work inside of a sprint
- Research data questions, identify root causes, and interact closely with business users and technical resources
Qualifications:
- 3+ years of professional technical experience
- 3+ years of hands-on data warehousing
- 3+ years of experience building highly scalable data solutions using Hadoop, Spark, Databricks, Snowflake
- 2+ years with programming languages such as Python
- 3+ years of experience working in cloud environments (Azure)
- 2 years of experience in Redshift
- Strong client-facing communication and facilitation skills
Key Skills: Python, Azure Cloud, Redshift, NoSQL, Git, ETL/ELT, Spark, Hadoop, Data Warehouse, Data Lake, Data Engineering, Snowflake, SQL/RDBMS, OLAP
Why AHEAD: Through our daily work and internal groups like Moving Women AHEAD and RISE AHEAD, we value and benefit from diversity of people, ideas, experience, and everything in between. We fuel growth by stacking our office with top-notch technologies in a multi-million-dollar lab, by encouraging cross-department training and development, and by sponsoring certifications and credentials for continued learning.
USA Employment Benefits include:
- Medical, Dental, and Vision Insurance
- 401(k)
- Paid company holidays
- Paid time off
- Paid parental and caregiver leave
- Plus more! See https://www.aheadbenefits.com/ for additional details.
The compensation range indicated in this posting reflects the On-Target Earnings (OTE) for this role, which includes a base salary and any applicable target bonus amount. This OTE range may vary based on the candidate's relevant experience, qualifications, and geographic location.
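For the streaming use cases mentioned in the responsibilities above, here is a minimal producer sketch using boto3 to publish events to a Kinesis data stream; the stream name, region, and event shape are placeholders, and AWS credentials are assumed to be configured in the environment:

```python
import json
import boto3

# Placeholder region; credentials come from the standard AWS credential chain.
kinesis = boto3.client("kinesis", region_name="us-east-1")

def publish_event(event: dict) -> None:
    """Publish one event to a Kinesis stream, keyed for even shard distribution."""
    kinesis.put_record(
        StreamName="clickstream-events",  # hypothetical stream
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event["user_id"]),
    )

publish_event({"user_id": 42, "action": "page_view", "path": "/pricing"})
```

A Lambda function subscribed to the same stream would then handle the consume-and-transform side of the pipeline.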
Posted 1 month ago
3 - 5 years
4 - 9 Lacs
Gurugram
Work from Office
AHEAD builds platforms for digital business. By weaving together advances in cloud infrastructure, automation and analytics, and software delivery, we help enterprises deliver on the promise of digital transformation.
At AHEAD, we prioritize creating a culture of belonging, where all perspectives and voices are represented, valued, respected, and heard. We create spaces to empower everyone to speak up, make change, and drive the culture at AHEAD. We are an equal opportunity employer, and do not discriminate based on an individual's race, national origin, color, gender, gender identity, gender expression, sexual orientation, religion, age, disability, marital status, or any other protected characteristic under applicable law, whether actual or perceived. We embrace all candidates that will contribute to the diversification and enrichment of ideas and perspectives at AHEAD.
Data Engineer (internally known as a Technical Consultant)
AHEAD is looking for a Technical Consultant Data Engineer to work closely with our dynamic project teams (both on-site and remotely). This Data Engineer will be responsible for hands-on engineering of data platforms that support our clients' advanced analytics, data science, and other data engineering initiatives. This consultant will build and support modern data environments that reside in the public cloud or multi-cloud enterprise architectures. The Data Engineer will have responsibility for working on a variety of data projects. This includes orchestrating pipelines using modern data engineering tools/architectures as well as design and integration of existing transactional processing systems. As a Data Engineer, you will implement data pipelines to enable analytics and machine learning on rich datasets.
Responsibilities:
- A Data Engineer should be able to build, operationalize, and monitor data processing systems
- Create robust and automated pipelines to ingest and process structured and unstructured data from various source systems into analytical platforms, using batch and streaming mechanisms and leveraging cloud-native toolsets
- Implement custom applications using tools such as Kinesis, Lambda, and other cloud-native tools as required to address streaming use cases
- Engineer and support data structures, including but not limited to SQL and NoSQL databases
- Engineer and maintain ELT processes for loading data lakes (Snowflake, Cloud Storage, Hadoop)
- Leverage the right tools for the right job to deliver testable, maintainable, and modern data solutions
- Respond to customer/team inquiries and assist in troubleshooting and resolving challenges
- Work with other scrum team members to estimate and deliver work inside of a sprint
- Research data questions, identify root causes, and interact closely with business users and technical resources
Qualifications:
- 3+ years of professional technical experience
- 3+ years of hands-on data warehousing
- 3+ years of experience building highly scalable data solutions using Hadoop, Spark, Databricks, Snowflake
- 2+ years with programming languages such as Python
- 3+ years of experience working in cloud environments (Azure)
- 2 years of experience in Redshift
- Strong client-facing communication and facilitation skills
Key Skills: Python, Azure Cloud, Redshift, NoSQL, Git, ETL/ELT, Spark, Hadoop, Data Warehouse, Data Lake, Data Engineering, Snowflake, SQL/RDBMS, OLAP
Why AHEAD: Through our daily work and internal groups like Moving Women AHEAD and RISE AHEAD, we value and benefit from diversity of people, ideas, experience, and everything in between.
Why AHEAD: Through our daily work and internal groups like Moving Women AHEAD and RISE AHEAD, we value and benefit from diversity of people, ideas, experience, and everything in between.

We fuel growth by stacking our office with top-notch technologies in a multi-million-dollar lab, by encouraging cross-department training and development, and by sponsoring certifications and credentials for continued learning.

USA employment benefits include:
- Medical, Dental, and Vision Insurance
- 401(k)
- Paid company holidays
- Paid time off
- Paid parental and caregiver leave
- Plus more! See https://www.aheadbenefits.com/ for additional details.

The compensation range indicated in this posting reflects the On-Target Earnings (OTE) for this role, which includes a base salary and any applicable target bonus amount. This OTE range may vary based on the candidate's relevant experience, qualifications, and geographic location.
Posted 1 month ago
7 - 9 years
0 - 0 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer / DevOps - Enterprise Big Data Platform
Job Location: Bangalore
Right to Hire requirement

In this role, you will be part of a growing, global team of data engineers who collaborate in DevOps mode to enable the business with state-of-the-art technology to leverage data as an asset and to make better-informed decisions.

The Enabling Functions Data Office Team is responsible for designing, developing, testing, and supporting automated end-to-end data pipelines and applications on Enabling Functions' data management and analytics platform (Palantir Foundry, AWS, and other components). The Foundry platform comprises multiple technology stacks, which are hosted on Amazon Web Services (AWS) infrastructure or in our own data centers. Developing pipelines and applications on Foundry requires:

* Proficiency in SQL / Scala / Python (Python required; all three not necessary)
* Proficiency in PySpark for distributed computation (a rough sketch follows at the end of this listing)
* Familiarity with Ontology and Slate
* Familiarity with the Workshop App and basic design/visual competency
* Familiarity with common databases (e.g. Oracle, MySQL, Microsoft SQL Server); not all types required

This position is project-based and may work across multiple smaller projects or a single large project utilizing an agile project methodology.

Roles & Responsibilities:
* B.Tech / B.Sc. / M.Sc. in Computer Science or a related field and 6+ years of overall industry experience
* Strong experience in Big Data and data analytics
* Experience in building robust ETL pipelines for batch as well as streaming ingestion
* Big Data engineers with a firm grounding in object-oriented programming and advanced-level, commercial experience in Python, PySpark, and SQL
* Experience interacting with RESTful APIs, including authentication via SAML and OAuth2
* Experience with test-driven development and CI/CD workflows
* Knowledge of Git for source control management
* Agile experience in Scrum environments using tools such as Jira
* Experience with visualization tools like Tableau or Qlik is a plus
* Experience with Palantir Foundry, AWS, or Snowflake is an advantage
* Basic knowledge of statistics and machine learning is favorable
* Problem-solving abilities
* Proficiency in English with strong written and verbal communication

Primary Responsibilities:
* Design, develop, test, and support data pipelines and applications
* Industrialize data pipelines
* Establish a continuous quality-improvement process to systematically optimize data quality
* Collaborate with various stakeholders, including business and IT

Education:
* Bachelor's (or higher) degree in Computer Science, Engineering, Mathematics, Physical Sciences, or related fields

Professional Experience:
* 6+ years of experience in system engineering or software development
* 3+ years of engineering experience in ETL-type work with databases and Hadoop platforms

Skills:

| Skill | Description |
|---|---|
| Hadoop General | Deep knowledge of distributed file system concepts, MapReduce principles, and distributed computing. Knowledge of Spark and the differences between Spark and MapReduce. Familiarity with encryption and security in a Hadoop cluster. |
| Data management / data structures | Proficient in technical data management tasks, i.e. writing code to read, transform, and store data. XML/JSON knowledge. Experience working with REST APIs. |
| Spark | Experience launching Spark jobs in client mode and cluster mode. Familiarity with the property settings of Spark jobs and their implications for performance. |
| Application Development | Familiarity with HTML, CSS, and JavaScript, plus basic design/visual competency. |
| SCC/Git | Experienced in the use of source code control systems such as Git. |
| ETL | Experience developing ELT/ETL processes, including loading data from enterprise-sized RDBMS systems such as Oracle, DB2, and MySQL. |
| Authorization | Basic understanding of user authorization (Apache Ranger preferred). |
| Programming | Able to code in Python, or expert in at least one high-level language such as Java, C, or Scala. Must have experience using REST APIs. |
| SQL | Expert in manipulating database data using SQL. Familiarity with views, functions, stored procedures, and exception handling. |
| AWS | General knowledge of the AWS stack (EC2, S3, EBS, ...). |
| IT Process Compliance | SDLC experience and formalized change controls. Working in DevOps teams based on Agile principles (e.g. Scrum). ITIL knowledge (especially incident, problem, and change management). |
| Languages | Fluent English skills. |

Specific information related to the position:
* Physical presence in the primary work location (Bangalore)
* Flexibility to work CEST and US EST time zones (according to the team rotation plan)
* Willingness to travel to Germany, the US, and potentially other locations (as per project demand)

Required Skills: Big Data, PySpark, Agile, Palantir
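As a rough illustration of the PySpark pipeline work this listing describes (the paths, column names, and business key below are invented for the example, not taken from the actual platform):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical locations; a Foundry dataset or S3 prefix would stand in here.
SOURCE_PATH = "s3://example-raw/transactions/"
TARGET_PATH = "s3://example-curated/transactions_daily/"

spark = SparkSession.builder.appName("transactions-batch-ingest").getOrCreate()

# Read raw JSON, standardize types, and de-duplicate on a business key.
raw = spark.read.json(SOURCE_PATH)
cleaned = (
    raw.withColumn("event_date", F.to_date("event_ts"))
    .dropDuplicates(["transaction_id"])
    .filter(F.col("amount").isNotNull())
)

# Write partitioned Parquet for downstream analytics consumers.
cleaned.write.mode("overwrite").partitionBy("event_date").parquet(TARGET_PATH)
```

The same transform logic ports to Foundry's transform decorators with minor changes, since both are ordinary PySpark DataFrame code.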
Posted 1 month ago
5 - 10 years
17 - 20 Lacs
Gurugram
Work from Office
Management Level: Ind & Func AI Decision Science Manager
Location: Gurgaon, Bangalore

Must-Have Skills: Market Mix Modeling (MMM) techniques; optimization algorithms for budget allocation and promotional channel optimization; statistical and probabilistic methods (SVM, decision trees); programming languages and tools (Python, NumPy, Pandas, scikit-learn); AI/ML model development and data pipeline management; data management within Snowflake (data layers, migration); cloud platform experience (Azure, AWS, GCP).

Good-to-Have Skills: Experience with nonlinear optimization techniques; experience with data migration (cloud to Snowflake); proficiency in SQL and cloud-based technologies; understanding of econometrics/statistical modeling (regression, time series, multivariate analysis).

Job Summary:
We are seeking a skilled Ind & Func AI Decision Science Manager to join the Accenture Strategy & Consulting team in the Global Network - Data & AI practice. This role focuses on Market Mix Modeling (MMM): you will be responsible for developing AI/ML models, optimizing promotional channels, managing data pipelines, and scaling marketing mix models across cloud platforms. This role offers an exciting opportunity to collaborate with leading financial clients and leverage cutting-edge technology to drive business impact and innovation.

Roles & Responsibilities:

Engagement Execution
- Lead MMM engagements that involve optimizing promotional strategies, budget allocation, and marketing analytics solutions.
- Apply advanced statistical techniques and machine learning models to improve marketing effectiveness (a simplified modeling sketch follows at the end of this listing).
- Collaborate with clients to develop tailored market mix models, delivering data-driven insights to optimize their marketing budgets and strategies.
- Develop Proofs of Concept (PoC) for clients, including scoping, staffing, and execution phases.

Practice Enablement
- Mentor and guide analysts, consultants, and managers to build their expertise in Market Mix Modeling and analytics.
- Contribute to the growth of the Analytics practice through knowledge sharing, staffing initiatives, and the development of new methodologies.
- Promote thought leadership in marketing analytics by publishing research and presenting at industry events.

Opportunity Development
- Identify business development opportunities in marketing analytics and develop compelling business cases for potential clients.
- Work closely with deal teams to provide subject matter expertise in MMM, ensuring the development of high-quality client proposals and responses to RFPs.

Client Relationship Development
- Build and maintain strong, trusted relationships with internal and external clients.
- Serve as a consultant to clients, offering strategic insights to optimize marketing spend and performance.

Professional & Technical Skills:
- 5+ years of experience in Market Mix Modeling (MMM) and associated optimization techniques.
- Strong knowledge of nonlinear optimization, AI/ML models, and advanced statistical techniques for marketing.
- Proficiency in Python libraries such as NumPy, Pandas, scikit-learn, Seaborn, PyCaret, and Matplotlib.
- Experience with cloud platforms such as AWS, Azure, or GCP, and with data migration to Snowflake.
- Familiarity with econometric/statistical modeling techniques (regression, hypothesis testing, time series, multivariate analysis).
- Hands-on experience in managing data pipelines and deploying scalable machine learning architectures.

Additional Information:
- Master's degree in Statistics, Econometrics, Economics, or related fields from reputed universities; a Ph.D. or M.Tech is a plus.
- Excellent communication and interpersonal skills to effectively collaborate with global teams and clients.
- Willingness to travel up to 40% of the time.
- Work on impactful projects that help clients optimize their marketing strategies through advanced data-driven insights.

About Our Company | Accenture

Qualification
Experience:
- 5+ years of advanced experience in Market Mix Modeling (MMM) and related optimization techniques for promotional channels and budget allocation
- 2+ years (Analysts) or 4+ years (Consultants) of experience in consulting/analytics with reputed organizations

Educational Qualification:
- Master's degree in Statistics, Econometrics, Economics, or related fields from reputed institutions
- A Ph.D. or M.Tech in relevant fields is an advantage
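To make the MMM techniques concrete, here is a deliberately stripped-down sketch on synthetic data. It uses a simple geometric adstock plus ordinary least squares; real engagements layer in saturation curves, seasonality, control variables, and holdout validation, and the decay rates here are assumed rather than estimated.

```python
import numpy as np
from sklearn.linear_model import LinearRegression


def adstock(spend: np.ndarray, decay: float) -> np.ndarray:
    """Geometric adstock: each period carries over a fraction of prior media effect."""
    out = np.zeros(len(spend))
    for t in range(len(spend)):
        out[t] = spend[t] + (decay * out[t - 1] if t > 0 else 0.0)
    return out


rng = np.random.default_rng(0)
weeks = 104
tv = rng.gamma(2.0, 50.0, weeks)       # synthetic weekly TV spend
digital = rng.gamma(2.0, 30.0, weeks)  # synthetic weekly digital spend

# Simulated sales: baseline + adstocked media response + noise.
sales = 200 + 0.8 * adstock(tv, 0.5) + 1.2 * adstock(digital, 0.3) + rng.normal(0, 25, weeks)

# Fit the response model on the adstock-transformed channels.
X = np.column_stack([adstock(tv, 0.5), adstock(digital, 0.3)])
model = LinearRegression().fit(X, sales)
print("channel coefficients:", model.coef_, "baseline:", model.intercept_)
```

The fitted coefficients are the per-unit channel contributions that downstream budget-allocation optimizers consume.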
Posted 1 month ago
12 - 17 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Data Architecture Principles
Good-to-have skills: Python (Programming Language), Snowflake Data Warehouse, Data Build Tool (dbt)
A minimum of 12 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating and implementing innovative solutions to enhance business processes and meet application needs.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Expected to provide solutions to problems that apply across multiple teams
- Lead the team in implementing data architecture principles effectively
- Develop and maintain data models and databases (a brief modeling sketch follows at the end of this listing)
- Ensure data integrity and security measures are in place

Professional & Technical Skills:
- Must-have: Proficiency in Data Architecture Principles
- Good-to-have: Experience with Python (Programming Language), Snowflake Data Warehouse, Data Build Tool (dbt)
- Strong understanding of data architecture principles
- Experience in designing and implementing data solutions
- Knowledge of data modeling and database design

Additional Information:
- The candidate should have a minimum of 12 years of experience with data architecture principles.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
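For the data modeling work above, a minimal sketch of creating a fact table in Snowflake from Python is shown below. The connection parameters, schema, and table names are all placeholders; nothing here comes from the posting itself. Snowflake accepts but does not enforce the foreign-key constraint, so it serves mainly to document the model.

```python
import snowflake.connector

# Placeholder credentials; real deployments pull these from a secrets manager.
conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="CORE",
)

# A fact table keyed to a conformed customer dimension (assumed to exist already).
DDL = """
CREATE TABLE IF NOT EXISTS FCT_ORDERS (
    order_id      NUMBER        NOT NULL,
    customer_key  NUMBER        NOT NULL REFERENCES DIM_CUSTOMER (customer_key),
    order_date    DATE          NOT NULL,
    amount        NUMBER(12, 2) NOT NULL
)
"""

cur = conn.cursor()
try:
    cur.execute(DDL)
finally:
    cur.close()
    conn.close()
```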
Posted 1 month ago
3 - 8 years
10 - 14 Lacs
Pune
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
A minimum of 3 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to design and implement data platform solutions.
- Develop and maintain data pipelines for efficient data processing (a minimal bulk-load sketch follows at the end of this listing).
- Optimize data storage and retrieval processes.
- Implement data security measures to protect sensitive information.
- Conduct performance tuning and troubleshooting of data platform components.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Strong understanding of cloud data platforms like AWS or Azure.
- Experience with SQL and database management systems.
- Hands-on experience with ETL tools for data integration.
- Knowledge of data modeling and schema design.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Pune office.
- 15 years of full-time education is required.
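As an example of the pipeline and ETL work this listing mentions, here is a minimal bulk-load sketch using Snowflake's COPY INTO. The stage, table, and connection details are illustrative assumptions, not actual client objects.

```python
import snowflake.connector

# Placeholder connection; real values come from environment variables or a vault.
conn = snowflake.connector.connect(
    account="example_account", user="example_user", password="example_password",
    warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
)

# Bulk-load staged Parquet files into a staging table, then inspect per-file results.
copy_sql = """
COPY INTO STG_EVENTS
FROM @EVENTS_STAGE
FILE_FORMAT = (TYPE = PARQUET)
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
"""

cur = conn.cursor()
try:
    cur.execute(copy_sql)
    for file_name, status, *rest in cur.fetchall():
        print(file_name, status)  # COPY INTO returns one row of load status per file
finally:
    cur.close()
    conn.close()
```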
Posted 1 month ago
5 - 10 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Python (Programming Language), Data Build Tool (dbt)
A minimum of 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. Your day will involve working on data solutions and collaborating with teams to optimize data processes.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Develop and maintain data pipelines (a minimal Python-to-Snowflake sketch follows at the end of this listing)
- Ensure data quality and integrity
- Implement ETL processes

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse
- Good-to-have: Experience with Data Build Tool (dbt)
- Strong understanding of data architecture
- Proficiency in SQL and database management
- Experience with cloud data platforms
- Knowledge of data modeling

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
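Below is a minimal end-to-end sketch of the kind of ETL step this role performs: extract into a DataFrame, then load to Snowflake. It assumes the snowflake-connector-python package with its pandas extras; the table and connection values are placeholders.

```python
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Placeholder connection; real deployments read these from config or secrets.
conn = snowflake.connector.connect(
    account="example_account", user="example_user", password="example_password",
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)

# A tiny transformed frame standing in for data extracted from a source system.
df = pd.DataFrame({
    "CUSTOMER_ID": [101, 102],
    "SIGNUP_DATE": pd.to_datetime(["2024-01-05", "2024-02-11"]),
    "PLAN": ["basic", "pro"],
})

# write_pandas stages the frame internally and runs COPY INTO under the hood.
success, n_chunks, n_rows, _ = write_pandas(
    conn, df, table_name="RAW_CUSTOMERS", auto_create_table=True
)
print(f"loaded={success} rows={n_rows}")
conn.close()
```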
Posted 1 month ago
Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.
Major hubs such as Bengaluru, Pune, and Gurugram, the locations featured in the listings above, are known for their thriving tech industries and have a high demand for Snowflake professionals.
The average salary range for Snowflake professionals in India varies based on experience level:
- Entry-level: INR 6-8 lakhs per annum
- Mid-level: INR 10-15 lakhs per annum
- Experienced: INR 18-25 lakhs per annum
A typical career path in Snowflake may include roles such as:
- Junior Snowflake Developer
- Snowflake Developer
- Senior Snowflake Developer
- Snowflake Architect
- Snowflake Consultant
- Snowflake Administrator
In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:
- SQL
- Data warehousing concepts
- ETL tools
- Cloud platforms (AWS, Azure, GCP)
- Database management
As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!