2.0 - 7.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Specialist IS Bus Sys Analyst

What you will do
Let’s do this. Let’s change the world. In this vital role you will serve as a Business Systems Analyst supporting data analytics and visualization efforts across Amgen’s Global Procurement Operations. You will work with Product Owners, Data Engineers, and Business SMEs to deliver reporting and dashboards that unlock insights across Source-to-Settle processes. This role is critical to building Amgen’s analytics foundation for procurement transformation.

Roles & Responsibilities:
Gather and document business reporting and analytics requirements across product teams.
Design and build dashboards in tools such as Tableau, Power BI, or SAP Analytics Cloud.
Translate procurement KPIs into actionable visualizations aligned with product team priorities.
Collaborate with EDF and source system teams (SAP, Ariba, Apex, LevelPath) to define and access data sources.
Perform data validation and user acceptance testing for new dashboards and enhancements.
Enable self-service analytics capabilities for business users.
Ensure compliance with Amgen’s data governance and visualization standards.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients. The professional we seek is someone with these qualifications.

Basic Qualifications:
Master’s degree and 4 to 6 years of relevant experience OR Bachelor’s degree and 6 to 8 years of relevant experience OR Diploma and 10 to 12 years of professional experience.

Preferred Qualifications:
Must-Have Skills:
Hands-on experience with Tableau, Power BI, or similar tools.
Solid understanding of procurement data and Source-to-Settle processes.
Ability to partner with cross-functional stakeholders to define KPIs and reporting needs.

Good-to-Have Skills:
Familiarity with data lake/data warehouse concepts (EDF preferred).
Basic SQL or data wrangling skills to support ETL processes (a small illustrative sketch follows this listing).

Professional Certifications (Preferred):
Tableau or Power BI certification; Agile certifications are a plus.

Soft Skills:
Exceptional communication and stakeholder management skills.
Ability to collaborate across global teams and influence outcomes.
Strong organizational skills to manage multiple priorities.
Team-oriented and focused on delivering high-value solutions.
Self-starter with a proactive approach to resolving challenges and driving innovation.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
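To ground the "basic SQL or data wrangling skills" item above, here is a minimal, hypothetical Python/pandas sketch of the kind of wrangling that feeds a procurement dashboard; the file, columns, and KPI definition are invented for illustration and are not taken from the posting:

    import pandas as pd

    # Hypothetical Source-to-Settle extract; file and column names are illustrative only.
    invoices = pd.read_csv("s2s_invoices.csv", parse_dates=["invoice_date", "paid_date"])

    # Derive a payment cycle-time KPI and aggregate it per supplier.
    invoices["days_to_pay"] = (invoices["paid_date"] - invoices["invoice_date"]).dt.days
    kpi = invoices.groupby("supplier_id", as_index=False).agg(
        invoice_count=("invoice_id", "count"),
        avg_days_to_pay=("days_to_pay", "mean"),
    )

    # Export a clean extract for use as a Tableau / Power BI data source.
    kpi.to_csv("supplier_payment_kpis.csv", index=False)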
Posted 3 weeks ago
5.0 - 10.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Specialist IS Architect - Veeva Vault

What you will do
The role is responsible for developing and maintaining the overall data and IT architecture of Amgen’s Veeva Vault Platform. This role involves defining the architecture vision, creating roadmaps, and ensuring that IT strategies align with business goals, in Amgen’s R&D Veeva Vault Platform Agile Team. The role will work closely with partners to understand requirements, develop architectural blueprints, and ensure that solutions are scalable, secure, and aligned with enterprise standards. The role will be involved in defining the Veeva Vault Platform architecture strategy, guiding technology decisions, and ensuring that all implementations adhere to established architectural principles and Veeva’s and the industry’s standard processes.

Develop and maintain Amgen’s enterprise Veeva Vault Platform architecture vision and strategy, ensuring alignment with business objectives
Responsible for fostering platform reliability and efficiency through streamlined release management and execution, and establishing a consistent DevOps and CI/CD framework
Accountable for designing and building customizations and configurations on the platform as per business needs, including creating custom objects, fields, workflows and SDKs
Responsible for strategizing platform integrations while adhering to consistent integration standards and patterns, designing integration workflows, building connectors, centralizing build and run, and driving a consistent DevOps model for integrations
Identify and mitigate architectural risks, ensuring that the platform is scalable, secure, and resilient
Maintain comprehensive documentation of the platform architecture, including principles, standards, user guides, and models
Drive continuous improvement in the architecture by finding opportunities for innovation and efficiency
Work with partners to gather and analyze requirements, ensuring that solutions meet both business and technical needs
Perform impact assessments, clearly define AS-IS and TO-BE states, and recommend platform upgrades following the new features and functionalities released by Veeva
Design platform architecture that can scale to meet growing business needs and performance demands
Develop and maintain logical, physical, and conceptual data models to support business needs
Establish and enforce data standards, governance policies, and best practices
Provide domain expertise in Veeva Vault to the team, offering guidance on architecture, solution design, and implementation challenges
Provide hands-on technical leadership in resolving complex technical issues and ensuring smooth deployment and system integration

What we expect of you
We are all different, yet we all use our unique contributions to serve patients. The professional we seek is someone with these qualifications.

Basic Qualifications:
Master’s degree with 4 - 6 years of experience in Computer Science, IT or related field OR Bachelor’s degree with 6 - 8 years of experience in Computer Science, IT or related field OR Diploma with 10 - 12 years of experience in Computer Science, IT or related field

Must-Have Skills:
Solid understanding of architecting and deployment strategies for Veeva Vault platforms/products
Expertise in system integration, including APIs, middleware tools, and data migration between Vault and other systems
Strong knowledge of Data Lake technologies like Databricks
Experience in MuleSoft and Python script development
Extensive knowledge of enterprise architecture frameworks, technologies, and methodologies
Experience with system integration and IT infrastructure
Experience with data, change, and technology governance processes at the platform level
Experience working in agile methodology, including Product Teams and Product Development models
Proficiency in designing scalable, secure, and cost-effective solutions
Strong stakeholder and team management skills
Ability to lead and guide multiple teams to meet business needs and goals
Experience with cloud-based architectures: AWS, Azure, or similar environments

Good-to-Have Skills:
Good knowledge of the global pharmaceutical industry
Understanding of GxP processes
Strong solution design and problem-solving skills
Solid understanding of technology, function, or platform
Experience in developing differentiated and deliverable solutions
Ability to analyze client requirements and translate them into solutions
Willingness to work late hours

Professional Certifications:
Veeva Vault Platform Administrator (mandatory)
SAFe – DevOps Practitioner (mandatory)
SAFe for Teams (preferred)

Soft Skills:
Excellent critical-thinking and problem-solving skills
Good communication and collaboration skills
Demonstrated ability to function in a team setting
Demonstrated presentation skills

Shift Information:
This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work evening or night shifts, as required based on business requirements.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 3 weeks ago
2.0 - 5.0 years
10 - 15 Lacs
Hyderabad
Work from Office
Overview
Primary focus would be to lead development work within the Azure Data Lake environment and other related ETL technologies, with the responsibility of ensuring on-time and on-budget delivery, satisfying project requirements while adhering to enterprise architecture standards. The role will lead key data lake projects and resources, including innovation-related initiatives (e.g. adoption of technologies like Databricks, Presto, Denodo, Python, Azure Data Factory; database encryption; enabling rapid experimentation, etc.). This role will also have L3 and release management responsibilities for ETL processes.

Responsibilities
Lead delivery of key Enterprise Data Warehouse and Azure Data Lake projects within time and budget
Drive solution design and build to ensure scalability, performance and reuse of data and other components
Ensure on-time and on-budget delivery which satisfies project requirements, while adhering to enterprise architecture standards
Manage work intake, prioritization and release timing, balancing demand and available resources; ensure tactical initiatives are aligned with the strategic vision and business needs
Oversee coordination and partnerships with Business Relationship Managers, Architecture and IT services teams to develop and maintain EDW and data lake best practices and standards along with appropriate quality assurance policies and procedures
May lead a team of employee and contract resources to meet build requirements: set priorities for the team to ensure task completion, coordinate work activities with other IT services and business teams, and hold the team accountable for milestone deliverables
Provide L3 support for existing applications
Release management

Qualifications
Experience
Bachelor's degree in Computer Science, MIS, Business Management, or related field
9+ years of experience in Information Technology or Business Relationship Management
5+ years of experience in Data Warehouse/Azure Data Lake
3 years of experience in Azure Data Lake
2 years of experience in project management

Technical Skills
Thorough knowledge of data warehousing / data lake concepts
Hands-on experience with tools like Azure Data Factory, Databricks, PySpark and other data management tools on Azure (illustrated in the sketch below)
Proven experience in managing Data, BI or Analytics projects
Solutions delivery experience - expertise in system development lifecycle, integration, and sustainability
Experience in data modeling or database experience

Non-Technical Skills
Excellent remote collaboration skills
Experience working in a matrix organization with diverse priorities
Experience dealing with and managing multiple vendors
Exceptional written and verbal communication skills along with collaboration and listening skills
Ability to work with agile delivery methodologies
Ability to ideate requirements and design iteratively with business partners without formal requirements documentation
Ability to budget resources and funding to meet project deliverables
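As a concrete illustration of the hands-on toolset this role names (Azure Data Factory, Databricks, PySpark), here is a minimal, hypothetical PySpark job of the kind typically scheduled from ADF onto Databricks; the storage accounts, paths, and column names are invented for the sketch:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("edw_daily_load").getOrCreate()

    # Hypothetical ADLS Gen2 paths; real containers/accounts would come from config.
    raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/"
    curated_path = "abfss://curated@examplelake.dfs.core.windows.net/sales_daily/"

    # Read raw landing files, apply a simple cleanse, and aggregate to a daily grain.
    raw = spark.read.parquet(raw_path)
    daily = (
        raw.filter(F.col("amount").isNotNull())
           .withColumn("sale_date", F.to_date("sale_ts"))
           .groupBy("sale_date", "region")
           .agg(F.sum("amount").alias("total_amount"))
    )

    # Write partitioned output for downstream EDW consumption.
    daily.write.mode("overwrite").partitionBy("sale_date").parquet(curated_path)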
Posted 3 weeks ago
5.0 - 10.0 years
10 - 12 Lacs
Noida
Work from Office
Job Title: Data Warehouse Developer II
Location: Noida
Department: IT
Reports To: IT Supervisor/Manager/Director
Direct Reports: No

Job Summary
The Data Warehouse Developer is responsible for designing, developing, maintaining, and supporting data transformation, integration, and analytics solutions across both cloud and on-premises environments. This role also provides 24x7 support for global systems.

Key Responsibilities
Understand and translate business requirements into technical solutions.
Develop, test, debug, document, and implement ETL processes.
Ensure performance, scalability, reliability, and security of solutions.
Work with structured and semi-structured data across multiple platforms.
Participate in Agile practices, including daily SCRUM meetings.
Collaborate with infrastructure teams, DBAs, and software developers.
Adhere to corporate standards for databases, data engineering, and analytics.
Provide accurate time estimates, communicate status, and flag risks.
Work across the full SDLC (analysis to support) using Agile methodologies.
Demonstrate motivation, self-drive, and strong communication skills.
Perform other related duties as assigned.

Requirements
Education & Experience
Bachelor's degree or equivalent work experience.
5+ years in software development/data engineering roles.
At least 2 years of dedicated data engineering experience preferred.

Technical Skills
Strong experience with data transformations and manipulation.
Ability to design data stores for analytics and other needs.
Familiarity with traditional and modern data architectures (e.g., data lakes).
Hands-on experience with cloud-native data tools (Azure preferred; GCP is a plus).
Proficiency in traditional Microsoft ETL tools: SSIS, SSRS, SSAS, Power BI.
Experience with Azure Data Factory is a plus.

Soft Skills
Ability to present and document clearly.
Self-motivated and independent.
Strong partnership and credibility with stakeholders.

Work Environment
Standard office setting. Use of standard office equipment.
Posted 3 weeks ago
4.0 - 9.0 years
5 - 9 Lacs
Bengaluru
Work from Office
We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.

Key Responsibilities:
Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data.
Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
Manage data ingestion, transformation, and loading processes to ensure data quality and performance.
Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals.
Drive continuous improvement by leveraging the latest Snowflake features and industry trends.

Qualifications:
Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
4+ years of experience in data architecture, data engineering, or a related field.
Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
Must be strong in SQL (see the sketch following this listing).
Proven track record of contributing to data projects and working in complex environments.
Familiarity with cloud platforms (e.g., AWS, GCP) and their data services.
Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.
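To make the SQL requirement concrete, here is a minimal sketch of running a load-and-merge step against Snowflake from Python via the snowflake-connector-python package; the account, credentials, stage, and table names are placeholders, not details from this posting:

    import snowflake.connector

    # Placeholder connection details; in practice these come from a secrets manager.
    conn = snowflake.connector.connect(
        account="myorg-myaccount",
        user="etl_user",
        password="***",
        warehouse="TRANSFORM_WH",
        database="ANALYTICS",
        schema="STAGING",
    )

    try:
        cur = conn.cursor()
        # A typical load step: ingest staged files, then upsert into a reporting table.
        cur.execute("COPY INTO STAGING.ORDERS FROM @ORDERS_STAGE FILE_FORMAT = (TYPE = 'CSV')")
        cur.execute("""
            MERGE INTO ANALYTICS.PUBLIC.ORDERS t
            USING STAGING.ORDERS s ON t.order_id = s.order_id
            WHEN MATCHED THEN UPDATE SET t.status = s.status
            WHEN NOT MATCHED THEN INSERT (order_id, status) VALUES (s.order_id, s.status)
        """)
    finally:
        conn.close()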
Posted 3 weeks ago
6.0 - 11.0 years
6 - 10 Lacs
Bengaluru
Work from Office
We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.

Key Responsibilities:
Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data.
Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
Manage data ingestion, transformation, and loading processes to ensure data quality and performance.
Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals.
Drive continuous improvement by leveraging the latest Snowflake features and industry trends.

Qualifications:
Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
6+ years of experience in data architecture, data engineering, or a related field.
Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
Must have exposure to working in Airflow (a DAG sketch follows this listing).
Proven track record of contributing to data projects and working in complex environments.
Familiarity with cloud platforms (e.g., AWS, GCP) and their data services.
Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.
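Since Airflow exposure is called out here, a minimal sketch of an Airflow 2.x DAG that orchestrates a daily Snowflake load; the DAG id, schedule, and callables are hypothetical placeholders:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_orders():
        # Placeholder: pull the day's orders from the source system.
        print("extracting orders...")


    def load_orders():
        # Placeholder: run the Snowflake COPY/MERGE load step.
        print("loading orders into Snowflake...")


    with DAG(
        dag_id="daily_snowflake_load",  # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_orders)
        load = PythonOperator(task_id="load", python_callable=load_orders)
        extract >> load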
Posted 3 weeks ago
12.0 - 17.0 years
37 - 55 Lacs
Bengaluru
Work from Office
Career Area: Technology, Digital and Data

Your Work Shapes the World at Caterpillar Inc.
When you join Caterpillar, you're joining a global team who cares not just about the work we do - but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here - we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it.

Role Definition
Participates in defining functional design, development and systems architecture for Caterpillar's state-of-the-art digital platform, aligning to common department goals in line with the CAT Digital Strategy. The Lead Digital Architect will assist teams that will build a world-class platform to host a wide range of digital applications.

Responsibilities
Provide oversight for architecture assessment and design for infrastructure, information or integration domains that provide core capabilities for the enterprise.
Lead architecture design of end-to-end enterprise integrated systems that serve multiple business functions.
Lead the design and implementation of enterprise data models and metadata structures for complex projects.
Drive architecture and design patterns for various AI and ML related models and services.
Bring a passion for applying state-of-the-art AI algorithms to real-world problems.
Stay up to date with the latest developments in AI and related fields; evaluate new developments and technologies for potential integration into existing and future architecture.
Initiate and deliver technology evaluations and recommendations.
Develop and maintain current, planned, and future state architecture blueprints.
Lead in the identification and analysis of enterprise business drivers and requirements that drive the future state architecture.

Skills:
Analytical Thinking: Knowledge of techniques and tools that promote effective analysis; ability to determine the root cause of organizational problems and create alternative solutions that resolve these problems.
Effective Communications: Understanding of effective communication concepts, tools and techniques; ability to effectively transmit, receive, and accurately interpret ideas, information, and needs through the application of appropriate communication behaviors.
Data Architecture: Knowledge of processes, techniques and factors that affect data architecture; ability to design blueprints on how to integrate data resources for business processes and functional support.
Target Architecture: Knowledge of target architecture; ability to develop the IT blueprint and roadmap while aligning the architecture and processes with business strategies and objectives.
Analysis: Knowledge of tools, methods, and techniques of requirement analysis; ability to elicit, analyze and record required business functionality and non-functionality requirements to ensure the success of a system or software development project.
Top Candidates Will Have
Four-year bachelor's degree from an accredited college or university in Computer Science, Information Technology, or a related field, with 12+ years of experience
Expertise with modern techniques for Language Understanding, Computer Vision, and other Machine Learning and AI technologies
In-depth understanding of and expertise in one or more of the following: massive real-time event and message processing, data lakes, Big Data transformations and analytics, complex e-commerce applications, IoT edge and cloud data processing, SQL, and NoSQL data modeling
Agile SDLC implementation in a public cloud ecosystem, including environments management, peer review, CI/CD, resource optimization, etc.

Relocation is available for this position.
Posting Dates: May 29, 2025 - June 8, 2025
Caterpillar is an Equal Opportunity Employer.
Posted 3 weeks ago
10.0 - 12.0 years
11 - 15 Lacs
Hyderabad
Work from Office
Job Information
Job Opening ID: ZR_2063_JOB
Date Opened: 17/11/2023
Industry: Technology
Job Type:
Work Experience: 10-12 years
Job Title: Azure Data Architect
City: Hyderabad
Province: Telangana
Country: India
Postal Code: 500003
Number of Positions: 4
Location: Coimbatore & Hyderabad

Key skills: Azure + SQL + ADF + Databricks + design + architecture (mandatory)
Total experience in the data management area of 10+ years, with Azure cloud data platform experience
Architect with the Azure stack (ADLS, AALS, Azure Databricks, Azure Stream Analytics, Azure Data Factory, Cosmos DB & Azure Synapse), with mandatory expertise in Azure Stream Analytics, Databricks, Azure Synapse, and Azure Cosmos DB
Must have worked on a large Azure data platform and dealt with high-volume Azure streaming analytics
Experience in designing cloud data platform architecture and designing large-scale environments
5+ years of experience architecting and building Cloud Data Lakes, specifically Azure Data Analytics technologies and architecture, Enterprise Analytics Solutions, and optimising real-time 'big data' data pipelines, architectures and data sets
Posted 3 weeks ago
8.0 - 12.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Job Information
Job Opening ID: ZR_2385_JOB
Date Opened: 23/10/2024
Industry: IT Services
Job Type:
Work Experience: 8-12 years
Job Title: Data Modeller
City: Bangalore South
Province: Karnataka
Country: India
Postal Code: 560066
Number of Positions: 1
Locations: Pune/Bangalore/Hyderabad/Indore
Contract duration: 6 months

Responsibilities
Be responsible for the development of conceptual, logical, and physical data models, and the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms.
Implement business and IT data requirements through new data strategies and designs across all data platforms (relational & dimensional - MUST; NoSQL - optional) and data tools (reporting, visualization, analytics, and machine learning).
Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC.
Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.
Must have a payments background.

Skills
Hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional and NoSQL data platform technologies, and ETL and data ingestion protocols).
Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required.
Good knowledge of metadata management, data modeling, and related tools (Erwin, ER/Studio, or others) required.
Experience in team management, communication, and presentation.
Experience with Erwin, Visio or any other relevant tool.
Posted 3 weeks ago
5.0 - 7.0 years
15 - 25 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
About the Role:
We are seeking a skilled and experienced Data Engineer to join our remote team. The ideal candidate will have 5-7 years of professional experience working with Python, PySpark, SQL, and Spark SQL, and will play a key role in building scalable data pipelines, optimizing data workflows, and supporting data-driven decision-making across the organization.

Key Responsibilities:
Design, build, and maintain scalable and efficient data pipelines using PySpark and SQL, as sketched below.
Develop and optimize Spark jobs for large-scale data processing.
Collaborate with data scientists, analysts, and other engineers to ensure data quality and accessibility.
Implement data integration from multiple sources into a unified data warehouse or lake.
Monitor and troubleshoot data pipelines and ETL jobs for performance and reliability.
Ensure best practices in data governance, security, and compliance.
Create and maintain technical documentation related to data pipelines and infrastructure.

Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad, Remote
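A minimal, hypothetical sketch of the PySpark/Spark SQL pipeline work this role describes: deduplicating a raw feed with a window function and exposing the result to SQL. The bucket, paths, and columns are invented for illustration:

    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.window import Window

    spark = SparkSession.builder.appName("dedup_pipeline").getOrCreate()

    # Hypothetical raw feed with possible duplicate events per order_id.
    raw = spark.read.json("s3a://example-bucket/raw/orders/")

    # Keep only the latest event per order using a window function.
    w = Window.partitionBy("order_id").orderBy(F.col("event_ts").desc())
    latest = (
        raw.withColumn("rn", F.row_number().over(w))
           .filter(F.col("rn") == 1)
           .drop("rn")
    )

    # Register as a temp view so the result can be queried with Spark SQL.
    latest.createOrReplaceTempView("orders_latest")
    daily_revenue = spark.sql("""
        SELECT to_date(event_ts) AS order_date, SUM(amount) AS revenue
        FROM orders_latest
        GROUP BY to_date(event_ts)
    """)
    daily_revenue.write.mode("overwrite").parquet("s3a://example-bucket/curated/daily_revenue/")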
Posted 3 weeks ago
7.0 - 12.0 years
20 - 35 Lacs
Pune, Bengaluru
Hybrid
Hi all, we have a senior position for a Databricks expert.

Job Location: Pune and Bangalore (hybrid)
Perks: pick-up and drop provided
Kindly note: overall experience should be 7+ years, and the candidate should be an immediate joiner.

Role & responsibilities
Data engineering - data pipeline development using Azure Databricks - 5+ years
Optimizing data processing performance, efficient resource utilization and execution time - 5+ years
Databricks features like Databricks SQL, Delta Lake, and Workflows to orchestrate and manage complex data workflows - 5+ years
Data modelling - 5+ years

Nice to Haves:
Knowledge of PySpark
Good knowledge of data warehousing
Posted 3 weeks ago
10.0 - 18.0 years
22 - 27 Lacs
Hyderabad
Remote
Role: Data Architect / Data Modeler - ETL, Snowflake, DBT
Location: Remote
Duration: 14+ Months
Timings: 5:30pm IST to 1:30am IST
Note: Looking for Immediate Joiners

Job Summary:
We are seeking a seasoned Data Architect / Modeler with deep expertise in Snowflake, DBT, and modern data architectures including Data Lake, Lakehouse, and Databricks platforms. The ideal candidate will be responsible for designing scalable, performant, and reliable data models and architectures that support analytics, reporting, and machine learning needs across the organization.

Key Responsibilities:
Architect and design data solutions using Snowflake, Databricks, and cloud-native lakehouse principles.
Lead the implementation of data modeling best practices (star/snowflake schemas, dimensional models) using DBT.
Build and maintain robust ETL/ELT pipelines supporting both batch and real-time data processing.
Develop data governance and metadata management strategies to ensure high data quality and compliance.
Define data architecture frameworks, standards, and principles for enterprise-wide adoption.
Work closely with business stakeholders, data engineers, analysts, and platform teams to translate business needs into scalable data solutions.
Provide guidance on data lake and data warehouse integration, helping bridge structured and unstructured data needs.
Establish data lineage and documentation, and maintain architecture diagrams and data dictionaries.
Stay up to date with industry trends and emerging technologies in cloud data platforms and recommend improvements.

Required Skills & Qualifications:
10+ years of experience in data architecture, data engineering, or data modeling roles.
Strong experience with Snowflake including performance tuning, security, and architecture.
Hands-on experience with DBT (Data Build Tool) for building and maintaining data transformation workflows.
Deep understanding of Lakehouse architecture, Data Lake implementations, and Databricks.
Solid grasp of dimensional modeling, normalization/denormalization strategies, and data warehouse design principles.
Experience with cloud platforms (e.g., AWS, Azure, or GCP).
Proficiency in SQL and scripting languages (e.g., Python).
Familiarity with data governance frameworks, data catalogs, and metadata management tools.
Posted 3 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Skill required: Delivery - Marketing Analytics and Reporting
Designation: I&F Decision Sci Practitioner Sr Analyst
Qualifications: Any Graduation
Years of Experience: 5 to 8 years

What would you do?
Data & AI: Analytical processes and technologies applied to marketing-related data to help businesses understand and deliver relevant experiences for their audiences, understand their competition, measure and optimize marketing campaigns, and optimize their return on investment.

What are we looking for?
Data analytics, with a specialization in the marketing domain.

Domain-specific skills:
Familiarity with ad tech and B2B sales

Technical skills:
Proficiency in SQL and Python
Experience in efficiently building, publishing & maintaining robust data models & warehouses for self-serve querying and advanced data science & ML analytic purposes
Experience in conducting ETL / ELT with very large and complicated datasets and handling DAG data dependencies
Strong proficiency with SQL dialects on distributed or data lake style systems (Presto, BigQuery, Spark/Hive SQL, etc.), including SQL-based experience in nested data structure manipulation, windowing functions, query optimization, data partitioning techniques, etc. (see the example below). Knowledge of Google BigQuery optimization is a plus.
Experience in schema design and data modeling strategies (e.g. dimensional modeling, data vault, etc.)
Significant experience with dbt (or similar tools) and Spark-based (or similar) data pipelines
General knowledge of Jinja templating in Python
Hands-on experience with cloud provider integration and automation via CLIs and APIs

Soft skills:
Ability to work well in a team
Agility for quick learning
Written and verbal communication

Roles and Responsibilities:
In this role you are required to do analysis and solving of increasingly complex problems. Your day-to-day interactions are with peers within Accenture. You are likely to have some interaction with clients and/or Accenture management. You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments. Decisions that are made by you impact your own work and may impact the work of others. In this role you would be an individual contributor and/or oversee a small work effort and/or team. Please note that this role may require you to work in rotational shifts.

Qualifications: Any Graduation
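As an illustration of the windowing skills listed above, a minimal sketch that runs a window-function query on BigQuery from Python via the google-cloud-bigquery client; the project, dataset, and table names are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials

    # Rank each customer's campaign touches by recency, keeping only the latest one.
    sql = """
        SELECT customer_id, campaign_id, touch_ts
        FROM (
            SELECT customer_id, campaign_id, touch_ts,
                   ROW_NUMBER() OVER (PARTITION BY customer_id
                                      ORDER BY touch_ts DESC) AS rn
            FROM `example_project.marketing.campaign_touches`
        )
        WHERE rn = 1
    """
    for row in client.query(sql).result():
        print(row.customer_id, row.campaign_id, row.touch_ts)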
Posted 3 weeks ago
3.0 - 8.0 years
20 - 32 Lacs
Bengaluru
Work from Office
Translate ideas & designs into running code
Automate business processes using Office 365 Power Automate, Power Apps, Power BI
Perform software design, debugging, testing, deployment
Implement custom solutions leveraging Canvas Apps, Model-Driven Apps, Office 365

Required Candidate profile
Production-level app development experience using PowerApps, Power Automate, Power BI
Experience in C#, JavaScript, jQuery, Bootstrap, HTML
Experience in SAP HANA, ETL processes, data modeling, data cleaning, data pre-processing
Posted 3 weeks ago
3.0 - 6.0 years
5 - 15 Lacs
Kochi, Thiruvananthapuram
Hybrid
Hiring for Azure Data Engineer in Kochi Location

Experience - 3 to 6 years
Location - Kochi

JD:
Overall 3+ years of IT experience with 2+ years of relevant experience in Azure Data Factory (ADF), with good hands-on exposure to the latest ADF version
Hands-on experience with Azure Functions & Azure Synapse (formerly SQL Data Warehouse)
Should have project experience in Azure Data Lake / Blob (for storage)
Should have a basic understanding of Batch Account configuration and various control options
Sound knowledge of Databricks & Logic Apps
Should be able to coordinate independently with business stakeholders, understand the business requirements, and implement the requirements using ADF

Interested candidates please share your updated resume with the below details at Smita.Dattu.Sarwade@gds.ey.com
Total Experience -
Relevant Experience -
Current Location -
Preferred Location -
Current CTC -
Expected CTC -
Notice period -
Posted 3 weeks ago
6.0 - 9.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Overview
We are looking for an experienced GCP BigQuery Lead to architect, develop, and optimize data solutions on Google Cloud Platform, with a strong focus on BigQuery. The role involves leading warehouse setup initiatives, collaborating with stakeholders, and ensuring scalable, secure, and high-performance data infrastructure.

Responsibilities
Lead the design and implementation of data pipelines using BigQuery, Datorama, Dataflow, and other GCP services.
Architect and optimize data models and schemas to support analytics and reporting use cases.
Implement best practices for performance tuning, partitioning, and cost optimization in BigQuery (see the sketch below).
Collaborate with business stakeholders to translate requirements into scalable data solutions.
Ensure data quality, governance, and security across all BigQuery data assets.
Automate workflows using orchestration tools.
Mentor junior resources and lead script reviews, documentation, and knowledge sharing.

Qualifications
6+ years of experience in data analytics, with 3+ years on GCP and BigQuery.
Strong proficiency in SQL, with experience in writing complex queries and optimizing performance.
Hands-on experience with ETL/ELT tools and frameworks.
Deep understanding of data warehousing, dimensional modeling, and data lake architectures.
Good exposure to data governance, lineage, and metadata management.
GCP Data Engineer certification is a plus.
Experience with BI tools (e.g., Looker, Power BI).
Good communication and team lead skills.
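For the partitioning and cost-optimization practices this role calls out, a minimal sketch of creating a partitioned and clustered BigQuery table from Python; the project, dataset, table, and columns are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Partition by day and cluster by a frequent filter column so queries that
    # restrict on event_date/customer_id scan (and bill for) far less data.
    ddl = """
        CREATE TABLE IF NOT EXISTS `example_project.warehouse.events`
        (
            event_id STRING,
            customer_id STRING,
            event_date DATE,
            payload JSON
        )
        PARTITION BY event_date
        CLUSTER BY customer_id
        OPTIONS (partition_expiration_days = 365)
    """
    client.query(ddl).result()

    # Downstream queries should then always filter on the partition column, e.g.:
    # SELECT ... FROM `example_project.warehouse.events`
    # WHERE event_date BETWEEN '2025-01-01' AND '2025-01-07'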
Posted 3 weeks ago
18.0 - 23.0 years
15 - 19 Lacs
Hyderabad
Work from Office
About the Role
We are seeking a highly skilled and experienced Data Architect to join our team. The ideal candidate will have at least 18 years of experience in data engineering and analytics and a proven track record of designing and implementing complex data solutions. As a senior principal data architect, you will be expected to design, create, deploy, and manage Blackbaud’s data architecture. This role has considerable technical influence within the Data Platform and Data Engineering teams and the Data Intelligence Center of Excellence at Blackbaud. This individual acts as an evangelist for proper data strategy with other teams at Blackbaud and assists with the technical direction, specifically with data, of other projects.

What you'll do
Develop and direct the strategy for all aspects of Blackbaud’s Data and Analytics platforms, products and services
Set, communicate and facilitate technical direction more broadly for the AI Center of Excellence, and collaboratively beyond the Center of Excellence
Design and develop breakthrough products, services or technological advancements in the Data Intelligence space that expand our business
Work alongside product management to craft technical solutions to solve customer business problems
Own the technical data governance practices and ensure data sovereignty, privacy, security and regulatory compliance
Continuously challenge the status quo of how things have been done in the past
Build a data access strategy to securely democratize data and enable research, modelling, machine learning and artificial intelligence work
Help define the tools and pipeline patterns our engineers and data engineers use to transform data and support our analytics practice
Work in a cross-functional team to translate business needs into data architecture solutions
Ensure data solutions are built for performance, scalability, and reliability
Mentor junior data architects and team members
Keep current on technology: distributed computing, big data concepts and architecture
Promote internally how data within Blackbaud can help change the world

What you'll bring
18+ years of experience in data and advanced analytics
At least 8 years of experience working on data technologies in Azure/AWS
Expertise in SQL and Python
Expertise in SQL Server, Azure Data Services, and other Microsoft data technologies
Expertise in Databricks and Microsoft Fabric
Strong understanding of data modeling, data warehousing, data lakes, data mesh and data products
Experience with machine learning
Excellent communication and leadership skills

Preferred Qualifications
Experience working with .Net/Java and Microservice Architecture

Stay up to date on everything Blackbaud; follow us on LinkedIn, X, Instagram, Facebook and YouTube. Blackbaud is a digital-first company which embraces a flexible remote or hybrid work culture. Blackbaud supports hiring and career development for all roles from the location you are in today!

Blackbaud is proud to be an equal opportunity employer and is committed to maintaining an inclusive work environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, physical or mental disability, age, or veteran status or any other basis protected by federal, state, or local law.
Posted 3 weeks ago
4.0 - 9.0 years
5 - 15 Lacs
Hyderabad, Chennai
Work from Office
Key skills: Python, SQL, PySpark, Databricks, AWS (mandatory)
Added advantage: Life sciences/Pharma

Roles and Responsibilities
1. Data Pipeline Development: Design, build, and maintain scalable data pipelines for ingesting, processing, and transforming large datasets from diverse sources into usable formats.
2. Data Integration and Transformation: Integrate data from multiple sources, ensuring data is accurately transformed and stored in optimal formats (e.g., Delta Lake, Redshift, S3).
3. Performance Optimization: Optimize data processing and storage systems for cost efficiency and high performance, including managing compute resources and cluster configurations.
4. Automation and Workflow Management: Automate data workflows using tools like Airflow, Databricks APIs, and other orchestration technologies to streamline data ingestion, processing, and reporting tasks.
5. Data Quality and Validation: Implement data quality checks, validation rules, and transformation logic to ensure the accuracy, consistency, and reliability of data (a combined sketch follows this listing).
6. Cloud Platform Management: Manage and optimize cloud infrastructure (AWS, Databricks) for data storage, processing, and compute resources, ensuring seamless data operations.
7. Migration and Upgrades: Lead migrations from legacy data systems to modern cloud-based platforms, ensuring smooth transitions and enhanced scalability.
8. Cost Optimization: Implement strategies for reducing cloud infrastructure costs, such as optimizing resource usage, setting up lifecycle policies, and automating cost alerts.
9. Data Security and Compliance: Ensure secure access to data by implementing IAM roles and policies, adhering to data security best practices, and enforcing compliance with organizational standards.
10. Collaboration and Support: Work closely with data scientists, analysts, and business teams to understand data requirements and provide support for data-related tasks.
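A minimal sketch combining two of the responsibilities above — a data-quality gate followed by a Delta Lake write on Databricks over AWS; the bucket, paths, columns, and threshold are hypothetical:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("batch_ingest").getOrCreate()

    df = spark.read.parquet("s3a://example-bucket/landing/batches/2025-06-01/")

    # Data-quality gate: reject the batch if too many rows lack a batch_id.
    total = df.count()
    missing = df.filter(F.col("batch_id").isNull()).count()
    if total == 0 or missing / total > 0.01:  # hypothetical 1% threshold
        raise ValueError(f"quality check failed: {missing}/{total} rows missing batch_id")

    # Append the validated batch to a partitioned Delta table.
    (
        df.withColumn("ingest_date", F.current_date())
          .write.format("delta")
          .mode("append")
          .partitionBy("ingest_date")
          .save("s3a://example-bucket/delta/batches/")
    )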
Posted 3 weeks ago
4.0 - 9.0 years
0 - 1 Lacs
Ahmedabad
Work from Office
Skills & Tools:
Platforms: Oracle Primavera P6/EPPM, Microsoft Project Online, Planisware, Clarity PPM
Integration Tools: APIs (REST/SOAP), ETL tools (Informatica, Talend), Azure Data Factory
IAM/Security: Azure AD, Okta, SAML/OAuth, RBAC, SIEM tools
Data Technologies: Data Lakes (e.g., AWS S3, Azure Data Lake), SQL, Power BI/Tableau
Languages: Python, SQL, PowerShell, JavaScript (for scripting and integrations)

Role & responsibilities
Technical Consultant - EPPM Platform, Cybersecurity, and Data Integrations

Role Overview:
As a Technical Consultant, you will be responsible for the end-to-end setup and configuration of the Enterprise Project Portfolio Management (EPPM) platform, ensuring secure, efficient, and scalable integrations with enterprise systems including Data Lakes, access control tools, and project governance frameworks. You will work at the intersection of technology, security, and project operations, enabling the business to manage project portfolios effectively.
Posted 3 weeks ago
6.0 - 11.0 years
25 - 30 Lacs
Bengaluru
Hybrid
Mandatory Skills: Data Engineer, AWS Athena, AWS Glue, Redshift, Data Lake, Lakehouse, Python, SQL Server (an Athena query sketch follows this listing)

Must-Have Experience:
6+ years of hands-on data engineering experience
Expertise with AWS services: S3, Redshift, EMR, Glue, Kinesis, DynamoDB
Building batch and real-time data pipelines
Python and SQL coding for data processing and analysis
Data modeling experience using cloud-based data platforms like Redshift, Snowflake, Databricks
Design and develop ETL frameworks

Nice-to-Have Experience:
ETL development using tools like Informatica, Talend, Fivetran
Creating reusable data sources and dashboards for self-service analytics
Experience using Databricks for Spark workloads or Snowflake
Working knowledge of Big Data processing
CI/CD setup
Infrastructure-as-code implementation
Any one of the AWS Professional Certifications
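As a taste of the AWS toolset listed here, a minimal sketch that queries an S3-backed data lake through Athena using boto3; the region, database, table, and result bucket are placeholders:

    import time

    import boto3

    athena = boto3.client("athena", region_name="ap-south-1")

    # Start an Athena query against a hypothetical Glue-cataloged table.
    resp = athena.start_query_execution(
        QueryString="SELECT region, COUNT(*) AS orders FROM orders GROUP BY region",
        QueryExecutionContext={"Database": "lake_analytics"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    query_id = resp["QueryExecutionId"]

    # Poll until the query finishes, then fetch the first page of results.
    while True:
        status = athena.get_query_execution(QueryExecutionId=query_id)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(1)

    if state == "SUCCEEDED":
        rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
        for row in rows:
            print([col.get("VarCharValue") for col in row["Data"]])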
Posted 3 weeks ago
9.0 - 14.0 years
55 - 60 Lacs
Bengaluru
Hybrid
Dodge Position Title: Technology Lead
STG Labs Position Title:
Location: Bangalore, India

About Dodge
Dodge Construction Network exists to deliver the comprehensive data and connections the construction industry needs to build thriving communities. Our legacy is deeply rooted in empowering our customers with transformative insights, igniting their journey towards unparalleled business expansion and success. We serve decision-makers who seek reliable growth and who value relationships built on trust and quality. By combining our proprietary data with cutting-edge software, we deliver to our customers the essential intelligence needed to excel within their respective landscapes. We propel the construction industry forward by transforming data into tangible guidance, driving unparalleled advancement. Dodge is the catalyst for modern construction. https://www.construction.com/

About Symphony Technology Group (STG)
STG is a Silicon Valley (California) based private equity firm that has a long and successful track record of transforming high-potential software and software-enabled services companies, as well as insights-oriented companies, into definitive market leaders. The firm brings expertise, flexibility, and resources to build strategic value and unlock the potential of innovative companies. Partnering to build customer-centric, market-winning portfolio companies, STG creates sustainable foundations for growth that bring value to all existing and future stakeholders. The firm is dedicated to transforming and building outstanding technology companies in partnership with world-class management teams. With over $5.0 billion in assets under management, including a recently raised $2.0 billion fund, STG's expansive portfolio has consisted of more than 30 global companies.

STG Labs is the incubation center for many of STG's portfolio companies, building their engineering, professional services, and support delivery teams in India. STG Labs offers an entrepreneurial start-up environment for software and AI engineers, data scientists and analysts, and project and product managers, and provides a unique opportunity to work directly for a software or technology company. Based in Bangalore, STG Labs supports hybrid working. https://stg.com

Roles and Responsibilities
Lead the design, deployment, and management of data mart and analytics infrastructure leveraging AWS services
Implement and manage CI/CD pipelines using industry-leading DevOps practices and tools
Design, implement, and oversee API architecture, ensuring robust, scalable, and secure REST API development using AWS API Gateway
Collaborate closely with data engineers, architects, and analysts to design highly performant and scalable data solutions
Mentor and guide engineering teams, fostering a culture of continuous learning and improvement
Optimize cloud resources for cost-efficiency, scalability, and reliability
Establish best practices and standards for AWS infrastructure, DevOps processes, API design, and data analytics workflows

Qualifications
Hands-on working knowledge and experience is required in:
Data structures
Memory management
Basic algorithms (search, sort, etc.)
AWS data services: Redshift, Glue, EMR, Athena, Lake Formation, Lambda
Infrastructure-as-code tools: Terraform, AWS CloudFormation
Scripting languages: Python, Bash, SQL
DevOps tooling: Docker, Kubernetes, Jenkins, Bitbucket - must be comfortable in CLI / terminal environments
Command line / terminal environments
AWS security best practices
Scalable data marts, analytics systems, and RESTful APIs

Hands-on working knowledge and experience is preferred in:
Container orchestration: Kubernetes, EKS
Data visualization & warehousing: Tableau, data warehouses
Machine learning & big data pipelines

Certifications Preferred: AWS Certifications (Solutions Architect Professional, DevOps Engineer)
Posted 3 weeks ago
10.0 - 17.0 years
9 - 19 Lacs
Bengaluru
Remote
Azure Data Engineer

Skills Required: Azure Data Engineer, Big Data, Hadoop
Develop and maintain data pipelines using Azure services like Data Factory, PySpark, Synapse, Databricks, Adobe, Spark, Scala, etc.
Posted 3 weeks ago
7.0 - 9.0 years
1 - 6 Lacs
Bengaluru
Work from Office
Designation: Data Engineer
Job Location: Bangalore

About Digit Insurance:
Digit's mission is to 'Make Insurance, Simple'. We are backed by Fairfax, one of the largest global investment firms. We have also been ranked as a 'LinkedIn Top 5 Startup' of 2018 and 2019, and are the fastest growing insurance company. We have also been certified as a Great Place to Work! Digit has entered the unicorn club with a valuation of $1.9 billion, and while most companies take about a decade to get here, we have achieved this in just 3 years. And we truly believe that this has happened as a result of the mission of the company, i.e. to make insurance simple, along with the sheer hard work and endeavors of our employees.

We are re-imagining products and redesigning processes to provide simple and transparent insurance solutions that matter to consumers. We are building a technology-driven platform that can offer customized products at reduced cost and provide a great customer service. We are also the only cloud-based insure-tech company, with a very focused approach towards in-house development. We are using new-age technologies like Java microservices, full stack, Angular 6, Python, React Native, DB2, machine learning, data science, and cloud-native architecture in AWS and Azure. We are headquartered in Bangalore, Pune and Trivandrum, and across India. The team is a great mix of industry veterans who know what's working and new-age technology specialists who know what could be improved.

What are we looking for?
We are looking for candidates to join us as part of our Data Science team as a Data Engineer.

Total experience range: 5 to 8 years of experience in SQL, Python scripting and any cloud technologies.

Skill Set:
Strong proficiency in coding, especially in Python or any other scripting language.
Working knowledge of Linux OS or shell scripting.
Hands-on experience in SQL.
Working knowledge of any cloud technologies.
Exposure to data lake concepts.

Roles and Responsibilities:
Responsible for end-to-end development of projects, which includes understanding requirements, designing the solution, implementing, testing and maintenance.
Responsible for resolving issues that might occur in existing solutions.
Responsible for optimization of existing solutions to save time and resources.
Posted 3 weeks ago
3.0 - 6.0 years
7 - 9 Lacs
Jaipur, Bengaluru
Work from Office
We are seeking a skilled Data Engineer to join our team. The ideal candidate will have strong experience with Databricks, Python, SQL, Spark, PySpark, DBT, and AWS, and a solid understanding of big data ecosystems, data lake architecture, and data modeling.
Posted 3 weeks ago
7.0 - 12.0 years
3 - 7 Lacs
Gurugram
Work from Office
AHEAD builds platforms for digital business. By weaving together advances in cloud infrastructure, automation and analytics, and software delivery, we help enterprises deliver on the promise of digital transformation.

At AHEAD, we prioritize creating a culture of belonging, where all perspectives and voices are represented, valued, respected, and heard. We create spaces to empower everyone to speak up, make change, and drive the culture at AHEAD. We are an equal opportunity employer, and do not discriminate based on an individual's race, national origin, color, gender, gender identity, gender expression, sexual orientation, religion, age, disability, marital status, or any other protected characteristic under applicable law, whether actual or perceived. We embrace all candidates that will contribute to the diversification and enrichment of ideas and perspectives at AHEAD.

AHEAD is looking for a Sr. Data Engineer (L3 support) to work closely with our dynamic project teams (both on-site and remotely). This Data Engineer will be responsible for hands-on engineering of data platforms that support our clients' advanced analytics, data science, and other data engineering initiatives. This consultant will build and support modern data environments that reside in the public cloud or multi-cloud enterprise architectures.

The Data Engineer will have responsibility for working on a variety of data projects. This includes orchestrating pipelines using modern data engineering tools/architectures as well as design and integration of existing transactional processing systems. The appropriate candidate must be a subject matter expert in managing data platforms.

Responsibilities:
A Sr. Data Engineer should be able to build, operationalize and monitor data processing systems
Create robust and automated pipelines to ingest and process structured and unstructured data from various source systems into analytical platforms, using batch and streaming mechanisms leveraging the cloud-native toolset (see the streaming sketch after this listing)
Implement custom applications using tools such as Event Hubs, ADF and other cloud-native tools as required to address streaming use cases
Engineer and maintain ELT processes for loading the data lake (Cloud Storage, Data Lake Gen2)
Leverage the right tools for the right job to deliver testable, maintainable, and modern data solutions
Respond to customer/team inquiries and escalations and assist in troubleshooting and resolving challenges
Work with other scrum team members to estimate and deliver work inside of a sprint
Research data questions, identify root causes, and interact closely with business users and technical resources
Should possess ownership and leadership skills to collaborate effectively with Level 1 and Level 2 teams
Must have experience in raising tickets with Microsoft and engaging with them to address any service or tool outages in production

Qualifications:
7+ years of professional technical experience
5+ years of hands-on data architecture and data modelling at SME level
5+ years of experience building highly scalable data solutions using Azure Data Factory, Spark, Databricks, Python
5+ years of experience working in cloud environments (AWS and/or Azure)
3+ years of programming languages such as Python, Spark and Spark SQL
Should have strong knowledge of the architecture of ADF and Databricks
Able to work with Level 1 and Level 2 teams to resolve platform outages in production environments
Strong client-facing communication and facilitation skills
Strong sense of urgency, ability to set priorities and perform the job with little guidance
Excellent written and verbal interpersonal skills and the ability to build and maintain collaborative and positive working relationships at all levels
Strong interpersonal and communication skills (written and oral) required
Should be able to work in shifts
Should have knowledge of the Azure DevOps process

Key Skills: Azure Data Factory, Azure Databricks, Python, ETL/ELT, Spark, Data Lake, Data Engineering, Event Hubs, Azure Delta, Spark Streaming

Why AHEAD:
Through our daily work and internal groups like Moving Women AHEAD and RISE AHEAD, we value and benefit from diversity of people, ideas, experience, and everything in between. We fuel growth by stacking our office with top-notch technologies in a multi-million-dollar lab, by encouraging cross-department training and development, and by sponsoring certifications and credentials for continued learning.

USA Employment Benefits include:
- Medical, Dental, and Vision Insurance
- 401(k)
- Paid company holidays
- Paid time off
- Paid parental and caregiver leave
- Plus more! See https://www.aheadbenefits.com/ for additional details.

The compensation range indicated in this posting reflects the On-Target Earnings (OTE) for this role, which includes a base salary and any applicable target bonus amount. This OTE range may vary based on the candidate's relevant experience, qualifications, and geographic location.
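The streaming sketch referenced above: a minimal Spark Structured Streaming job of the kind this role describes, reading events and landing them in a Delta table. Event Hubs exposes a Kafka-compatible endpoint, so this sketch uses Spark's Kafka source; the namespace, topic, storage paths, and security settings are placeholders (real use also needs kafka.sasl.mechanism and sasl.jaas.config carrying the Event Hubs connection string, plus the spark-sql-kafka package on the cluster):

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("eventhub_stream").getOrCreate()

    # Event Hubs consumed through its Kafka-compatible endpoint (port 9093).
    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "example-ns.servicebus.windows.net:9093")
        .option("subscribe", "telemetry")  # hypothetical event hub name
        .option("kafka.security.protocol", "SASL_SSL")
        .option("startingOffsets", "latest")
        .load()
    )

    # Parse the payload and write continuously to a Delta table with checkpointing.
    parsed = events.select(
        F.col("key").cast("string").alias("device_id"),
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp").alias("event_ts"),
    )

    query = (
        parsed.writeStream.format("delta")
        .option("checkpointLocation", "abfss://chk@examplelake.dfs.core.windows.net/telemetry/")
        .start("abfss://curated@examplelake.dfs.core.windows.net/telemetry/")
    )
    query.awaitTermination()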
Posted 3 weeks ago