6.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description & Summary: We are looking for a skilled Azure Cloud Data Engineer with strong expertise in Python programming, Databricks, and advanced SQL to join our team in Noida. The candidate will be responsible for designing, developing, and optimizing scalable data solutions on the Azure cloud platform. You will play a critical role in building data pipelines and transforming complex data into actionable insights by leveraging cloud-native tools and technologies.
Level: Senior Consultant / Manager
Location: Noida
LOS:
Competency: Data & Analytics
Skill: Azure Data Engineering
Job Position Title: Azure Cloud Data Engineer with Python Programming – Senior Consultant/Manager (6+ Years)
Responsibilities:
· Design, develop, and manage scalable and secure data pipelines using Azure Databricks and Azure Data Factory (a minimal pipeline sketch follows this listing).
· Write clean, efficient, and reusable code, primarily in Python, for cloud automation, data processing, and orchestration.
· Architect and implement cloud-based data solutions, integrating structured and unstructured data sources.
· Build and optimize ETL workflows and ensure seamless data integration across platforms.
· Develop data models using normalization and denormalization techniques to support OLTP and OLAP systems.
· Manage Azure-based storage solutions including Azure Data Lake and Blob Storage.
· Troubleshoot performance bottlenecks in data flows and ETL processes.
· Integrate advanced analytics and support BI use cases within the Azure ecosystem.
· Lead code reviews and ensure adherence to version control practices (e.g., Git).
· Contribute to the design and deployment of enterprise-level data warehousing solutions.
· Stay current with Azure cloud technologies and Python ecosystem updates to adopt best practices and emerging tools.
Mandatory skill sets:
· Strong Python programming skills (must-have): advanced scripting, automation, and cloud SDK experience
· Strong SQL skills (must-have)
· Azure Databricks (must-have)
· Azure Data Factory
· Azure Blob Storage / Azure Data Lake Storage
· Apache Spark (hands-on experience)
· Data modeling (normalization and denormalization)
· Data warehousing and BI tools integration
· Git (version control)
· Building scalable ETL pipelines
Preferred skill sets (good to have):
· Understanding of OLTP and OLAP environments
· Experience with Kafka and Hadoop
· Azure Synapse Analytics
· Azure DevOps for CI/CD integration
· Agile delivery methodologies
Years of experience required:
· 6+ years of overall experience in cloud engineering or data engineering roles, with at least 2-3 years of hands-on experience with Azure cloud services.
· Proven track record of strong Python development, with at least 2-3 years of hands-on experience.
Education qualification: BE/B.Tech/MBA/MCA
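As a rough illustration of the first responsibility above, here is a minimal PySpark sketch of an ADLS-to-Delta pipeline of the kind a Databricks-based role like this involves. The storage account, container paths, and column names are hypothetical, not from the posting.

```python
# Minimal sketch of an ADLS-to-Delta pipeline on Databricks.
# All paths, the storage account, and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Read raw CSV files landed in Azure Data Lake Storage Gen2.
raw = (spark.read
       .option("header", "true")
       .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/"))

# Basic cleansing: type casts, deduplication, and a derived load date.
clean = (raw
         .withColumn("amount", F.col("amount").cast("double"))
         .dropDuplicates(["order_id"])
         .withColumn("load_date", F.current_date()))

# Write to a curated Delta table, partitioned for downstream OLAP queries.
(clean.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("load_date")
      .save("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```

In practice such a job would be scheduled by Azure Data Factory or a Databricks Workflow rather than run interactively.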
Posted 3 days ago
5.0 years
0 Lacs
India
On-site
Job Title: IBM TM1 Professional
Location: PAN India
Job Type: Hybrid
About the Role: We are seeking an experienced IBM TM1 professional to join our client's team and contribute to the design, development, and support of their financial planning, budgeting, and forecasting solutions using IBM TM1 (also known as IBM Planning Analytics). The ideal candidate will have hands-on experience in IBM TM1 development, data modeling, and performance tuning to deliver high-quality financial planning solutions to our clients.
Key Responsibilities:
Design, develop, and maintain IBM TM1 cubes, models, and processes for budgeting, forecasting, and reporting.
Work with business stakeholders to gather requirements and translate them into effective TM1 solutions.
Create and maintain complex TM1 rules, TI (TurboIntegrator) processes, and active forms.
Integrate TM1 with external systems (e.g., ERP, CRM, data warehouses) for data exchange.
Optimize the performance of TM1 models and ensure high availability and scalability.
Provide troubleshooting and issue resolution for TM1-related problems.
Create and maintain detailed documentation for TM1 models, processes, and reports.
Train and mentor junior team members and provide technical guidance to users.
Conduct performance testing and capacity planning to ensure optimal TM1 system performance.
Collaborate with business and IT teams to drive continuous improvement in planning and reporting processes.
Required Skills and Experience:
5+ years of strong experience with IBM TM1 (Planning Analytics), including cube design, rule writing, and TurboIntegrator scripting.
Proficiency in TM1 development, including creating complex reports and integrating with other tools.
Understanding of financial planning processes and experience working with business stakeholders in a financial environment.
Knowledge of OLAP concepts, multidimensional data models, and performance tuning for TM1 environments.
Experience in managing TM1 environments, including backups, upgrades, and troubleshooting.
Strong analytical and problem-solving skills.
Excellent communication and interpersonal skills.
Ability to work independently and as part of a team.
Desired Skills:
Experience with IBM Planning Analytics Workspace (PAW) and/or Planning Analytics for Excel (PAX).
Knowledge of other BI tools (e.g., Tableau, Power BI, Cognos).
Familiarity with SQL and relational database concepts.
Experience with cloud-based TM1 deployments (e.g., IBM Cloud).
Certification in IBM TM1/Planning Analytics is a plus.
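For a flavour of how TM1 work like this is often automated from Python, here is a hedged sketch using tm1py, an open-source wrapper around the TM1 REST API. The server coordinates, credentials, process, cube, and element names are all hypothetical; the posting itself does not prescribe tm1py.

```python
# Hedged sketch using tm1py to run a TurboIntegrator process and write a
# plan value. Server details, cube, and element names are hypothetical.
from TM1py import TM1Service

with TM1Service(address="tm1.example.com", port=8001,
                user="admin", password="secret", ssl=True) as tm1:
    # Execute a TI data-load process and inspect the outcome.
    success, status, _ = tm1.processes.execute_with_return(
        process_name="load_actuals")
    print(f"TI run finished: success={success}, status={status}")

    # Write a single budget figure into a cube cell
    # (element order must match the cube's dimension order).
    tm1.cubes.cells.write_value(
        125000, "Budget", ("2025", "Jan", "Revenue", "Local"))
```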
Posted 3 days ago
2.0 - 6.0 years
14 - 15 Lacs
Hyderabad
Work from Office
Career Category: Engineering
Job Description
Join Amgen's Mission of Serving Patients
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease), we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
Associate Data Engineer
What you will do
Let's do this. Let's change the world. In this vital role we seek a skilled Data Engineer to build and optimize our data infrastructure. As a key contributor, you will collaborate closely with cross-functional teams to design and implement robust data pipelines that efficiently extract, transform, and load data into our AWS-based data lake and data warehouse. Your expertise will be instrumental in empowering data-driven decision making through advanced analytics and predictive modeling.
Roles & Responsibilities:
Building and optimizing data pipelines, data warehouses, and data lakes on the AWS and Databricks platforms.
Managing and maintaining the AWS and Databricks environments.
Ensuring data integrity, accuracy, and consistency through rigorous quality checks and monitoring.
Maintaining system uptime and optimal performance.
Working closely with cross-functional teams to understand business requirements and translate them into technical solutions.
Exploring and implementing new tools and technologies to enhance ETL platform performance.
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications: Bachelor's degree and 2 to 6 years of experience.
Functional Skills:
Must-Have Skills:
Proficient in SQL for extracting, transforming, and analyzing complex datasets from both relational and columnar data stores. Proven ability to optimize query performance on big data platforms.
Proficient in leveraging Python, PySpark, and Airflow to build scalable and efficient data ingestion, transformation, and loading processes.
Ability to learn new technologies quickly.
Strong problem-solving and analytical skills.
Excellent communication and teamwork skills.
Good-to-Have Skills:
Experience with SQL/NoSQL databases and vector databases for large language models.
Experience with data modeling and performance tuning for both OLAP and OLTP databases.
Experience with Apache Spark and Apache Airflow.
Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
Experience with AWS, GCP, or Azure cloud services.
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.
In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team: careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
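Since the must-have skills above center on Python, PySpark, and Airflow, here is a minimal Airflow DAG sketch of the extract-transform-load orchestration such a role involves. The DAG id, schedule, and task bodies are hypothetical placeholders (assuming Airflow 2.4+ for the `schedule` argument).

```python
# A minimal Airflow DAG sketch for extract-transform-load orchestration.
# DAG id, schedule, and task callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull source data, e.g. from S3, into a staging area")

def transform():
    print("run PySpark/SQL transformations on the staged data")

def load():
    print("load curated data into the warehouse")

with DAG(dag_id="daily_ingest",
         start_date=datetime(2024, 1, 1),
         schedule="@daily",
         catchup=False) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Enforce the E -> T -> L ordering.
    t_extract >> t_transform >> t_load
```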
Posted 4 days ago
8.0 - 13.0 years
13 - 17 Lacs
Bengaluru
Work from Office
About Netskope
Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, St. Louis, Bangalore, London, Paris, Melbourne, Taipei, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive and interactive. Visit us at Netskope Careers. Please follow us on LinkedIn and Twitter @Netskope.
As a Staff Engineer on the Data Engineering Team, you'll be working on some of the hardest problems in the field of data, cloud, and security, with a mission to achieve the highest standards of customer success. You will be building blocks of technology that will define Netskope's future. You will leverage open-source technologies around OLAP, OLTP, streaming, big data, and ML models. You will help design and build an end-to-end system to manage the data and infrastructure used to improve security insights for our global customer base. You will be part of a growing team of renowned industry experts in the exciting space of data and cloud analytics. Your contributions will have a major impact on our global customer base and across the industry through our market-leading products. You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills.
What you will be doing
Conceiving and building services used by Netskope products to validate, transform, load, and perform analytics on large amounts of data, using distributed systems with cloud scale and reliability.
Helping other teams architect their applications using services from the Data team while using best practices and sound designs.
Evaluating many open-source technologies to find the best fit for our needs, and contributing to some of them.
Working with the Application Development and Product Management teams to scale their underlying services.
Providing easy-to-use analytics of usage patterns, anticipating capacity issues, and helping with long-term planning.
Learning about and designing large-scale, reliable enterprise services.
Working with great people in a fun, collaborative environment.
Creating scalable data mining and data analytics frameworks using cutting-edge tools and techniques.
Required skills and experience
8+ years of industry experience building highly scalable distributed data systems.
Programming experience in Python, Java, or Golang.
Excellent data structure and algorithm skills.
Proven good development practices, such as automated testing and measuring code coverage.
Proven experience developing complex data platforms and solutions using technologies like Kafka, Kubernetes, MySQL, Hadoop, BigQuery, and other open-source databases.
Experience designing and implementing large, fault-tolerant and distributed systems around columnar data stores.
Excellent written and verbal communication skills.
Bonus points for contributions to the open source community.
Education
BSCS or equivalent required, MSCS or equivalent strongly preferred.
#LI-SK3
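To make the streaming side of this concrete, here is an illustrative Python sketch of a Kafka consumer feeding an analytics path, using the kafka-python client. The topic, broker address, and consumer-group names are hypothetical, and the posting does not mandate this particular client library.

```python
# Illustrative Kafka consumer for a streaming ingest path, using the
# kafka-python client. Topic and broker addresses are hypothetical.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "security-events",                        # hypothetical topic
    bootstrap_servers=["broker1:9092"],
    group_id="analytics-loader",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for record in consumer:
    event = record.value
    # In a real pipeline this step would validate, transform, and batch-load
    # the event into a columnar store for analytics.
    print(record.partition, record.offset, event.get("event_type"))
```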
Posted 4 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Azure Data Engineer (immediate starters required)
Locations: Hyderabad, TG & Noida, UP
Timing: 5 PM to 2 AM
Should Have:
Expertise with Azure cloud development infrastructure.
Expertise in designing Azure Data Lake Storage (Gen2), Azure SQL Server, Azure Data Factory, Azure Logic App, Azure Analysis Services, Automation accounts, PowerShell, JSON, and integrations with Azure resources.
Expertise in understanding data warehouse and data mart models to implement reporting-layer designs for enterprise self-service reporting.
In-depth understanding of database management systems, online analytical processing (OLAP), and the ETL (extract, transform, load) framework.
Expertise with SQL queries, SQL Server Reporting Services (SSRS), and SQL Server Integration Services (SSIS).
Expertise with Azure Data Factory working with various data sources and targets.
Expert SQL, T-SQL, and PL/SQL knowledge against a variety of databases like SQL Server, Oracle, Hyperion, and Cache.
Experience with ETL tools like Kettle or DataStage is a plus.
Experience with Hyperion and Kronos is a plus.
Expertise in Oracle and EBS with a concentration on HR.
Should feel comfortable with relational design for high-transaction databases.
Experience with Jira, Confluence, EasyVista, or a similar application preferred.
Ability to adapt to new tools.
Must be able to work independently as well as with a team.
Motivated and self-directed, but with the ability to take direction from others.
Excellent verbal and written communication skills.
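As an illustrative sketch of the reporting-layer SQL work described above, here is a hedged example that queries an Azure SQL database from Python with pyodbc. The server, credentials, and star-schema table names are hypothetical; the posting itself assumes SSRS/SSIS rather than any particular Python client.

```python
# Hedged sketch: querying Azure SQL Server with pyodbc.
# Server, database, credentials, and table names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-server.database.windows.net;"
    "DATABASE=reporting;UID=etl_user;PWD=secret"
)
cursor = conn.cursor()

# A typical reporting-layer aggregate over a data-mart fact table.
cursor.execute("""
    SELECT d.fiscal_month, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.fiscal_month
    ORDER BY d.fiscal_month
""")
for row in cursor.fetchall():
    print(row.fiscal_month, row.total)

conn.close()
```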
Posted 4 days ago
10.0 - 14.0 years
15 - 22 Lacs
Gurugram
Work from Office
ZS Master Data Management Team has an extensive track record of completing over 1000 global projects and partnering with 15 of the top 20 global pharma organizations. They specialize in various MDM domains, offering end-to-end project implementation, change management, and data stewardship support. Their services encompass MDM strategy consulting, implementation for key entities (e.g., HCP, HCO, Employee, Payer, Product, Patient, Affiliations), and operational support including KTLO and data stewardship. With 50+ MDM implementations and change management programs annually for Life Sciences clients, the team has developed valuable assets like MDM libraries and pre-built accelerators. Strategic partnerships with leading platform vendors (Reltio, Informatica, Veeva, Semarchy, etc.) and collaborations with 18+ data vendors and technology providers further enhance their capabilities. You, as Business Technology Solutions Manager, will take ownership of one or more client deliveries at a cross-office level, encompassing the area of digital experience transformation. The successful candidate will work closely with ZS Technology leadership and be responsible for building and managing client relationships, generating new business engagements, and providing thought leadership in the digital area.
What You'll Do
Lead the delivery process, right from discovery/POC to managing operations, across 3-4 client engagements, helping to deliver world-class MDM solutions.
Take ownership to ensure the proposed design/architecture and deliverables meet client expectations and solve the business problem with a high degree of quality.
Partner with the senior leadership team and assist in project management responsibilities, i.e., project planning, staffing management, people growth, etc.
Develop and implement master data management strategies and processes to maintain high-quality master data across the organization.
Design and manage data governance frameworks, including data quality standards, policies, and procedures.
Maintain an outlook for continuous improvement and innovation, and provide necessary mentorship and guidance to the team.
Liaise with the staffing partner and HR business partners for team building/planning.
Lead efforts to build points of view on new technology and problem solving, and drive innovation to build firm intellectual capital.
Actively lead unstructured problem solving to design and build complex solutions, tuned to meet expected performance and functional requirements.
Stay current with industry trends and emerging technologies in master data management and data governance.
What You'll Bring
Bachelor's/Master's degree with specialization in Computer Science, MIS, IT, or other computer-related disciplines.
10-14 years of relevant consulting-industry experience (preferably Healthcare and Life Sciences) working on medium-to-large-scale MDM solution delivery engagements.
5+ years of hands-on experience in designing and implementing MDM services and capabilities using tools such as Informatica MDM, Reltio, etc.
Strong understanding of data management principles, including data modeling, data quality, and metadata management.
Strong understanding of various cloud-based data management (ETL) platforms such as AWS, Azure, Snowflake, etc.
Experience in designing and driving delivery of mid-to-large-scale solutions on cloud platforms.
Experience with ETL design and development, and OLAP tools to support business applications.
Additional Skills
Ability to manage a virtual, global team environment that contributes to the overall timely delivery of multiple projects.
Knowledge of current data modeling and data warehouse concepts, issues, practices, methodologies, and trends in the Business Intelligence domain.
Experience with analyzing and troubleshooting the interaction between databases, operating systems, and applications.
Significant supervisory, coaching, and hands-on project management skills.
Willingness to travel to other global offices as needed to work with clients or other internal project teams.
Posted 4 days ago
4.0 years
3 - 5 Lacs
Gurgaon
Remote
Job description
About this role
What are Aladdin and Aladdin Engineering?
You will be working on BlackRock's investment operating system called Aladdin, which is used both internally within BlackRock and externally by many financial institutions. Aladdin combines sophisticated risk analytics with comprehensive portfolio management, trading, and operations tools on a single platform. It powers informed decision-making and creates a connective tissue for thousands of users investing worldwide.
Our development teams are part of Aladdin Engineering. We collaborate to build the next generation of technology that transforms the way information, people, and technology intersect for global investment firms. We build and package tools that manage trillions in assets and support millions of financial instruments. We perform risk calculations and process millions of transactions for thousands of users worldwide every day.
Your Team:
The Database Hosting Team is a key part of Platform Hosting Services, which operates under the broader Aladdin Engineering group. Hosting Services is responsible for managing the reliability, stability, and performance of the firm's financial systems, including Aladdin, and ensuring its availability to our business partners and customers. We are a globally distributed team, spanning multiple regions, providing engineering and operational support for online transaction processing, data warehousing, data replication, and distributed data processing platforms.
Your Role and Impact:
Data is the backbone of any world-class financial institution. The Database Operations Team ensures the resiliency and integrity of that data while providing instantaneous access to a large global user base at BlackRock and across many institutional clients. As specialists in database technology, our team is involved in every aspect of system design, implementation, tuning, and monitoring, using a wide variety of industry-leading database technologies. We also develop code to provide analysis and insights, and to automate our solutions at scale. Although our specialty is database technology, to excel in our role we must understand the environment in which our technology operates. This includes understanding the business needs, the application server stack, and interactions between database software, operating systems, and host hardware to deliver the best possible service. We are passionate about performance and innovation. At every level of the firm, we embrace diversity and offer flexibility to enhance work-life balance.
Your Responsibilities:
The role involves providing operations, development, and project support within the global database environment across various platforms. Key responsibilities include:
Operational Support for Database Technology:
Engineering, administration, and operations of OLTP, OLAP, data warehousing platforms, and distributed NoSQL systems.
Collaboration with infrastructure teams, application developers, and business teams across time zones to deliver high-quality service to Aladdin users.
Automation and development of database operational, monitoring, and maintenance toolsets to achieve scalability and efficiency.
Database configuration management, capacity and scale management, schema releases, consistency, security, disaster recovery, and audit management.
Managing operational incidents, conducting root-cause analysis, resolving critical issues, and mitigating future risks.
Assessing issues for severity, troubleshooting proactively, and ensuring timely resolution of critical system issues.
Escalating outages when necessary, collaborating with Client Technical Services and other teams, and coordinating with external vendors for support.
Project-Based Participation:
Involvement in major upgrades and migration/consolidation exercises.
Exploring and implementing new product features.
Contributing to performance tuning and engineering activities.
Contributing to Our Software Toolset:
Enhancing monitoring and maintenance utilities in Perl, Python, and Java.
Contributing to data captures to enable deeper system analysis.
Qualifications:
B.E./B.Tech/MCA or another relevant engineering degree from a reputable university.
4+ years of proven experience in database administration or a similar role.
Skills and Experience:
Enthusiasm for acquiring new technical skills.
Effective communication with senior management from both IT and business areas.
Understanding of large-scale enterprise application setups across data centers/cloud environments.
Willingness to work weekends on DBA activities and shift hours.
Experience with database platforms like SAP Sybase, Microsoft SQL Server, Apache Cassandra, Cosmos DB, and PostgreSQL, and with data warehouse platforms such as Snowflake and Greenplum.
Exposure to public cloud platforms such as Microsoft Azure, AWS, and Google Cloud.
Knowledge of programming languages like Python, Perl, Java, and Go; automation tools such as Ansible/AWX; and source control systems like Git and Azure DevOps.
Experience with operating systems like Linux and Windows.
Strong background in supporting mission-critical applications and performing deep technical analysis.
Flexibility to work with various technologies and write high-quality code.
Exposure to project management.
Passion for interactive troubleshooting, operational support, and innovation.
Creativity and a drive to learn new technologies.
Data-driven problem-solving skills and a desire to scale technology for future needs.
Operating Systems:
Familiarity with Linux/Windows.
Proficiency with shell commands (grep, find, sed, awk, ls, cp, netstat, etc.).
Experience checking system performance metrics like CPU, memory, and disk usage on Unix/Linux.
Other Personal Characteristics:
Integrity and the highest ethical standards.
Ability to quickly adjust to complex data and information, displaying strong learning agility.
Self-starter with a commitment to superior performance.
Natural curiosity and a desire to always learn.
If this excites you, we would love to discuss your potential role on our team!
Our benefits
To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.
Our hybrid work model
BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation.
As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law. Job Requisition # R255448
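Given the posting's emphasis on monitoring utilities written in Python and on checking CPU, memory, and disk usage, here is a small illustrative sketch using the psutil library. The alert thresholds are arbitrary placeholders, not values from the posting.

```python
# Small sketch, in the spirit of the monitoring utilities mentioned above,
# that samples host metrics a DBA might watch. Thresholds are illustrative.
import psutil

def host_health(cpu_warn=85.0, mem_warn=90.0, disk_warn=90.0):
    """Return (metric, value, breached) tuples for CPU, memory, and disk."""
    cpu = psutil.cpu_percent(interval=1)        # 1-second CPU sample
    mem = psutil.virtual_memory().percent       # RAM utilisation
    disk = psutil.disk_usage("/").percent       # root filesystem usage
    return [
        ("cpu_percent", cpu, cpu > cpu_warn),
        ("memory_percent", mem, mem > mem_warn),
        ("disk_percent", disk, disk > disk_warn),
    ]

for metric, value, breached in host_health():
    flag = "ALERT" if breached else "ok"
    print(f"{metric}: {value:.1f}% [{flag}]")
```

A production version would feed these samples into an alerting pipeline rather than print them.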
Posted 4 days ago
8.0 - 13.0 years
25 - 40 Lacs
Mumbai, Hyderabad
Work from Office
Essential Services: Role & Location Fungibility
At ICICI Bank, we believe in serving our customers beyond our role definition, product boundaries, and domain limitations through our philosophy of customer 360-degree. In essence, this captures our belief in serving the entire banking needs of our customers as One Bank, One Team. To achieve this, employees at ICICI Bank are expected to be role- and location-fungible, with the understanding that banking is an essential service. The role descriptions give you an overview of the responsibilities; they are only directional and guiding in nature.
About the Role:
As a Data Warehouse Architect, you will be responsible for managing and enhancing a data warehouse that manages large volumes of customer life-cycle data flowing in from various applications, within the guardrails of risk and compliance. You will be managing the day-to-day operations of the data warehouse, i.e., Vertica. In this role, you will manage a team of data warehouse engineers covering data modelling, ETL data pipeline design, issue management, upgrades, performance fine-tuning, migration, governance, and the security framework of the data warehouse. This role enables the Bank to maintain huge data sets in a structured manner that is amenable to data intelligence. The data warehouse supports numerous information systems used by various business groups to derive insights. As a natural progression, the data warehouse will be gradually migrated to a data lake, enabling better analytical advantage. The role holder will also be responsible for guiding the team towards this migration.
Key Responsibilities:
Data Pipeline Design: Responsible for designing and developing ETL data pipelines that help in organising large volumes of data. Use data warehousing technologies to ensure that the data warehouse is efficient, scalable, and secure.
Issue Management: Responsible for ensuring that the data warehouse is running smoothly. Monitor system performance, diagnose and troubleshoot issues, and make necessary changes to optimize system performance.
Collaboration: Collaborate with cross-functional teams to implement upgrades, migrations, and continuous improvements.
Data Integration and Processing: Responsible for processing, cleaning, and integrating large data sets from various sources to ensure that the data is accurate, complete, and consistent.
Data Modelling: Responsible for designing and implementing data modelling solutions to ensure that the organization's data is properly structured and organized for analysis.
Key Qualifications & Skills:
Education Qualification: B.E./B.Tech. in Computer Science, Information Technology, or an equivalent domain, with 10 to 12 years of experience and at least 5 years of relevant work experience in data warehousing/mining/BI/MIS.
Experience in Data Warehousing: Knowledge of ETL and data technologies, and the ability to outline a future vision for OLTP and OLAP (Oracle/MS SQL). Data modelling, data analysis, and visualization experience (experience with analytical tools like Power BI, SAS, QlikView, Tableau, etc.).
Good to have exposure to Azure cloud data platform services like Cosmos DB, Azure Data Lake, Azure Synapse, and Azure Data Factory.
Synergize with the Team: Regular interaction with business/product/functional teams to create mobility solutions.
Certification: Azure certified DP-900, PL-300, DP-203, or any other data platform/data analyst certifications.
About the Business Group The Technology Group at ICICI Bank is at the forefront of our operations and offerings, which are focused on leveraging state-of-the-art technology to provide customer-centric solutions. This group plays a pivotal role in our vision of the transition from Bank to Bank Tech. Further, the group offers round-the-clock support to our entire banking ecosystem. In our persistent efforts to provide products and solutions that genuinely touch customers, unlocking the potential of technology in every single engagement would go a long way in creating customer delight. In this endeavor, we also tirelessly ensure all our processes, systems, and infrastructure are very well within the guardrails of data security, privacy, and relevant regulations.
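To illustrate the Vertica operations this listing describes, here is a hedged Python sketch using the vertica-python client to run a quick post-ETL consistency check. The host, credentials, and database name are hypothetical; the query against the v_monitor.projection_storage system table is one common way to sample per-schema row counts, not a procedure prescribed by the posting.

```python
# Hedged sketch: a warehouse health check against Vertica using the
# vertica-python client. Connection details are hypothetical.
import vertica_python

conn_info = {
    "host": "vertica.example.internal",
    "port": 5433,
    "user": "dwh_admin",
    "password": "secret",
    "database": "edw",
}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    # Row counts per schema: a quick consistency check after an ETL run.
    cur.execute("""
        SELECT anchor_table_schema, SUM(row_count) AS rows
        FROM v_monitor.projection_storage
        GROUP BY anchor_table_schema
        ORDER BY rows DESC
    """)
    for schema, rows in cur.fetchall():
        print(schema, rows)
```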
Posted 4 days ago
1.0 years
0 Lacs
Tamil Nadu, India
Remote
Role: OneStream Developer
Type: 1-Year Contract (Freelance/Contractor/Consultant)
Location: Remote
Experience: 3-6 years
Shift Timing: 4 PM to 1 AM IST
Requirements:
Strong proficiency in VB.NET, SQL, OneStream Studio, Cube Views, and Workflow configuration.
Understanding of OLAP concepts, financial consolidation, intercompany eliminations, currency translation, and financial planning logic.
Experience with data integrations, including scripting and automation of ETL processes.
Preferred Qualifications (Nice to Have):
OneStream certification (e.g., OneStream Certified Professional).
Experience with BI tools (Power BI, Tableau).
Familiarity with Agile or hybrid project delivery methodologies.
Background in finance or accounting is a strong plus.
Posted 4 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About the Company
We at Saxon AI are looking for candidates with strong infrastructure expertise along with data engineering skills.
About the Role
Experience: 4+ years
Location: Hyderabad, Pune, Noida
Job Description:
Expertise with Azure cloud development infrastructure.
Expertise in designing Azure Data Lake Storage (Gen2), Azure SQL Server, Azure Data Factory, Azure Logic App, Azure Analysis Services, Automation accounts, PowerShell, JSON, and integrations with Azure resources.
Expertise in understanding data warehouse and data mart models to implement reporting-layer designs for enterprise self-service reporting.
In-depth understanding of database management systems, online analytical processing (OLAP), and the ETL (extract, transform, load) framework.
Expertise with SQL queries, SQL Server Reporting Services (SSRS), and SQL Server Integration Services (SSIS).
Expertise with Azure Data Factory working with various data sources and targets.
Expert SQL, T-SQL, and PL/SQL knowledge against a variety of databases like SQL Server, Oracle, Hyperion, and Cache.
Experience with ETL tools like Kettle or DataStage is a plus.
Experience with Hyperion and Kronos is a plus.
Expertise in Oracle and EBS with a concentration on HR.
Should feel comfortable with relational design for high-transaction databases.
Preferred Skills
Experience with Jira, Confluence, EasyVista, or a similar application preferred.
Ability to adapt to new tools.
Must be able to work independently as well as with a team.
Motivated and self-directed, but with the ability to take direction from others.
Excellent verbal and written communication skills.
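To illustrate the Azure Data Factory automation this posting emphasizes, here is a hedged sketch that starts a pipeline run from Python with the azure-mgmt-datafactory SDK. The subscription, resource group, factory, pipeline, and parameter names are all hypothetical placeholders.

```python
# Hedged sketch: triggering an Azure Data Factory pipeline run with the
# azure-mgmt-datafactory SDK. All resource names are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# DefaultAzureCredential picks up a service principal, managed identity,
# or developer login, depending on the environment.
credential = DefaultAzureCredential()
client = DataFactoryManagementClient(credential, "<subscription-id>")

run = client.pipelines.create_run(
    resource_group_name="rg-data",
    factory_name="adf-enterprise",
    pipeline_name="nightly_hr_load",
    parameters={"load_date": "2025-01-31"},
)
print("Started pipeline run:", run.run_id)
```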
Posted 4 days ago
3.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY GDS – Data and Analytics (D&A) – Senior – Palantir
Job Overview
We are looking for a Big Data Developer/Senior Data Engineer with 3 to 6+ years of experience who displays strong analytical, problem-solving, and programming skills, an understanding of business KPIs, and good communication skills. They should be self-motivated learners and detail-oriented team members who can consistently meet deadlines and possess the ability to work independently as needed. They must be able to multi-task and demonstrate the ability to work with a diverse group of stakeholders across the healthcare, life sciences, and pharmaceutical domains.
Responsibilities and Duties
Technical:
Design, develop, and maintain data models, integrations, and workflows within Palantir Foundry.
Detailed understanding and hands-on knowledge of Palantir solutions (e.g., Usecare, DTI, Code Repository, Pipeline Builder, etc.).
Analysing data within Palantir to extract insights for easy interpretation and exploratory data analysis (e.g., Contour).
Querying and programming skills: utilizing programming languages and query scripts (e.g., Python, SQL) to interact with the data and perform analyses.
Understanding relational data structures and data modelling to optimize data storage and retrieval based on OLAP engine principles.
Distributed frameworks with automation, using Spark APIs (e.g., PySpark, Spark SQL, RDD/DataFrame) to automate processes and workflows within Palantir, together with external libraries (e.g., Pandas, NumPy, etc.).
API integration: integrating Palantir with other systems and applications using APIs for seamless data flow. Understanding of integration analysis, specification, and solution design for different scenarios (e.g., batch/real-time flows, incremental loads, etc.).
Optimize data pipelines and fine-tune Foundry configurations to enhance system performance and efficiency.
Unit testing, issue identification, debugging and troubleshooting, and end-user documentation.
Strong experience with data warehousing, data engineering, and data modelling problem statements.
Knowledge of security-related principles, ensuring data privacy and security while working with sensitive information.
Familiarity with integrating machine learning and AI capabilities within the Palantir environment for advanced analytics.
Non-Technical:
Collaborate with stakeholders to identify opportunities for continuous improvement, understand business needs, and drive innovation in data processes and solutions.
Ensure compliance with policies for data privacy, security, and regulatory requirements.
Provide training and support to end-users to maximize the effective use of Palantir Foundry.
Self-driven learning of technologies adopted to meet organizational requirements.
Work as part of a team or individually as an engineer in a highly collaborative fashion.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
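For readers unfamiliar with Foundry code repositories, here is a minimal, hedged sketch of a PySpark transform using the transforms.api decorators this role alludes to. It only runs inside a Foundry code repository, and the dataset paths and column names are hypothetical.

```python
# Minimal sketch of a Palantir Foundry Code Repository transform.
# Runs only inside a Foundry repository; paths and columns are hypothetical.
from pyspark.sql import functions as F
from transforms.api import transform_df, Input, Output

@transform_df(
    Output("/Pharma/clean/patient_visits"),
    raw=Input("/Pharma/raw/patient_visits"),
)
def clean_visits(raw):
    # Standardise types, drop duplicates, and derive a month key for OLAP use.
    return (raw
            .withColumn("visit_date", F.to_date("visit_date"))
            .dropDuplicates(["visit_id"])
            .withColumn("visit_month",
                        F.date_format("visit_date", "yyyy-MM")))
```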
Posted 4 days ago
3.0 years
0 Lacs
Kochi, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY GDS – Data and Analytics (D&A) – Senior – Palantir
Job Overview
We are looking for a Big Data Developer/Senior Data Engineer with 3 to 6+ years of experience who displays strong analytical, problem-solving, and programming skills, an understanding of business KPIs, and good communication skills. They should be self-motivated learners and detail-oriented team members who can consistently meet deadlines and possess the ability to work independently as needed. They must be able to multi-task and demonstrate the ability to work with a diverse group of stakeholders across the healthcare, life sciences, and pharmaceutical domains.
Responsibilities and Duties
Technical:
Design, develop, and maintain data models, integrations, and workflows within Palantir Foundry.
Detailed understanding and hands-on knowledge of Palantir solutions (e.g., Usecare, DTI, Code Repository, Pipeline Builder, etc.).
Analysing data within Palantir to extract insights for easy interpretation and exploratory data analysis (e.g., Contour).
Querying and programming skills: utilizing programming languages and query scripts (e.g., Python, SQL) to interact with the data and perform analyses.
Understanding relational data structures and data modelling to optimize data storage and retrieval based on OLAP engine principles.
Distributed frameworks with automation, using Spark APIs (e.g., PySpark, Spark SQL, RDD/DataFrame) to automate processes and workflows within Palantir, together with external libraries (e.g., Pandas, NumPy, etc.).
API integration: integrating Palantir with other systems and applications using APIs for seamless data flow. Understanding of integration analysis, specification, and solution design for different scenarios (e.g., batch/real-time flows, incremental loads, etc.).
Optimize data pipelines and fine-tune Foundry configurations to enhance system performance and efficiency.
Unit testing, issue identification, debugging and troubleshooting, and end-user documentation.
Strong experience with data warehousing, data engineering, and data modelling problem statements.
Knowledge of security-related principles, ensuring data privacy and security while working with sensitive information.
Familiarity with integrating machine learning and AI capabilities within the Palantir environment for advanced analytics.
Non-Technical:
Collaborate with stakeholders to identify opportunities for continuous improvement, understand business needs, and drive innovation in data processes and solutions.
Ensure compliance with policies for data privacy, security, and regulatory requirements.
Provide training and support to end-users to maximize the effective use of Palantir Foundry.
Self-driven learning of technologies adopted to meet organizational requirements.
Work as part of a team or individually as an engineer in a highly collaborative fashion.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 4 days ago
3.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY GDS – Data and Analytics (D&A) – Senior – Palantir
Job Overview
We are looking for a Big Data Developer/Senior Data Engineer with 3 to 6+ years of experience who displays strong analytical, problem-solving, and programming skills, an understanding of business KPIs, and good communication skills. They should be self-motivated learners and detail-oriented team members who can consistently meet deadlines and possess the ability to work independently as needed. They must be able to multi-task and demonstrate the ability to work with a diverse group of stakeholders across the healthcare, life sciences, and pharmaceutical domains.
Responsibilities and Duties
Technical:
Design, develop, and maintain data models, integrations, and workflows within Palantir Foundry.
Detailed understanding and hands-on knowledge of Palantir solutions (e.g., Usecare, DTI, Code Repository, Pipeline Builder, etc.).
Analysing data within Palantir to extract insights for easy interpretation and exploratory data analysis (e.g., Contour).
Querying and programming skills: utilizing programming languages and query scripts (e.g., Python, SQL) to interact with the data and perform analyses.
Understanding relational data structures and data modelling to optimize data storage and retrieval based on OLAP engine principles.
Distributed frameworks with automation, using Spark APIs (e.g., PySpark, Spark SQL, RDD/DataFrame) to automate processes and workflows within Palantir, together with external libraries (e.g., Pandas, NumPy, etc.).
API integration: integrating Palantir with other systems and applications using APIs for seamless data flow. Understanding of integration analysis, specification, and solution design for different scenarios (e.g., batch/real-time flows, incremental loads, etc.).
Optimize data pipelines and fine-tune Foundry configurations to enhance system performance and efficiency.
Unit testing, issue identification, debugging and troubleshooting, and end-user documentation.
Strong experience with data warehousing, data engineering, and data modelling problem statements.
Knowledge of security-related principles, ensuring data privacy and security while working with sensitive information.
Familiarity with integrating machine learning and AI capabilities within the Palantir environment for advanced analytics.
Non-Technical:
Collaborate with stakeholders to identify opportunities for continuous improvement, understand business needs, and drive innovation in data processes and solutions.
Ensure compliance with policies for data privacy, security, and regulatory requirements.
Provide training and support to end-users to maximize the effective use of Palantir Foundry.
Self-driven learning of technologies adopted to meet organizational requirements.
Work as part of a team or individually as an engineer in a highly collaborative fashion.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 4 days ago
16.0 - 20.0 years
0 Lacs
Karnataka
On-site
As a Principal BI Architect, your primary responsibility will be to develop solution blueprints and architecture specifications for evolving business processes, specifically focusing on business planning and goal setting. You will collaborate with other architects to explore specific solutions and define project scopes. Serving as a liaison to the business community, you will ensure that their needs are fully understood by the project team. Additionally, you will review prototypes, solution blueprints, and project scopes to ensure alignment with business requirements. Your role will involve raising awareness of solutions and assisting the change management team in developing strategies to improve user adoption. You will work closely with end-users to identify opportunities for enhanced data management and delivery. Collaboration with development and testing teams is essential to ensure proper implementation and to streamline processes related to data flow and quality. Staying current with industry trends, you will advise and educate management on their importance and impact. Leading and aligning multiple BI Engineers across various use cases towards standards and policies will be part of your responsibilities. You must be highly experienced in defining a common semantic layer, utilizing search capabilities, and acting as the design authority familiar with information-consumption patterns.
In terms of qualifications, you should possess a balanced mix of technical and business skills, with at least 16 years of experience. Proficiency in data analysis, visualization, problem-solving, and strategic direction is crucial. You should be well-versed in BI technologies such as Power BI and Thoughtspot, and proficient in defining and verifying OLAP strategies and frameworks. Strong communication and interpersonal skills are essential, along with the ability to instill a strong customer-service and business-oriented ethic in the team. Balancing data needs with action requirements, demonstrating strong management and leadership skills, and maintaining a proactive, participative management style are key attributes for this role. Expertise in enterprise data management and in requirements planning, formulation, and documentation would be a plus. Knowledge of BioPharma manufacturing and supply chain is also beneficial.
The tech stack for this role includes Power BI, Thoughtspot, Azure Cloud, Databricks, SQL, PL/SQL, Azure Analysis Services, Azure Synapse, data lakes, lakehouses, semantic-layer tools, data virtualization tools, Power Automate, and PowerApps.
Posted 4 days ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY GDS – Data and Analytics (D&A) – Senior – Palantir
Job Overview
We are looking for a Big Data Developer/Senior Data Engineer with 3 to 6+ years of experience who displays strong analytical, problem-solving, and programming skills, an understanding of business KPIs, and good communication skills. They should be self-motivated learners and detail-oriented team members who can consistently meet deadlines and possess the ability to work independently as needed. They must be able to multi-task and demonstrate the ability to work with a diverse group of stakeholders across the healthcare, life sciences, and pharmaceutical domains.
Responsibilities and Duties
Technical:
Design, develop, and maintain data models, integrations, and workflows within Palantir Foundry.
Detailed understanding and hands-on knowledge of Palantir solutions (e.g., Usecare, DTI, Code Repository, Pipeline Builder, etc.).
Analysing data within Palantir to extract insights for easy interpretation and exploratory data analysis (e.g., Contour).
Querying and programming skills: utilizing programming languages and query scripts (e.g., Python, SQL) to interact with the data and perform analyses.
Understanding relational data structures and data modelling to optimize data storage and retrieval based on OLAP engine principles.
Distributed frameworks with automation, using Spark APIs (e.g., PySpark, Spark SQL, RDD/DataFrame) to automate processes and workflows within Palantir, together with external libraries (e.g., Pandas, NumPy, etc.).
API integration: integrating Palantir with other systems and applications using APIs for seamless data flow. Understanding of integration analysis, specification, and solution design for different scenarios (e.g., batch/real-time flows, incremental loads, etc.).
Optimize data pipelines and fine-tune Foundry configurations to enhance system performance and efficiency.
Unit testing, issue identification, debugging and troubleshooting, and end-user documentation.
Strong experience with data warehousing, data engineering, and data modelling problem statements.
Knowledge of security-related principles, ensuring data privacy and security while working with sensitive information.
Familiarity with integrating machine learning and AI capabilities within the Palantir environment for advanced analytics.
Non-Technical:
Collaborate with stakeholders to identify opportunities for continuous improvement, understand business needs, and drive innovation in data processes and solutions.
Ensure compliance with policies for data privacy, security, and regulatory requirements.
Provide training and support to end-users to maximize the effective use of Palantir Foundry.
Self-driven learning of technologies adopted to meet organizational requirements.
Work as part of a team or individually as an engineer in a highly collaborative fashion.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 4 days ago
3.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
EY GDS – Data and Analytics (D&A) – Senior – Palantir
Job Overview
We are looking for a Big Data Developer/Senior Data Engineer with 3 to 6+ years of experience who displays strong analytical, problem-solving, and programming skills, an understanding of business KPIs, and good communication skills. They should be self-motivated learners and detail-oriented team members who can consistently meet deadlines and possess the ability to work independently as needed. They must be able to multi-task and demonstrate the ability to work with a diverse group of stakeholders across the healthcare, life sciences, and pharmaceutical domains.
Responsibilities and Duties
Technical:
Design, develop, and maintain data models, integrations, and workflows within Palantir Foundry.
Detailed understanding and hands-on knowledge of Palantir solutions (e.g., Usecare, DTI, Code Repository, Pipeline Builder, etc.).
Analysing data within Palantir to extract insights for easy interpretation and exploratory data analysis (e.g., Contour).
Querying and programming skills: utilizing programming languages and query scripts (e.g., Python, SQL) to interact with the data and perform analyses.
Understanding relational data structures and data modelling to optimize data storage and retrieval based on OLAP engine principles.
Distributed frameworks with automation, using Spark APIs (e.g., PySpark, Spark SQL, RDD/DataFrame) to automate processes and workflows within Palantir, together with external libraries (e.g., Pandas, NumPy, etc.).
API integration: integrating Palantir with other systems and applications using APIs for seamless data flow. Understanding of integration analysis, specification, and solution design for different scenarios (e.g., batch/real-time flows, incremental loads, etc.).
Optimize data pipelines and fine-tune Foundry configurations to enhance system performance and efficiency.
Unit testing, issue identification, debugging and troubleshooting, and end-user documentation.
Strong experience with data warehousing, data engineering, and data modelling problem statements.
Knowledge of security-related principles, ensuring data privacy and security while working with sensitive information.
Familiarity with integrating machine learning and AI capabilities within the Palantir environment for advanced analytics.
Non-Technical:
Collaborate with stakeholders to identify opportunities for continuous improvement, understand business needs, and drive innovation in data processes and solutions.
Ensure compliance with policies for data privacy, security, and regulatory requirements.
Provide training and support to end-users to maximize the effective use of Palantir Foundry.
Self-driven learning of technologies adopted to meet organizational requirements.
Work as part of a team or individually as an engineer in a highly collaborative fashion.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 4 days ago
8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
You will be responsible for defining and driving the platform engineering of business intelligence solutions, with a specific focus on Power BI technology. Your role as a Sr. Specialist in Visualization & Automation will be based in Hyderabad, India.
Your key responsibilities will include, but not be limited to:
- Demonstrating strong Power BI skills to oversee the creation and management of BI and analytics solutions.
- Ensuring the success of technology usage for solution delivery, best-practice and standards definition, compliance, smooth transition to operations, improvements, and enablement of the business.
- Supporting the solution delivery lead and visualization lead on existing, new, and upcoming features, technology decisions, and the roadmap.
- Collaborating with the solution architect and platform architect on visualization architecture patterns based on functional and non-functional requirements.
- Defining and driving the DevOps roadmap to enable agile ways of working, a CI/CD pipeline, and automation for self-serve governance of the Power BI platform.
- Ensuring adherence to security and compliance policies and procedures.
In order to be successful in this role, you should possess:
- 8-10 years of IT experience in the data, analytics, and visualization area, with strong exposure to Power BI solution delivery and platform automation.
- In-depth understanding of database management systems and ETL, OLAP, and data lake technologies.
- Experience in Power BI, with knowledge of other visualization technologies considered a plus.
- Pharma domain specialization and understanding of data usage across the end-to-end enterprise value chain.
- Excellent interpersonal, written, and verbal communication skills, aligned with Novartis Values & Behaviors.
- Experience in managing vendor and customer expectations, driving techno-functional discussions, and understanding project financial processes.
At Novartis, we are committed to building an outstanding, inclusive work environment and diverse teams that represent the patients and communities we serve. By joining us, you will be part of a mission to reimagine medicine to improve and extend people's lives. We believe that our associates are crucial in driving us towards our ambitions. If you are looking to be part of a community that strives to make a difference, we invite you to consider joining our Novartis family.
Novartis is dedicated to creating a workplace that values diversity and inclusion, where every individual feels respected and empowered. If this specific role is not suitable for you but you are interested in staying connected with Novartis for future career opportunities, you can join the Novartis Network.
In conclusion, at Novartis, we are committed to creating a brighter future together by collaborating, supporting, and inspiring each other. If you are passionate about making a positive impact on patients' lives through innovative science and community support, we encourage you to explore career opportunities with us. Please refer to the official Novartis website to learn more about our commitment to diversity, benefits, rewards, and how you can be a part of our mission to become the most valued and trusted medicines company in the world.
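To give a flavour of the Power BI platform automation this role describes, here is a hedged Python sketch that queues a dataset refresh through the Power BI REST API. The workspace and dataset IDs are placeholders, and the Azure AD token acquisition (typically via MSAL and a service principal) is deliberately elided.

```python
# Hedged sketch: queuing a Power BI dataset refresh via the REST API.
# The token acquisition is elided; IDs are placeholders.
import requests

token = "<AAD access token for the Power BI service>"  # e.g. obtained via MSAL
group_id = "<workspace-id>"
dataset_id = "<dataset-id>"

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
    f"/datasets/{dataset_id}/refreshes",
    headers={"Authorization": f"Bearer {token}"},
    json={"notifyOption": "MailOnFailure"},
)
# 202 Accepted means the refresh was queued successfully.
print(resp.status_code)
```

Wrapping calls like this in a CI/CD pipeline is one common way to implement the self-serve governance the posting mentions.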
Posted 4 days ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a Data Engineer (Power BI) at Acronotics Limited, you will play a crucial role in designing and managing data pipelines that integrate Power BI, OLAP cubes, documents such as PDFs and presentations, and external data sources with Azure AI. Your primary responsibility will be to ensure that both structured and unstructured financial data is properly indexed and made accessible for semantic search and LLM applications.

Your key responsibilities in this full-time, on-site role based in Bengaluru will include extracting data from Power BI datasets, semantic models, and OLAP cubes. You will connect and transform data using Azure Synapse, Data Factory, and Lakehouse architecture. Additionally, you will preprocess PDFs, PPTs, and Excel files utilizing tools like Azure Form Recognizer or Python-based solutions (a brief sketch follows this posting). Your role will also involve designing data ingestion pipelines for external web sources, such as commodity prices, and collaborating with AI engineers to provide cleaned and contextual data for vector indexes.

To be successful in this role, you should have a strong background in using the Power BI REST/XMLA APIs and expertise in OLAP systems like SSAS and SAP BW, data modeling, and ETL design. Hands-on experience with Azure Data Factory, Synapse, or Data Lake is essential, along with familiarity with JSON, DAX, and M queries.

Join Acronotics Limited in revolutionizing businesses with cutting-edge robotic automation and artificial intelligence solutions. Let your expertise in data engineering contribute to the advancement of automated solutions that redefine how products are manufactured, marketed, and consumed. Discover the possibilities with Radium AI, our innovative product automating bot monitoring and support activities, on our website today.
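A minimal sketch of the document-preprocessing step mentioned above, using Azure's Form Recognizer (Document Intelligence) Python SDK. The endpoint, key, and file name are placeholders, and the prebuilt-layout model is just one reasonable choice, not necessarily what Acronotics uses:

```python
from azure.ai.formrecognizer import DocumentAnalysisClient
from azure.core.credentials import AzureKeyCredential

# Placeholders: supply your own Form Recognizer resource endpoint and key.
client = DocumentAnalysisClient(
    endpoint="https://<resource-name>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<api-key>"),
)

# Extract text and tables from a financial PDF so it can be chunked
# and pushed into a vector index downstream.
with open("quarterly_report.pdf", "rb") as f:  # hypothetical file
    poller = client.begin_analyze_document("prebuilt-layout", document=f)
result = poller.result()

paragraphs = [p.content for p in result.paragraphs]
for table in result.tables:
    for cell in table.cells:
        # Row/column indices let you rebuild the table structure later.
        print(cell.row_index, cell.column_index, cell.content)
```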
Posted 4 days ago
4.0 - 8.0 years
0 Lacs
Nagpur, Maharashtra
On-site
As an ETL Developer with 4 to 8 years of experience, you will be responsible for hands-on ETL development using the Talend tool. You should be highly proficient in writing complex yet efficient SQL queries. Your role will involve working extensively with PL/SQL packages, procedures, functions, triggers, views, materialized views, external tables, partitions, and exception handling for retrieving, manipulating, checking, and migrating complex data sets in Oracle (a brief sketch follows this posting).

In this position, it is essential to have experience with data modeling and warehousing concepts such as star schema, OLAP, OLTP, snowflake schema, fact tables for measurements, and dimension tables. Additionally, familiarity with UNIX scripting, Python/Spark, and big data concepts will be beneficial.

If you are a detail-oriented individual with strong expertise in ETL development, SQL, PL/SQL, data modeling, and warehousing concepts, this role offers an exciting opportunity to work with cutting-edge technologies and contribute to the success of the organization.
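A hedged sketch of the kind of Oracle data-migration step this role describes, driven from Python with the python-oracledb driver. The connection details, table names, and MERGE logic are illustrative assumptions, not a specific Talend job:

```python
import oracledb

# Placeholders: supply real credentials/DSN for your Oracle instance.
conn = oracledb.connect(user="etl_user", password="<password>",
                        dsn="dbhost:1521/ORCLPDB1")

# Hypothetical upsert from a staging table into a dimension table.
merge_sql = """
    MERGE INTO dim_customer d
    USING stg_customer s
       ON (d.customer_id = s.customer_id)
    WHEN MATCHED THEN UPDATE SET d.name = s.name, d.city = s.city
    WHEN NOT MATCHED THEN INSERT (customer_id, name, city)
         VALUES (s.customer_id, s.name, s.city)
"""

try:
    with conn.cursor() as cur:
        cur.execute(merge_sql)   # upsert staged rows into the dimension
    conn.commit()
except oracledb.DatabaseError:
    conn.rollback()              # mirror PL/SQL-style exception handling
    raise
finally:
    conn.close()
```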
Posted 4 days ago
11.0 - 20.0 years
12 - 16 Lacs
Pune
Hybrid
Roles & Responsibilities:
The Senior Tech Lead specializing in Analytics and Visualization leads the design, development, and optimization of data visualization solutions.
- Lead the design and implementation of analytics and visualization solutions using tools like Power BI and Tableau.
- Architect and optimize dashboards and reports for performance, scalability, and user experience.
- Provide technical leadership and mentorship to a team of data analysts and visualization experts.
- Collaborate with stakeholders to define project requirements and ensure alignment with business goals.
- Ensure best practices in data visualization, governance, and security.
- Troubleshoot and resolve complex technical issues in analytics and visualization environments.
- Strong experience in presales, RFP responses, and customer connects.

Requirements:
- Strong experience in data preparation and BI projects: understanding business requirements in a BI context and understanding the data model to transform raw data into meaningful insights in Power BI.
- Design, develop, test, and deploy reports in Power BI.
- Strong exposure to visualization, transformation, and data analysis.
- Connecting to data sources, importing data, and transforming data for Business Intelligence.
- Good exposure to DAX queries in Power BI Desktop.
- Experience in publishing and scheduling Power BI/Tableau reports (a brief sketch of scripted refresh automation follows this posting).
- Knowledge and experience in prototyping, designing, and requirement analysis.
- Sound knowledge of database management, SQL querying, data warehousing, business intelligence, and OLAP (Online Analytical Processing).
- Must have solutioning and presales experience.

Primary Skill: Power BI; Secondary Skill: Tableau
Total Experience Range: 11 - 20 Yrs

Our Offering:
- Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment.
- Wellbeing programs and work-life balance - integration and passion-sharing events.
- Attractive salary and company initiative benefits.
- Courses and conferences.
- Hybrid work culture.
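A hedged sketch of scripted dataset refresh against the Power BI REST API, as referenced in the requirements above. The workspace/dataset IDs and token are placeholders; the refreshes endpoint itself is the documented one:

```python
import requests

# Placeholders: real IDs come from the workspace, the token from Azure AD.
GROUP_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"
TOKEN = "<aad-bearer-token>"

# Trigger an on-demand refresh. Scheduled refreshes are normally configured
# in the Power BI service, but this call is useful at the end of an ETL run.
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"notifyOption": "MailOnFailure"},
    timeout=30,
)
resp.raise_for_status()  # 202 Accepted means the refresh was queued
print("Refresh queued:", resp.status_code)
```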
Posted 5 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About the Role
We are seeking a Senior Backend Engineer with expertise in Java, Spring Boot, API design, security, and data infrastructure. This role combines backend development with strong data engineering skills, including ETL pipelines, search/indexing systems (Elastic or similar), reporting flows, and scaling PostgreSQL databases. You will be responsible for designing and implementing secure, high-performance backend services while also building data capabilities that support analytics and business insights.

Key Responsibilities

Backend Services & Architecture
- Design, develop, and maintain backend services and APIs using Java and Spring Boot.
- Architect solutions for scalability, performance, and reliability in a microservices/cloud environment.

Data Infrastructure & ETL
- Design and implement ETL pipelines to ingest, transform, and serve data for analytics and reporting.
- Work on setting up and managing Elasticsearch (or OpenSearch) clusters for search and analytics.
- Build reporting and data flow pipelines that integrate transactional and analytical data.

Database Performance & Scaling
- Optimize PostgreSQL schemas, queries, and indexes for high-performance data access (a brief sketch follows this posting).
- Plan for horizontal and vertical scaling, partitioning, and caching strategies for large data volumes.
- Monitor and resolve database bottlenecks.

API Design & Data Access
- Build robust, secure, and versioned REST APIs (GraphQL experience is a plus).
- Ensure proper data governance, security, and access control in all backend services.

Security & Best Practices
- Implement strong security practices (Spring Security, OAuth2, JWT).
- Enforce best practices for code quality, CI/CD, and cloud-native deployments.

Collaboration & Mentorship
- Partner with product managers, frontend engineers, and data analysts.
- Mentor junior developers and participate in architecture reviews.

Required Skills and Qualifications

Core Backend Skills
- 5+ years of experience in backend development with Java and Spring Boot.
- Strong understanding of object-oriented programming, design patterns, and microservices.

Data Engineering / Infrastructure Expertise
- Hands-on experience building ETL pipelines for reporting and analytics.
- Experience with Elasticsearch/OpenSearch or similar indexing/search systems.
- Expertise in PostgreSQL performance tuning, indexing, partitioning, and scaling strategies.

API Design & Cloud
- Proficiency in RESTful API design; GraphQL experience preferred.
- Familiarity with containerized deployments (Docker, Kubernetes) and CI/CD.

Security & Performance
- Experience with Spring Security, OAuth2, SSO.
- Knowledge of profiling, monitoring, and optimizing backend systems.

Preferred Qualifications
- Knowledge of distributed data processing systems (Kafka, Spark, Airflow).
- Experience with data warehouses, OLAP tools, or BI/reporting solutions.
- Exposure to cloud-native data services (AWS RDS, Aurora, OpenSearch, etc.).

Education
Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).

What We Offer
- Opportunity to work on mission-critical backend systems and data infrastructure.
- Competitive salary and comprehensive benefits package.
- Collaborative and innovative work environment with modern tools and processes.
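A hedged sketch of the PostgreSQL partitioning and indexing work described above. The table and column names are hypothetical, but the DDL is standard PostgreSQL declarative range partitioning, shown here driven from Python with psycopg2:

```python
import psycopg2

# Placeholder DSN; point this at a real PostgreSQL instance.
conn = psycopg2.connect("dbname=app user=app password=<secret> host=dbhost")
conn.autocommit = True

ddl = """
CREATE TABLE IF NOT EXISTS events (
    id         bigserial,
    user_id    bigint NOT NULL,
    created_at timestamptz NOT NULL,
    payload    jsonb,
    -- The partition key must be part of the primary key.
    PRIMARY KEY (id, created_at)
) PARTITION BY RANGE (created_at);

-- One partition per month keeps indexes small and makes pruning cheap.
CREATE TABLE IF NOT EXISTS events_2025_01
    PARTITION OF events
    FOR VALUES FROM ('2025-01-01') TO ('2025-02-01');

-- Composite index matching the hot query path (per-user, time-ordered).
CREATE INDEX IF NOT EXISTS idx_events_user_time
    ON events (user_id, created_at DESC);
"""

with conn.cursor() as cur:
    cur.execute(ddl)
conn.close()
```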
Posted 5 days ago
2.0 - 6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Associate Data Engineer

What You Will Do
Let’s do this. Let’s change the world. In this vital role we seek a skilled Data Engineer to build and optimize our data infrastructure. As a key contributor, you will collaborate closely with cross-functional teams to design and implement robust data pipelines that efficiently extract, transform, and load data into our AWS-based data lake and data warehouse. Your expertise will be instrumental in empowering data-driven decision making through advanced analytics and predictive modeling.

Roles & Responsibilities:
- Building and optimizing data pipelines, data warehouses, and data lakes on the AWS and Databricks platforms (see the Airflow sketch after this posting).
- Managing and maintaining the AWS and Databricks environments.
- Ensuring data integrity, accuracy, and consistency through rigorous quality checks and monitoring.
- Maintaining system uptime and optimal performance.
- Working closely with cross-functional teams to understand business requirements and translate them into technical solutions.
- Exploring and implementing new tools and technologies to enhance ETL platform performance.

What We Expect Of You
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Bachelor’s degree and 2 to 6 years of relevant experience.

Functional Skills:

Must-Have Skills:
- Proficient in SQL for extracting, transforming, and analyzing complex datasets from both relational and columnar data stores. Proven ability to optimize query performance on big data platforms.
- Proficient in leveraging Python, PySpark, and Airflow to build scalable and efficient data ingestion, transformation, and loading processes.
- Ability to learn new technologies quickly.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.

Good-to-Have Skills:
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with Apache Spark and Apache Airflow.
- Experience with software engineering best practices, including version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
- Experience with AWS, GCP, or Azure cloud services.

What You Can Expect Of Us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way.
In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
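As a hedged illustration of the Python/Airflow pipeline work this posting names: a minimal TaskFlow-style DAG (Airflow 2.x), where the task bodies, schedule, and names are hypothetical placeholders:

```python
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def sales_etl_sketch():
    """Minimal extract -> transform -> load pipeline skeleton."""

    @task
    def extract() -> list[dict]:
        # Placeholder: would pull from an API or an S3 landing zone.
        return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": -5.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Drop invalid rows; real logic might run in PySpark/Databricks.
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: would write to the warehouse (e.g., Redshift/Delta).
        print(f"loading {len(rows)} rows")

    load(transform(extract()))

sales_etl_sketch()
```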
Posted 5 days ago
12.0 - 17.0 years
13 - 18 Lacs
Hyderabad
Work from Office
Career Category: Information Systems

Job Description
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Principal Data Engineer

What you will do
Let’s do this. Let’s change the world.

Role Description: We are seeking a seasoned Principal Data Engineer to lead the design, development, and implementation of our data strategy. The ideal candidate possesses a deep understanding of data engineering principles, coupled with strong leadership and problem-solving skills. As a Principal Data Engineer, you will architect and oversee the development of robust data platforms, while mentoring and guiding a team of data engineers.

Roles & Responsibilities:
- Possesses strong rapid prototyping skills and can quickly translate concepts into working code.
- Provide expert guidance and mentorship to the data engineering team, fostering a culture of innovation and standard methodologies.
- Design, develop, and implement robust data architectures and platforms to support business objectives.
- Oversee the development and optimization of data pipelines and data integration solutions (see the tuning sketch after this posting).
- Establish and maintain data governance policies and standards to ensure data quality, security, and compliance.
- Architect and manage cloud-based data solutions on AWS or other preferred platforms.
- Lead and motivate a strong data engineering team to deliver exceptional results.
- Identify, analyze, and resolve complex data-related challenges.
- Collaborate closely with business collaborators to understand data requirements and translate them into technical solutions.
- Stay abreast of emerging data technologies and explore opportunities for innovation.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Doctorate degree / Master’s degree / Bachelor’s degree and 12 to 17 years of experience in Computer Science, IT, or a related field.
- Demonstrated proficiency in leveraging cloud platforms (AWS, Azure, GCP) for data engineering solutions.
- Strong understanding of cloud architecture principles and cost optimization strategies.
- Proficient in Python, PySpark, and SQL.
- Hands-on experience with big data ETL performance tuning.
- Proven ability to lead and develop strong data engineering teams.
- Strong problem-solving, analytical, and critical thinking skills to address complex data challenges.

Preferred Qualifications:
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with Apache Spark and Apache Airflow.
- Experience with software engineering best practices, including version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
- Experience with AWS, GCP, or Azure cloud services.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way.

In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
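A hedged sketch of the big-data ETL performance tuning this role calls for: broadcast joins and explicit repartitioning in PySpark, with dataset shapes and names as assumptions:

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("etl_tuning_sketch").getOrCreate()

facts = spark.read.parquet("/data/facts/transactions")   # large fact table
dims = spark.read.parquet("/data/dims/products")          # small lookup table

# Broadcasting the small dimension avoids shuffling the large fact table.
enriched = facts.join(broadcast(dims), on="product_id", how="left")

# Repartition by the write key so output files align with downstream reads,
# instead of inheriting whatever partitioning the join produced.
(enriched
    .withColumn("txn_date", F.to_date("txn_ts"))
    .repartition("txn_date")
    .write.mode("overwrite")
    .partitionBy("txn_date")
    .parquet("/data/curated/transactions_enriched"))
```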
Posted 5 days ago
4.0 years
0 Lacs
Gurgaon, Haryana, India
Remote
About This Role

What are Aladdin and Aladdin Engineering?
You will be working on BlackRock's investment operating system called Aladdin, which is used both internally within BlackRock and externally by many financial institutions. Aladdin combines sophisticated risk analytics with comprehensive portfolio management, trading, and operations tools on a single platform. It powers informed decision-making and creates a connective tissue for thousands of users investing worldwide.

Our development teams are part of Aladdin Engineering. We collaborate to build the next generation of technology that transforms the way information, people, and technology intersect for global investment firms. We build and package tools that manage trillions in assets and support millions of financial instruments. We perform risk calculations and process millions of transactions for thousands of users worldwide every day.

Your Team
The Database Hosting Team is a key part of Platform Hosting Services, which operates under the broader Aladdin Engineering group. Hosting Services is responsible for managing the reliability, stability, and performance of the firm's financial systems, including Aladdin, and ensuring its availability to our business partners and customers. We are a globally distributed team, spanning multiple regions, providing engineering and operational support for online transaction processing, data warehousing, data replication, and distributed data processing platforms.

Your Role And Impact
Data is the backbone of any world-class financial institution. The Database Operations Team ensures the resiliency and integrity of that data while providing instantaneous access to a large global user base at BlackRock and across many institutional clients. As specialists in database technology, our team is involved in every aspect of system design, implementation, tuning, and monitoring, using a wide variety of industry-leading database technologies. We also develop code to provide analysis and insights, and to automate our solutions at scale.

Although our specialty is database technology, to excel in our role we must understand the environment in which our technology operates. This includes understanding the business needs, the application server stack, and the interactions between database software, operating systems, and host hardware to deliver the best possible service. We are passionate about performance and innovation. At every level of the firm, we embrace diversity and offer flexibility to enhance work-life balance.

Your Responsibilities
The role involves providing operations, development, and project support within the global database environment across various platforms. Key responsibilities include:

Operational Support for Database Technology:
- Engineering, administration, and operations of OLTP, OLAP, data warehousing platforms, and distributed NoSQL systems.
- Collaboration with infrastructure teams, application developers, and business teams across time zones to deliver high-quality service to Aladdin users.
- Automation and development of database operational, monitoring, and maintenance toolsets to achieve scalability and efficiency (a brief monitoring sketch appears below).
- Database configuration management, capacity and scale management, schema releases, consistency, security, disaster recovery, and audit management.
- Managing operational incidents, conducting root-cause analysis, resolving critical issues, and mitigating future risks.
- Assessing issues for severity, troubleshooting proactively, and ensuring timely resolution of critical system issues.
- Escalating outages when necessary, collaborating with Client Technical Services and other teams, and coordinating with external vendors for support.

Project-Based Participation:
- Involvement in major upgrades and migration/consolidation exercises.
- Exploring and implementing new product features.
- Contributing to performance tuning and engineering activities.

Contributing to Our Software Toolset:
- Enhancing monitoring and maintenance utilities in Perl, Python, and Java.
- Contributing to data captures to enable deeper system analysis.

Qualifications
- B.E./B.Tech/MCA or another relevant engineering degree from a reputable university.
- 4+ years of proven experience in Data Administration or a similar role.

Skills And Experience
- Enthusiasm for acquiring new technical skills.
- Effective communication with senior management from both IT and business areas.
- Understanding of large-scale enterprise application setups across data centers/cloud environments.
- Willingness to work weekends on DBA activities and shift hours.
- Experience with database platforms like SAP Sybase, Microsoft SQL Server, Apache Cassandra, Cosmos DB, PostgreSQL, and data warehouse platforms such as Snowflake and Greenplum.
- Exposure to public cloud platforms such as Microsoft Azure, AWS, and Google Cloud.
- Knowledge of programming languages like Python, Perl, Java, Go; automation tools such as Ansible/AWX; source control systems like Git and Azure DevOps.
- Experience with operating systems like Linux and Windows.
- Strong background in supporting mission-critical applications and performing deep technical analysis.
- Flexibility to work with various technologies and write high-quality code.
- Exposure to project management.
- Passion for interactive troubleshooting, operational support, and innovation.
- Creativity and a drive to learn new technologies.
- Data-driven problem-solving skills and a desire to scale technology for future needs.

Operating Systems:
- Familiarity with Linux/Windows.
- Proficiency with shell commands (grep, find, sed, awk, ls, cp, netstat, etc.).
- Experience checking system performance metrics like CPU, memory, and disk usage on Unix/Linux.

Other Personal Characteristics
- Integrity and the highest ethical standards.
- Ability to quickly adjust to complex data and information, displaying strong learning agility.
- Self-starter with a commitment to superior performance.
- Natural curiosity and a desire to always learn.

If this excites you, we would love to discuss your potential role on our team!

Our Benefits
To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model
BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.
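A hedged sketch of the kind of monitoring utility this team builds: a small Python check for long-running PostgreSQL queries via the standard pg_stat_activity view. The DSN and the five-minute threshold are placeholders:

```python
import psycopg2

# Placeholder DSN; a real utility would read this from secure config.
conn = psycopg2.connect("dbname=appdb user=monitor host=dbhost")

LONG_RUNNING_SQL = """
    SELECT pid,
           now() - query_start AS runtime,
           left(query, 80)     AS query_head
      FROM pg_stat_activity
     WHERE state = 'active'
       AND now() - query_start > interval '5 minutes'
     ORDER BY runtime DESC;
"""

with conn.cursor() as cur:
    cur.execute(LONG_RUNNING_SQL)
    for pid, runtime, query_head in cur.fetchall():
        # A production tool would page or alert; printing keeps the sketch small.
        print(f"pid={pid} runtime={runtime} query={query_head!r}")

conn.close()
```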
About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
Posted 5 days ago