69 Azure Blob Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 13.0 years

9 - 14 Lacs

noida, bengaluru

Work from Office

Overview: As a Backend Developer/Architect, you will participate in estimating, technical design, implementation, documentation, testing, deployment, and support of applications developed for our clients. Working in a team environment, you will collaborate with solution architects to translate written business requirements into technical design and code.

Scope:
- Core responsibilities include building backend REST API services based on Spring Boot, deployed in a SaaS environment.
- The dev team currently comprises 10+ global associates across the US and India (COE).

Our current technical environment:
- Software: Spring Boot microservices, portal components, Azure SQL, Spock/Groovy
- Application architecture: services deployed on Azure
- Frameworks/others: Kafka, GitHub, CI/CD, Java, J2EE, Docker, Kubernetes; experience with SaaS

What you'll do:
- Develop REST APIs in a microservices architecture (Spring Boot) deployed on Microsoft's Azure platform. The architecture includes ReactJS and JavaScript/TypeScript (UI), Spring Boot (backend), Azure SQL, Azure Blob, Azure Logic Apps, the portal, and supply chain planning software.
- Be a senior member of a highly skilled team seeking systematic approaches to improve engineering productivity, efficiency, effectiveness, and quality.
- Support the existing customer base with new enhancements and defect fixes.
- Create technical documentation.
- Provide early visibility into technical challenges and mitigate them throughout the journey; confidently represent the product and portfolio.

What we are looking for:
- Bachelor's degree (STEM preferred) and a minimum of 8 years of experience in software development; ideally a candidate who started as a Software Engineer and progressed to Lead Software Engineer.
- Strong experience in programming and problem solving.
- Hands-on development skills along with design experience; should not have moved away from software development.
- Experience building products with an API-first approach in a SaaS environment.

Required skills: Java, Spring Boot, SQL
Preferred skills: Knowledge of public clouds (Azure, AWS, etc.), Spring Cloud, Docker, Kubernetes; experience in the supply chain domain is a plus; good understanding of secure architectures, secure configuration, identity management, role-based access control, authentication and authorization, and data encryption.
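The stack above pairs Spring Boot services with Azure Blob for file storage. As a rough illustration of the storage piece (not this employer's code, and in Python rather than the posting's Java, for brevity), here is a minimal upload/download sketch with the azure-storage-blob SDK; the container name, blob path, and environment variable are assumptions.

```python
# Minimal sketch: uploading and reading a file via Azure Blob Storage using
# the azure-storage-blob SDK. Container/blob names are illustrative.
import os
from azure.storage.blob import BlobServiceClient

# Assumes AZURE_STORAGE_CONNECTION_STRING is set in the environment.
service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
container = service.get_container_client("reports")

# Upload a local file as a block blob, overwriting any existing version.
with open("plan_summary.json", "rb") as data:
    container.upload_blob(name="2024/plan_summary.json", data=data, overwrite=True)

# Download it back and decode.
payload = container.download_blob("2024/plan_summary.json").readall()
print(payload.decode("utf-8"))
```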

Posted 4 days ago

Apply

12.0 - 14.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Location: Indore, Pune, Bangalore, Noida & Gurgaon (immediate joiners preferred)
- Overall 12+ years of experience.
- Experience and ability to lead end-to-end delivery and produce results: requirement gathering, working with cross-functional teams and offshore/onshore setups, and delivery planning.
- Excellent communication, presentation, stakeholder management, tracking, and delivery skills.
- Ability to coordinate with onshore and offshore cross-functional teams across India and US time zones to deliver concurrent projects.
- Experience with SDLC processes such as Agile (Scrum/Kanban) and Waterfall.
- Experience working within complex client ecosystems on large projects.
- Experience in data engineering as a Lead/Architect.
- Experience in SQL, Python, PySpark, Azure Data Services (Azure Data Factory, Azure Blob, Azure Data Lake, Key Vault, etc.), Databricks, and data warehousing/modelling.
- Quick learner, self-starter, motivated, result-oriented, and able to work with ownership.
- Good to have: experience in Databricks, including Medallion Architecture, DLT, and Unity Catalog.
- Good to have: experience with Snowflake.
- Understanding of ETL tools (Matillion, DataStage, etc.) and data warehousing concepts.
- Integration experience with data sources such as REST web services, Oracle, SAP HANA, Salesforce, etc.
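The posting above centers on PySpark pipelines over Azure data services, including medallion-style Databricks work. As a rough sketch of what a bronze-to-silver step can look like (storage account, paths, and columns are invented for illustration):

```python
# Minimal sketch: a bronze-to-silver PySpark transformation, as used in a
# medallion-style lakehouse. Storage paths and column names are illustrative.
# (Reading abfss:// paths additionally requires storage auth configuration.)
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Read raw landed data (bronze layer).
bronze = spark.read.json("abfss://bronze@examplelake.dfs.core.windows.net/orders/")

# Clean and conform (silver layer): de-duplicate, cast types, drop bad rows.
silver = (
    bronze.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount") > 0)
)

silver.write.mode("overwrite").parquet(
    "abfss://silver@examplelake.dfs.core.windows.net/orders/"
)
```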

Posted 6 days ago

Apply

0.0 - 4.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Software Developer intern at URE Legal Advocates, you will be responsible for designing, developing, and deploying a cross-platform employee monitoring tool with screenshot capture, activity tracking, and timesheet monitoring for Windows and macOS. You must have a strong foundation in computer science/software engineering, with expertise in algorithms, system design, distributed computing, operating systems, and secure data handling. Knowledge of DPDP Act, 2023 compliance and global privacy frameworks such as GDPR and HIPAA is mandatory.

Your key responsibilities will include:
- Building cross-platform desktop applications for Windows and macOS.
- Implementing automatic screenshot capture, timesheet monitoring, idle detection, and productivity logging.
- Designing modular, scalable, and extensible code architecture.
- Creating Windows installers (MSI/EXE) and Mac installers (DMG).
- Ensuring data protection compliance, including consent-based data collection, encryption, and user controls for data access, correction, and erasure.
- Integrating with cloud APIs for secure storage and backup.
- Conducting testing, maintenance, and optimization for performance and memory consumption.
- Delivering fully functional software, a real-time monitoring dashboard, compliance architecture, documentation, and regular software updates.

Your core technical skills should include proficiency in programming languages such as C, C++, C#, Python, Java, Swift, and Objective-C, plus cross-platform development frameworks, installer creation tools, system-level programming, database management, cloud services, security, compliance, DevOps, and deployment. You should also demonstrate mastery in areas like data structures & algorithms, operating systems, database systems, computer networks, software engineering principles, distributed systems, compiler design, cybersecurity, and privacy laws.

Preferred qualifications include a degree in computer science/software engineering from a reputable university, previous experience with monitoring software, and publications/projects related to secure systems or distributed computing.

Join us at URE Legal Advocates and be part of a dynamic team working on cutting-edge software solutions for employee monitoring and timesheet management.
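One building block of such a tool is periodic, consent-gated screenshot capture. The sketch below is a minimal illustration using Pillow's ImageGrab (which supports both Windows and macOS); the interval, directory, and consent flag stand in for a real consent and configuration flow, and this is not the company's implementation.

```python
# Minimal sketch: periodic screenshot capture gated on user consent.
# Paths, interval, and the consent flag are illustrative placeholders.
import time
from datetime import datetime, timezone
from pathlib import Path

from PIL import ImageGrab  # works on Windows and macOS

CAPTURE_DIR = Path("captures")
CAPTURE_INTERVAL_SECONDS = 300  # every 5 minutes
USER_HAS_CONSENTED = True  # must come from a real consent flow (DPDP/GDPR)

def capture_once() -> Path:
    """Grab the full screen and save it with a UTC timestamp."""
    CAPTURE_DIR.mkdir(exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    path = CAPTURE_DIR / f"screen_{stamp}.png"
    ImageGrab.grab().save(path)
    return path

if __name__ == "__main__":
    while USER_HAS_CONSENTED:
        print("saved", capture_once())
        time.sleep(CAPTURE_INTERVAL_SECONDS)
```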

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

vadodara, gujarat

On-site

You will be responsible for creating blueprints for data flow, storage, and integration that ensure scalability, security, and efficiency. Your focus will be the overall structure and strategy for data systems: designing the architecture for data warehouses, lakes, and integration platforms; defining data governance, security, and compliance standards; and creating strategies for data management and scalability. You will provide input into data models and schemas, and work closely with business stakeholders, data engineers, and analysts to define requirements and align the data strategy with business goals. Your outputs will include design documents, architecture diagrams, and data governance policies.

To be considered for this role, you must have 8+ years of experience developing and managing Microsoft data solutions, with at least 2 years designing and architecting data solutions on Microsoft Azure. You should have expertise in database design and data modeling (e.g., star schema, snowflake schema), and hands-on experience with Azure data services: Power BI for visualization, Azure Synapse (or AAS) for analytics, Azure Data Factory for ETL/pipelines, and Azure Data Lake Storage Gen1/Gen2 and Azure Blob for storage and warehousing. Knowledge of Microsoft Fabric (or Azure Synapse) and its components, such as notebooks, shortcuts, the data lake, and data warehouse administration, is mandatory, as is proficiency in conceptual and logical design tools (e.g., Lucidchart).

Desired qualifications include a good understanding of programming languages such as Python or C#, knowledge of governance frameworks (Microsoft Purview) and cloud architecture (Azure cloud solutions), and Microsoft Fabric certification DP-600 or DP-700. Any other relevant Azure data certification is an advantage, along with Microsoft certification as an Azure Enterprise Data Architect or in Azure Data Fundamentals.
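Star and snowflake schemas are central to this role. Purely as a runnable illustration of the star shape (one fact table keyed to surrounding dimensions), here is a sketch that uses DuckDB as a convenient in-process engine; the same DDL shape carries over to Synapse or Fabric warehouses, and all table and column names are invented.

```python
# Minimal sketch of a star schema: one fact table surrounded by dimensions.
# DuckDB is used only to keep the example self-contained and runnable.
import duckdb

con = duckdb.connect()  # in-memory database

con.execute("""
    CREATE TABLE dim_date (
        date_key INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240115
        full_date DATE,
        month INTEGER,
        year INTEGER
    )
""")
con.execute("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        product_name VARCHAR,
        category VARCHAR
    )
""")
con.execute("""
    CREATE TABLE fact_sales (
        date_key INTEGER,     -- references dim_date.date_key
        product_key INTEGER,  -- references dim_product.product_key
        units INTEGER,
        revenue DECIMAL(12, 2)
    )
""")
print(con.execute("SHOW TABLES").fetchall())
```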

Posted 1 week ago

Apply

8.0 - 10.0 years

25 - 35 Lacs

bengaluru

Work from Office

Job responsibilities:
- Lead end-to-end development of ML/DL models, from data preprocessing to model deployment.
- Design and implement advanced solutions using computer vision, NLP, and generative AI models (e.g., Transformers, GANs).
- Apply and experiment with agentic AI approaches to build autonomous decision-making systems.
- Collaborate with engineering, product, and business stakeholders to align AI solutions with business outcomes.
- Work with large-scale datasets and implement MLOps pipelines for automated training, evaluation, and deployment on cloud (Azure preferred).
- Stay up to date with the latest AI research and apply state-of-the-art techniques in production systems.
- Mentor junior data scientists and contribute to AI knowledge sharing across teams.

Desired skills:
- 8+ years of industry experience in data science, ML, or AI, with demonstrated project ownership.
- Proficiency in Python and frameworks like TensorFlow, PyTorch, OpenCV, scikit-learn, etc.
- Strong background in generative models (e.g., LLMs, GANs, VAEs) and foundational ML techniques.
- Expertise in computer vision, including object detection, image classification, and segmentation.
- Experience implementing agentic AI systems using modern orchestration tools and frameworks.
- Hands-on experience with Azure AI services (e.g., Azure ML, Cognitive Services, Azure OpenAI, Azure Blob for data handling).
- Solid understanding of MLOps practices (CI/CD, version control, model registry, monitoring).
- Strong analytical, problem-solving, and communication skills.

Value add: should have worked as a team leader; purely data science profile with ML, GenAI, and Python experience; good experience in GenAI and ML.

Qualification: any tech degree, or a master's.

Other details: candidates should report to the office for the first 6 months; afterwards the role can move to a hybrid work mode. Interview process: one face-to-face round is a must, plus two technical rounds and one project-delivery round. Project delivery experience is a must: candidates should know project deliverables and have at least three real-time project deliveries. Notice period: immediate to 15 days.
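Before the deep-learning and GenAI layers, roles like this rest on foundational ML workflow mechanics: split, fit, evaluate. A minimal, self-contained sketch with scikit-learn (synthetic data, illustrative only):

```python
# Minimal sketch: a foundational ML workflow (train -> evaluate), the kind of
# baseline that deep-learning and GenAI work builds on. Data is synthetic.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```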

Posted 1 week ago

Apply

11.0 - 16.0 years

22 - 37 Lacs

chennai

Hybrid

Primary responsibilities:
- Be curious and quick to learn the technologies the project needs; explore and suggest new ideas and technologies for the project.
- Understand feature requirements from the product team and come up with technical solutions that account for scalability, extensibility, security, cost, etc.
- Produce a plan that breaks each feature down into workable stories.
- Work closely with the product and development teams.
- Deliver features of quality, with a low defect rate.
- Work collaboratively with the team and help others with technical issues or functional questions.
- Own a feature, run it through with the team, and deliver on agreed timelines.
- Be very solid at analyzing problems and providing complete technical solutions.
- Perform good code reviews.
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The company may adopt, vary, or rescind these policies and directives at its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required qualifications:
- Bachelor's degree in Computer Science or a related computer discipline.
- Solid knowledge, with 7+ years of experience, of Postgres, Azure Cloud, and Kafka/Pulsar.
- 7+ years of experience on the Azure cloud platform: Azure Data Factory, Azure Databricks.
- Experience in SQL programming and SQL databases.
- Solid problem-solving capability.
- Programming in Scala (or Python).
- Fair at architecting/designing feature implementations and data integration mechanisms (sync/async, batch/stream), HLD/LLD.
- Fair understanding of a front-end framework such as React.

Must-have skills: Azure data engineering - Azure Data Factory, Azure Blob, ADLS, Azure Databricks (preferably Scala), SQL programming knowledge.
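Kafka features prominently in this stack. As a hedged illustration of the produce/consume loop (shown with the kafka-python client, though the team also works in Scala; broker address and topic name are assumptions):

```python
# Minimal sketch: producing to and consuming from a Kafka topic with the
# kafka-python client. Broker address and topic name are illustrative.
import json

from kafka import KafkaConsumer, KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # assumption
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("orders", {"order_id": 1, "amount": 10.0})
producer.flush()  # block until the message is actually sent

consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating when idle, for the demo
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.topic, message.value)
```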

Posted 1 week ago

Apply

2.0 - 4.0 years

10 - 15 Lacs

pune

Hybrid

So, what’s the role all about? A Software Engineer in Data Platforms, Data Lake and Snowflake holds a prominent position in software development teams, responsible for designing, developing, and implementing complex data solutions that leverage both frontend and backend technologies. Here's an overview of the key responsibilities and qualifications for this role.

How will you make an impact?
- Maintain quality, ensure responsiveness, and help optimize new and existing systems.
- Collaborate with the rest of the engineering team to design and build new features on time and to budget.
- Maintain code integrity and organization.
- Understand and implement security and data protection.
- Understand the business change cycle from inception to implementation, including the organization of change initiatives.
- Coordinate build and release activities with key stakeholders.

Have you got what it takes?
- Bachelor's degree in Computer Science, Business Information Systems, or a related field, or equivalent work experience.
- 3+ years of experience in software development.
- Hands-on experience with the Snowflake Cloud Data Platform (minimum 2 years); Snowflake certifications are preferred.
- Strong experience in SQL.
- Hands-on experience with Python or Go.
- Understanding of Snowflake administration (roles, RBAC, warehouses).
- Experience with Snowflake's advanced utilities: Time Travel and Fail-safe, resource monitors, alerts, and cost optimization.
- Knowledge of Snowflake data governance and security policies.
- Experience building Airflow DAGs and managing job dependencies.
- Experience with AWS technology, including S3, SQS, Lambda, Firehose, IAM, CloudWatch, etc.
- Excellent knowledge of Airflow.
- Knowledge of Snowflake cloud storage integration (AWS S3 / GCS / Azure Blob).
- Experience with CI/CD: Git, GitHub Actions, and Jenkins-based pipeline deployments.
- Working knowledge of unit testing.

What's in it for you? Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr!

Enjoy NiCE-FLEX! At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 8574
Reporting into: Tech Manager
Role type: Individual Contributor
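Building Airflow DAGs and managing job dependencies is called out explicitly above. A minimal sketch of a two-task DAG with a load-then-transform dependency (task bodies are stubs; a real pipeline would use a Snowflake connection or operator, and all names are illustrative):

```python
# Minimal sketch: a two-task Airflow DAG with a load -> transform dependency.
# Uses the Airflow 2.4+ `schedule` argument; names and schedule are invented.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def load_to_snowflake():
    print("would run: COPY INTO raw.events FROM @events_stage")

def transform():
    print("would run: INSERT INTO analytics.daily_events SELECT ... FROM raw.events")

with DAG(
    dag_id="events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load = PythonOperator(task_id="load", python_callable=load_to_snowflake)
    xform = PythonOperator(task_id="transform", python_callable=transform)
    load >> xform  # dependency: load must succeed before transform runs
```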

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

hyderabad

Work from Office

About the role
Project role: Data Engineer
Project role description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Python (programming language)
Good-to-have skills: MySQL, data engineering, Kubernetes, DuckDB, Git
Minimum 5 years of experience is required.
Educational qualification: 15 years of full-time education.

Summary: As a Python Data Engineer, you will be responsible for designing, developing, and maintaining Python-based data pipelines, applications, and services. You will collaborate with cross-functional teams to deliver high-quality software solutions that meet the needs of our business divisions.

Roles & responsibilities:
- Design, develop, and maintain robust data pipelines using Python.
- Implement ETL (extract, transform, load) processes to ingest data from various sources.
- Optimize and manage data storage solutions, ensuring data integrity and performance.
- Collaborate with other engineers and analysts to understand data requirements and deliver solutions.
- Monitor and troubleshoot data pipeline issues, ensuring timely resolution.
- Develop and maintain documentation for data engineering processes and systems.
- Ensure data security and compliance with relevant regulations and standards.
- Stay updated with the latest industry trends and technologies in data engineering.

Professional & technical skills:
- Proven experience as a Data Engineer, with a focus on Python.
- Strong proficiency in Python programming and relevant libraries (e.g., Pandas, NumPy).
- Experience with SQL and database management systems (e.g., PostgreSQL, MySQL).
- Familiarity with data pipeline orchestration tools such as Dagster.
- Experience with analytical databases like DuckDB.
- Knowledge of cloud storage solutions such as Amazon S3 or Azure Blob Storage.
- Familiarity with big data technologies (e.g., Hadoop, Spark) is a plus.
- Knowledge of cloud platforms (e.g., AWS, Azure, GCP) and related services.
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Familiarity with data visualization tools (e.g., Tableau, Power BI).
- Understanding of machine learning concepts and frameworks.
- Knowledge of version control systems (e.g., Git).

Additional information: the candidate should have a minimum of 6.5 years of experience in Python programming, and 15 years of full-time education is required.
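A tiny, runnable taste of the pandas-plus-DuckDB combination named above: extract, transform with SQL over a DataFrame, load to Parquet. Data, names, and the Parquet output (which needs pyarrow installed) are illustrative.

```python
# Minimal sketch: a tiny extract -> transform -> load step with pandas and
# DuckDB. DuckDB can query in-scope pandas DataFrames by variable name.
import duckdb
import pandas as pd

# Extract: stand-in for a source system pull.
orders = pd.DataFrame(
    {"order_id": [1, 2, 2, 3], "amount": [10.0, 15.5, 15.5, -1.0]}
)

# Transform: de-duplicate and drop invalid rows with SQL over the DataFrame.
clean = duckdb.sql(
    "SELECT DISTINCT order_id, amount FROM orders WHERE amount > 0"
).df()

# Load: write curated output for downstream consumers.
clean.to_parquet("orders_clean.parquet")
print(clean)
```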

Posted 1 week ago

Apply

4.0 - 9.0 years

7 - 11 Lacs

bengaluru

Work from Office

Seeking an experienced software engineer to develop highly visible and widely used tools, drivers, and data integration products in the AstraDB, Apache Cassandra, and DataStax Enterprise ecosystem. In this role, you will join a small team of talented engineers developing products that help developers succeed in efficiently delivering business value in high-scale applications. You will take ownership of projects and must be willing to jump in to help customers.

What you will do:
- Participate in the design and development of connectivity and data integration projects, including CQL drivers and analytics libraries for data pipelines (such as Cassandra Analytics).
- Integration work involving AstraDB, DSE, Apache Cassandra™, and other big data technologies, such as Apache Spark.
- Resolve challenging and diverse customer issues.
- Foster the development community surrounding both proprietary and open source products.
- As a team, own both development and quality aspects for all products.
- Research and implement improvements to the product as well as the development infrastructure.
- Perform regular code reviews.
- Maintain product documentation, white papers, and educational materials.
- Write technical blog posts relating to our technologies.
- Pursue ongoing education around relevant technologies.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- 4+ years of experience in software engineering.
- Deep expertise in at least one high-level, multi-paradigm language, with a preference for Java, Go, and/or Python.
- Experience with any distributed database (such as DataStax Enterprise, Apache Cassandra, Amazon DynamoDB, Azure Cosmos DB, or other cloud-native database products).
- Command of asynchronous and concurrent programming concepts.
- Knowledge of high-performance, low-latency network programming.
- Comfortable working with and contributing to open-source projects.

Preferred technical and professional experience:
- Familiarity with creating and running containerized applications.
- Experience with cloud environments for build and deployment.
- Experience with large-scale data processing pipelines, such as Apache Spark.
- Familiarity with cloud-native technologies (e.g., Kubernetes) and object storage systems (e.g., AWS S3, GCP GCS, Azure Blob).
- Effective technical experience with Git and GitHub.
- Experience with Jenkins, GitHub Actions, or other CI/CD systems.
- Proven ability to collaborate well in a globally distributed team environment.
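For a sense of the CQL driver surface this team builds and supports, here is a minimal sketch with the DataStax Python driver, including a prepared statement and an asynchronous query; contact points, keyspace, and table are illustrative, and this is not DataStax's own code.

```python
# Minimal sketch: connecting to Cassandra with the DataStax Python driver
# and issuing CQL. Contact points, keyspace, and table are illustrative.
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])  # contact points are an assumption
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS demo
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
session.set_keyspace("demo")
session.execute(
    "CREATE TABLE IF NOT EXISTS users (id uuid PRIMARY KEY, name text)"
)

# Prepared statements are the idiomatic path for hot queries.
insert = session.prepare("INSERT INTO users (id, name) VALUES (uuid(), ?)")
session.execute(insert, ["ada"])

# Asynchronous execution: returns a future instead of blocking.
future = session.execute_async("SELECT name FROM users")
for row in future.result():
    print(row.name)

cluster.shutdown()
```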

Posted 1 week ago

Apply

10.0 - 15.0 years

25 - 40 Lacs

new delhi, gurugram, delhi / ncr

Work from Office

Cloud Architect with 12+ years of IT experience, including 8+ years in AWS and Azure. Expertise in EC2, S3, Lambda, Azure VMs, Blob Storage, cloud security, compliance, and scalable solutions; certified in AWS/Azure architecture. Proven experience as a Cloud Architect.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

14 - 18 Lacs

kochi

Work from Office

As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Total experience of 6-7 years (4-5 years relevant).
- Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob.
- Ability to use programming languages like Java, Python, Scala, etc., to build pipelines that extract and transform data from a repository to a data consumer.
- Ability to use extract, transform, and load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.
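As a rough illustration of the enterprise search work mentioned above, a minimal index-and-query sketch with the Elasticsearch Python client (8.x API); the endpoint, index, and documents are invented.

```python
# Minimal sketch: indexing and searching documents with the Elasticsearch
# Python client (8.x API). Host, index name, and documents are illustrative.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # endpoint is an assumption

es.index(index="incidents", id="1",
         document={"title": "ETL job failed", "severity": "high"})
es.indices.refresh(index="incidents")  # make the doc visible to search

hits = es.search(index="incidents",
                 query={"match": {"title": "etl"}})
for hit in hits["hits"]["hits"]:
    print(hit["_source"]["title"])
```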

Posted 2 weeks ago

Apply

3.0 - 6.0 years

14 - 18 Lacs

bengaluru

Work from Office

As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Total experience of 3-6 years (4-5 years relevant).
- Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob.
- Ability to use programming languages like Java, Python, Scala, etc., to build pipelines that extract and transform data from a repository to a data consumer.
- Ability to use extract, transform, and load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.

Posted 2 weeks ago

Apply

9.0 - 11.0 years

20 - 25 Lacs

pune

Work from Office

Critical skills to possess:
- Web app backend: ASP.NET MVC (strong), C# (strong), integration with Azure MFA authentication.
- Database: SQL Server (strong).
- UI: JavaScript, jQuery, Medtronic UI, Bootstrap, HTML/CSS, Drawflow (plugin), custom jQuery grids.
- File storage: Azure Blob Storage.
- Debugging skills to check Azure App Service logs and database logs, and the ability to write ad hoc SQL queries to find and fix data issues.
- Strong skills in integrating with external token-based APIs; for example, Azure AD and Velocity APIs are used.
- Azure API integration: Graph APIs to add/remove users from an Azure AD group, to get user details from AD after login, and to get the logged-in user's AD groups.
- Function Apps: creating and managing Azure Functions; understanding how to view logs and debug function calls/runs; HTTP-trigger, Service Bus-trigger, and timer-trigger functions.
- VS Code IDE: comfortable developing, deploying, and debugging Azure Functions from VS Code.
- Version control: comfortable with Azure DevOps and Git for managing the codebase and code branches.
- Deployment/Azure components: Azure App Service, Azure SQL Database, Function Apps, Azure Service Bus, Event Grid, DevOps; knowledge of using repos, branches, and pull requests; able to create Azure CI/CD pipelines and use them for deploying code.
- Code versioning tools: Azure DevOps, Git, Tortoise Git.

Preferred qualifications: BS degree in Computer Science or Engineering, or equivalent experience.

Note: this role is for a 6-month contract, and we are looking for immediate joiners only. The roles and responsibilities mirror the critical skills listed above.
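The posting is .NET-centric, but the trigger shape it describes is easy to sketch. Below is a minimal HTTP-triggered Azure Function in Python's v2 programming model, purely to illustrate how a route and handler pair up; names are invented, and this is not the team's codebase.

```python
# Minimal sketch: HTTP-triggered Azure Function (Python v2 programming model).
# In a real Function App this lives in function_app.py with host.json alongside.
import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)

@app.route(route="hello")  # served at /api/hello
def hello(req: func.HttpRequest) -> func.HttpResponse:
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!")
```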

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

haryana

On-site

As a Microsoft Dynamics 365 F&O Technical Consultant, you should have 8-10 years of experience in MS Dynamics ERP, with expertise in design, development, and infrastructure. Your technical and functional skills should include leading technical work streams in at least two full-lifecycle D365 Finance & SCM implementations, understanding of LCS and Azure DevOps, and proficiency in SQL, SSRS, Power BI, C#, and the .NET framework. You should also have knowledge of end-to-end D365 Finance & SCM implementation, the Data Management Framework (DIXF), integration experience in the Finance and SCM modules, and experience working with Dynamics AX 2009, AX 2012, and Dynamics 365. Experience handling integration-related performance issues and knowledge of PowerApps, Logic Apps, Common Data Service, and other Azure services are also required.

Your responsibilities will involve upgrading AX 2009 ERP to the D365 platform, bringing other ERPs onto the D365 (cloud) platform, and daily tasks such as development, writing technical specifications, unit testing, bug fixing, and guiding junior technical team members. Additionally, you should be skilled in the various MS Dynamics integration methods, tools, and technologies, and experienced in communicating and collaborating with overseas teams. Generic managerial skills include effective communication with international clients and managing a team of functional finance consultants.

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

karnataka

On-site

Qualcomm India Private Limited is seeking a Software Engineer with strong experience in Python, Java, or Go. You should have a solid understanding of RESTful APIs, gRPC, and GraphQL, along with experience in cloud technologies such as AWS, Azure, or GCP, as well as on-premises infrastructure. Familiarity with BLOB storage such as S3 or Azure Blob is required, and knowledge of cloud-native technologies like API gateways and service meshes is a plus. An understanding of SQL and NoSQL databases is essential, and experience with IaC tools like CloudFormation and Terraform is preferred. You should also know security best practices and possess good problem-solving and debugging skills. Experience with CI/CD systems like Jenkins, GitOps, Chef, and/or Ansible is highly valued.

Minimum qualifications include a Bachelor's, Master's, or PhD degree in Engineering, Information Systems, Computer Science, or a related field, along with 2+ years of software engineering experience for Bachelor's degree holders, 1+ year for Master's degree holders, or relevant academic/work experience for PhD holders. You should have 2+ years of experience with programming languages such as C, C++, Java, Python, etc. You will work closely with team leads to understand use cases and requirements. Building proof-of-concepts and deploying, managing, and supporting scalable microservices on the cloud and on-premises are key responsibilities, as is supporting users of the applications. The education requirement for this position is a B.E/M.E degree.

Qualcomm is an equal opportunity employer and is committed to providing accessible processes for individuals with disabilities. If you need an accommodation during the application/hiring process, you can contact Qualcomm for support. Qualcomm expects its employees to adhere to all applicable policies and procedures, including security requirements for protecting company confidential information. Staffing and recruiting agencies are advised that Qualcomm's Careers Site is only for individuals seeking a job at Qualcomm, and unsolicited resumes or applications will not be accepted. For further information about this role, please reach out to Qualcomm Careers.
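As a hedged illustration of the RESTful microservice work described (FastAPI is chosen arbitrarily here; the posting only asks for Python, Java, or Go), a minimal resource endpoint with an in-memory store:

```python
# Minimal sketch: a RESTful resource endpoint. Names and the in-memory store
# are invented for illustration. Run with: uvicorn app:app (module name assumed).
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Device(BaseModel):
    id: str
    model: str

DEVICES: dict[str, Device] = {}  # in-memory store, illustration only

@app.put("/devices/{device_id}")
def upsert_device(device_id: str, device: Device) -> Device:
    DEVICES[device_id] = device
    return device

@app.get("/devices/{device_id}")
def get_device(device_id: str) -> Device:
    if device_id not in DEVICES:
        raise HTTPException(status_code=404, detail="device not found")
    return DEVICES[device_id]
```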

Posted 2 weeks ago

Apply

3.0 - 6.0 years

14 - 18 Lacs

kochi

Work from Office

As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Total experience of 6-7 years (4-5 years relevant).
- Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob.
- Ability to use programming languages like Java, Python, Scala, etc., to build pipelines that extract and transform data from a repository to a data consumer.
- Ability to use extract, transform, and load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.

Posted 2 weeks ago

Apply

6.0 - 7.0 years

14 - 18 Lacs

kochi

Work from Office

As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Total experience of 6-7 years (4-5 years relevant).
- Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob.
- Ability to use programming languages like Java, Python, Scala, etc., to build pipelines that extract and transform data from a repository to a data consumer.
- Ability to use extract, transform, and load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.

Posted 2 weeks ago

Apply


6.0 - 10.0 years

11 - 16 Lacs

mumbai, pune, chennai

Work from Office

Database development & management:
- Strong experience in SQL development and query optimization.
- Hands-on experience with Snowflake (preferred), Oracle, or any relational database.
- Understanding of database indexing, partitioning, and performance tuning.

ETL/ELT development:
- Expertise in ETL/ELT development, preferably using Talend (or any other ETL tool).
- Strong proficiency in advanced SQL scripting for data transformation and processing.
- Ability to design, develop, and optimize data pipelines for structured/unstructured data.
- Experience with error handling, logging, and recovery mechanisms for ETL processes.

Data warehousing & modeling:
- Understanding of data warehousing concepts, including star schema and dimension modeling.
- Hands-on experience with Snowflake (preferred) or other cloud/on-prem data warehouses.
- Ability to design and maintain fact and dimension tables for analytics and reporting.
- Knowledge of data partitioning and performance tuning techniques in DWH environments.

CI/CD & version control (good to have, not core focus):
- Experience using Git for version control of ETL scripts and database objects.
- Exposure to CI/CD tools like TeamCity (preferred), Jenkins, or Azure DevOps for ETL deployment automation.
- Understanding of branching strategies, merging, and automated deployments for ETL processes.
- Familiarity with scheduled job execution and monitoring via CI/CD tools.

Cloud exposure (good to have, not core focus):
- Basic familiarity with Azure or AWS cloud environments.
- Understanding of Snowflake on Azure/AWS or Redshift on AWS (data storage, querying, and schema management).
- Exposure to cloud storage solutions (Azure Blob, AWS S3) for data ingestion and staging.
- Awareness of cloud-based ETL services (Azure Data Factory, AWS Glue) preferred but not required.
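A minimal sketch of driving a Snowflake transformation from Python with snowflake-connector-python, the kind of ELT step described above; account, credentials, and object names are placeholders, and a real pipeline would pull secrets from a vault.

```python
# Minimal sketch: running an ELT transformation against Snowflake with
# snowflake-connector-python. All identifiers and credentials are illustrative.
import snowflake.connector

con = snowflake.connector.connect(
    account="my_account",  # assumption
    user="etl_user",       # assumption
    password="***",        # use a secrets manager in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = con.cursor()
    cur.execute("""
        INSERT INTO fact_sales (date_key, product_key, units)
        SELECT date_key, product_key, SUM(units)
        FROM staging_sales
        GROUP BY date_key, product_key
    """)
    print("rows inserted:", cur.rowcount)
finally:
    con.close()
```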

Posted 2 weeks ago

Apply

6.0 - 10.0 years

18 - 30 Lacs

chennai, bengaluru

Hybrid

We are seeking a highly skilled Azure Cloud Platform Engineer with strong expertise in Terraform and Azure cloud services to join our team. The candidate will be responsible for designing, developing, maintaining, and releasing Terraform modules and landing zones for Azure environments, including platform landing zones, application landing zones, and application infrastructure deployments. The role demands ownership of the entire lifecycle of these modules, from development, bug fixing, and feature integration to testing and release management.

Key responsibilities:
- Develop, maintain, and release Terraform modules, landing zones, and infrastructure patterns tailored for Azure cloud environments.
- Provide ongoing lifecycle support for deployed patterns, including troubleshooting, bug fixing, and applying maintenance updates.
- Integrate new features into existing Terraform modules to enhance capabilities and meet evolving platform requirements.
- Ensure timely release of updated and new patterns aligned with project and enterprise standards.
- Update and maintain automated test cases for Terraform modules and landing zones to guarantee reliability, security, and performance.
- Collaborate with cross-functional teams, including DevOps, cloud architects, and developers, to implement best practices in infrastructure as code (IaC).
- Support Azure cloud migrations and infrastructure deployments by creating reusable and scalable Terraform patterns.
- Monitor and optimize Azure cloud services, including compute, storage, and networking resources.
- Maintain documentation for Terraform modules, deployment patterns, and infrastructure standards.

Qualifications & skills:
- Experience: minimum 6 years in cloud engineering, with at least 3 years focused on Azure cloud migration and infrastructure development.
- Strong hands-on experience with Terraform, including writing, debugging, and managing infrastructure-as-code modules.
- Deep knowledge of Azure cloud services, including compute (Azure Virtual Machines, Azure Kubernetes Service), storage (Azure Blob Storage, Azure Managed Disks, Azure Files), networking and security (Azure Traffic Manager, Azure Security Center, Azure Policy), and Azure subscriptions and governance frameworks (Azure Landing Zones).
- Experience managing the Terraform lifecycle: releases, version control, and CI/CD pipeline integrations.
- Foundational understanding of Azure Database Migration Service (DMS); experience migrating Oracle or MySQL databases to Azure SQL is desirable.
- Strong scripting and automation skills.
- Azure certifications preferred: Azure Fundamentals (AZ-900) mandatory; Azure Developer (AZ-204) or Azure Solutions Architect (AZ-305) mandatory; Terraform Associate certification mandatory and must be valid.
- Enterprise-level experience with Azure Landing Zones, VMs, AKS, Traffic Manager, File Storage, and other core cloud services (5+ years preferred).
- Familiarity with cloud security best practices and compliance standards.
- Excellent problem-solving, communication, and collaboration skills.
- Ability to work effectively in Agile and DevOps environments.
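Module release pipelines of the kind described above usually gate on init/validate/plan. As a hedged sketch, here is a tiny Python harness around the Terraform CLI; it assumes terraform is installed, and the module path is illustrative.

```python
# Minimal sketch: validating a Terraform module the way a module-release
# pipeline might (init -> validate -> plan). Assumes the terraform CLI is
# on PATH; the module directory is an invented example.
import subprocess
from pathlib import Path

MODULE_DIR = Path("modules/app-landing-zone")  # illustrative path

def run(*args: str) -> None:
    """Run a command in the module directory, failing fast on errors."""
    print("+", " ".join(args))
    subprocess.run(args, cwd=MODULE_DIR, check=True)

run("terraform", "init", "-backend=false")
run("terraform", "validate")
run("terraform", "plan", "-input=false", "-out=tfplan")
```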

Posted 3 weeks ago

Apply

7.0 - 12.0 years

10 - 14 Lacs

bengaluru

Work from Office

Your future role: Take on a new challenge and apply your data engineering expertise in a cutting-edge field. You'll work alongside collaborative and innovative teammates, and play a key role in enabling data-driven decision-making across the organization by ensuring data availability, quality, and accessibility. Day-to-day, you'll work closely with teams across the business (e.g., Data Scientists, Analysts, and ML Engineers), mentor junior engineers, and contribute to the architecture and design of our data platforms and solutions. You'll specifically take care of designing and developing scalable data pipelines, as well as managing and optimizing object storage systems.

We'll look to you for:
- Designing, developing, and maintaining scalable and efficient data pipelines using tools like Apache NiFi and Apache Airflow.
- Creating robust Python scripts for data ingestion, transformation, and validation.
- Managing and optimizing object storage systems such as Amazon S3, Azure Blob, or Google Cloud Storage.
- Collaborating with Data Scientists and Analysts to understand data requirements and deliver production-ready datasets.
- Implementing data quality checks, monitoring, and alerting mechanisms.
- Ensuring data security, governance, and compliance with industry standards.
- Mentoring junior engineers and promoting best practices in data engineering.

All about you: We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in data engineering or a similar role.
- Strong proficiency in Python and data processing libraries (e.g., Pandas, PySpark).
- Hands-on experience with Apache NiFi for data flow automation.
- Deep understanding of object storage systems and cloud data architectures.
- Proficiency in SQL and experience with both relational and NoSQL databases.
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Exposure to the Data Science ecosystem, including tools like Jupyter, scikit-learn, TensorFlow, or MLflow.
- Experience working in cross-functional teams with Data Scientists and ML Engineers.
- Cloud certifications or relevant technical certifications are a plus.
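Object storage management is a core duty here. A minimal boto3 sketch against Amazon S3 (one of the stores named above); the bucket, keys, and environment-supplied credentials are assumptions.

```python
# Minimal sketch: basic object-storage operations with boto3 (Amazon S3).
# Bucket and key names are illustrative; credentials are assumed to come
# from the environment or an instance role.
import boto3

s3 = boto3.client("s3")

# Land a small object under a raw/ prefix.
s3.put_object(Bucket="example-datalake", Key="raw/2024/events.json",
              Body=b'{"event": "ping"}')

# List what landed under the raw/ prefix.
resp = s3.list_objects_v2(Bucket="example-datalake", Prefix="raw/")
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```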

Posted 3 weeks ago

Apply

7.0 - 12.0 years

8 - 13 Lacs

bengaluru

Work from Office

Your future role: Take on a new challenge and apply your data engineering expertise in a cutting-edge field. You'll work alongside collaborative and innovative teammates, and play a key role in enabling data-driven decision-making across the organization by ensuring data availability, quality, and accessibility. Day-to-day, you'll work closely with teams across the business (e.g., Data Scientists, Analysts, and ML Engineers), mentor junior engineers, and contribute to the architecture and design of our data platforms and solutions. You'll specifically take care of designing and developing scalable data pipelines, as well as managing and optimizing object storage systems.

We'll look to you for:
- Designing, developing, and maintaining scalable and efficient data pipelines using tools like Apache NiFi and Apache Airflow.
- Creating robust Python scripts for data ingestion, transformation, and validation.
- Managing and optimizing object storage systems such as Amazon S3, Azure Blob, or Google Cloud Storage.
- Collaborating with Data Scientists and Analysts to understand data requirements and deliver production-ready datasets.
- Implementing data quality checks, monitoring, and alerting mechanisms.
- Ensuring data security, governance, and compliance with industry standards.
- Mentoring junior engineers and promoting best practices in data engineering.

All about you: We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in data engineering or a similar role.
- Strong proficiency in Python and data processing libraries (e.g., Pandas, PySpark).
- Hands-on experience with Apache NiFi for data flow automation.
- Deep understanding of object storage systems and cloud data architectures.
- Proficiency in SQL and experience with both relational and NoSQL databases.
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Exposure to the Data Science ecosystem, including tools like Jupyter, scikit-learn, TensorFlow, or MLflow.
- Experience working in cross-functional teams with Data Scientists and ML Engineers.
- Cloud certifications or relevant technical certifications are a plus.

Posted 3 weeks ago

Apply

6.0 - 7.0 years

14 - 18 Lacs

kochi

Work from Office

As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Total experience of 6-7 years (4-5 years relevant).
- Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob.
- Ability to use programming languages like Java, Python, Scala, etc., to build pipelines that extract and transform data from a repository to a data consumer.
- Ability to use extract, transform, and load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.

Posted 3 weeks ago

Apply

6.0 - 7.0 years

14 - 18 Lacs

bengaluru

Work from Office

As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Total experience of 6-7 years (4-5 years relevant).
- Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob.
- Ability to use programming languages like Java, Python, Scala, etc., to build pipelines that extract and transform data from a repository to a data consumer.
- Ability to use extract, transform, and load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

bengaluru

Work from Office

Role description / key responsibilities: We are looking for motivated individuals to deliver innovative digital solutions through design, development, deployment, and support. The Senior Integration Software Engineer role involves designing, developing, testing, and maintaining API & integration solutions using Microsoft Azure Integration Services. The developer will collaborate with cross-functional teams to gather requirements, optimize performance, and ensure scalability and reliability. Key skills include proficiency in Azure Integration Services, C#, .NET, RESTful APIs, and relational databases, and implementing best practices for code quality and security.

Required skills and qualifications (required and preferred):
- Design comprehensive integration architectures, lead complex integration projects, and establish integration standards and practices; mentor junior engineers.
- Design, develop, and maintain API & integration solutions using Microsoft Azure Integration Services (e.g., Azure Logic Apps, Azure Functions, Azure API Management) to facilitate communication between different systems.
- Collaborate with cross-functional teams to gather requirements and design integration solutions that meet business needs.
- Implement and manage data integration processes, ensuring data accuracy and consistency across systems.
- Optimize API & integration solutions for performance, scalability, and reliability.
- Troubleshoot and resolve integration issues in a timely manner.
- Design and manage relational database schemas, performance, and queries.
- Utilize data lake technologies (e.g., Azure Data Lake, Azure Blob Storage) to store and process large volumes of data.
- Document API & integration processes, configurations, and best practices.
- Participate in code reviews and contribute to the continuous improvement of development processes.
- Stay current with industry trends and emerging technologies to ensure our integration solutions remain cutting-edge.

Posted 4 weeks ago

Apply
