
32 Azure Search Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

5 - 9 Lacs

Coimbatore

Work from Office

About The Role
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Microsoft Bot Framework
Good-to-have skills: no technology specialization
Minimum experience: 3 years
Educational qualification: BE/B-Tech/M-Tech

Key Responsibilities:
- Contribute to engineering activities, POCs and POV creation.
- Build complex conversational AI solutions.
- Be well versed in large-scale development projects, Azure cloud technology, Microsoft NLP and Cognitive Services, Docker containers and Azure DevOps.

Technical Experience:
- 5 years of experience with MS Bot Framework and MS Bot Composer.
- Power Virtual Agents and Power Automate.
- Azure PaaS: App Services, Storage, Cosmos DB and scaling concepts.
- Docker and containers.
- Azure Cognitive Services and Azure Search.

Professional Attributes:
- Strong problem-solving and analytical skills.
- Good communication skills.
- Able to work independently with little supervision, or as part of a team.
- Able to work and deliver under tight timelines.
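The role above pairs conversational AI with Azure Cognitive Services and Azure Search. As a hedged illustration of the retrieval piece, the sketch below queries a hypothetical FAQ index with the azure-search-documents Python SDK; the endpoint, index name, key and field names (question, answer) are placeholders, not details from the posting.

```python
# Illustrative only: a bot's knowledge-lookup step querying a hypothetical
# Azure Cognitive Search index via the azure-search-documents SDK.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

# Hypothetical service details; a real deployment would read these from configuration.
endpoint = "https://<your-search-service>.search.windows.net"
index_name = "faq-index"
client = SearchClient(endpoint, index_name, AzureKeyCredential("<query-key>"))

def lookup_answer(user_utterance: str, top: int = 3):
    """Return the best-matching FAQ documents for a user's question."""
    results = client.search(search_text=user_utterance, top=top)
    return [{"question": doc["question"], "answer": doc["answer"]} for doc in results]

if __name__ == "__main__":
    for hit in lookup_answer("How do I reset my password?"):
        print(hit["question"], "->", hit["answer"])
```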

Posted 4 days ago

Apply

4.0 - 7.0 years

6 - 9 Lacs

Bengaluru

Work from Office

What this job involves: JLL, an international real estate management company, is seeking a Data Engineer to join our JLL Technologies team. We are looking for self-starters who can work in a diverse, fast-paced environment as part of our Enterprise Data team. The role is responsible for designing and developing data solutions that are strategic for the business, using the latest technologies: Azure Databricks, Python, PySpark, Spark SQL, Azure Functions, Delta Lake and Azure DevOps CI/CD.

Responsibilities
- Design, architect and develop solutions leveraging cloud big data technology to ingest, process and analyze large, disparate data sets and exceed business requirements.
- Design and develop data management and data persistence solutions for application use cases, leveraging relational and non-relational databases and enhancing our data processing capabilities.
- Develop POCs to influence platform architects, product managers and software engineers, validate solution proposals and drive migration.
- Develop data lake solutions to store structured and unstructured data from internal and external sources, and provide technical guidance to help colleagues migrate to the modern technology platform.
- Contribute and adhere to CI/CD processes and development best practices, and strengthen that discipline within the Data Engineering organization.
- Develop systems that ingest, cleanse and normalize diverse datasets, build data pipelines from internal and external sources, and add structure to previously unstructured data.
- Using PySpark and Spark SQL, extract, manipulate and transform data from sources such as databases, data lakes, APIs and files to prepare it for analysis and modeling.
- Build and optimize ETL workflows using Azure Databricks and PySpark, including efficient data processing pipelines, data validation, error handling and performance tuning.
- Perform unit testing, system integration testing and regression testing, and assist with user acceptance testing.
- Articulate business requirements as a technical solution that can be designed and engineered.
- Consult with the business to develop documentation and communication materials that ensure accurate usage and interpretation of JLL data.
- Implement data security best practices, including data encryption, access controls and compliance with data protection regulations; ensure data privacy, confidentiality and integrity throughout the data engineering process.
- Perform the data analysis required to troubleshoot data-related issues and assist in their resolution.

Experience & Education
- Minimum of 4 years of experience as a data developer using Python, PySpark and Spark SQL, with SQL Server experience and knowledge of ETL concepts.
- Bachelor's degree in Information Science, Computer Science, Mathematics, Statistics or a quantitative discipline in science, business or social science.
- Experience with the Azure cloud platform, Databricks and Azure Storage.
- Effective written and verbal communication skills, including technical writing.
- Excellent technical, analytical and organizational skills.

Technical Skills & Competencies
- Experience handling unstructured and semi-structured data, working in a data lake environment, leveraging data streaming and developing data pipelines driven by events/queues.
- Hands-on experience with real-time and near-real-time processing, and ready to code.
- Hands-on experience in PySpark, Databricks and Spark SQL.
- Knowledge of JSON, Parquet and other file formats, and the ability to work effectively with them.
- Knowledge of NoSQL databases such as HBase, MongoDB and Cosmos DB.
- Preferred: cloud experience on Azure or AWS with Python/Spark, Spark Streaming, Azure SQL Server, Cosmos DB/MongoDB, Azure Event Hubs, Azure Data Lake Storage, Azure Search etc.
- Team player; reliable, self-motivated and self-disciplined, capable of executing multiple projects simultaneously in a fast-paced environment while working with cross-functional teams.
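For readers unfamiliar with the PySpark/Databricks ETL work this posting describes, here is a minimal sketch of one such step: read raw files, apply simple validation and typing, aggregate with Spark SQL and write a Delta table. The storage path, column names and target table are assumptions for illustration only.

```python
# Minimal PySpark ETL sketch: read raw files, validate/transform, write a Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lease-etl").getOrCreate()

raw = (spark.read
       .option("header", True)
       .csv("abfss://raw@<storage-account>.dfs.core.windows.net/leases/"))

cleaned = (raw
           .dropDuplicates(["lease_id"])                      # basic de-duplication
           .withColumn("rent", F.col("rent").cast("double"))  # type normalization
           .filter(F.col("rent").isNotNull()))                # simple validation rule

cleaned.createOrReplaceTempView("leases")
summary = spark.sql("""
    SELECT city, COUNT(*) AS lease_count, AVG(rent) AS avg_rent
    FROM leases
    GROUP BY city
""")

# On Databricks, Delta is the default format for managed tables.
summary.write.mode("overwrite").format("delta").saveAsTable("analytics.lease_summary")
```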

Posted 6 days ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Navi Mumbai

Work from Office

We are looking for a Senior Data Engineer who is a self-starter and can work in a diverse, fast-paced environment as part of our Enterprise Data team. This is an individual contributor role responsible for designing and developing data solutions that are strategic for the business and built on the latest technologies and patterns. It is a global role that requires partnering with the broader JLLT team at the country, regional and global level, drawing on in-depth knowledge of data, infrastructure, technologies and data engineering.

As a Data Engineer 2 at JLL Technologies, you will:
- Contribute to the design of information infrastructure and data management processes that move the organization toward a more sophisticated, agile and robust target-state data architecture.
- Develop systems that ingest, cleanse and normalize diverse datasets, build data pipelines from internal and external sources, and add structure to previously unstructured data.
- Develop a good understanding of how data flows and is stored across the organization's applications, such as CRM, broker and sales tools, Finance and HR.
- Design and develop data management and data persistence solutions for application use cases, leveraging relational and non-relational databases and enhancing our data processing capabilities.
- Develop POCs to influence platform architects, product managers and software engineers, validate solution proposals and drive migration.

What we are looking for:
- 4+ years of overall work experience and a bachelor's degree in Information Science, Computer Science, Mathematics, Statistics or a quantitative discipline in science, business or social science.
- Minimum of 3 years of experience as a data developer using Python, Kafka, Spark Streaming, Azure SQL Server, Cosmos DB/MongoDB, Azure Event Hubs, Azure Data Lake Storage, Azure Search etc.
- Excellent technical, analytical and organizational skills.
- Effective written and verbal communication skills, including technical writing.
- A hands-on engineering lead who is curious about technology, adapts quickly to change, and understands supporting technologies such as cloud computing (AWS, or preferably Azure), microservices, streaming technologies, networking and security.
- Hands-on experience building data pipelines in the cloud.
- Experience working with databases, especially SQL Server.
- Experience designing and developing data management and data persistence solutions leveraging relational and non-relational databases.
- Experience handling unstructured data, working in a data lake environment, leveraging data streaming and developing data pipelines driven by events/queues.
- Team player; reliable, self-motivated and self-disciplined, capable of executing multiple projects simultaneously in a fast-paced environment while working with cross-functional teams.
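The streaming stack named here (Kafka, Spark Streaming, Event Hubs, ADLS) typically comes together roughly as in the sketch below: a Structured Streaming job reading from Azure Event Hubs through its Kafka-compatible endpoint and appending to a Delta table. The namespace, hub name, connection string, schema and paths are placeholders, not JLL specifics.

```python
# Sketch of an event-driven pipeline: Spark Structured Streaming reading from Azure
# Event Hubs via its Kafka-compatible endpoint (port 9093, SASL_SSL/PLAIN) into Delta.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("events-to-delta").getOrCreate()

schema = StructType([
    StructField("sensor_id", StringType()),
    StructField("reading", DoubleType()),
    StructField("ts", StringType()),
])

conn = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")
       .option("subscribe", "<event-hub-name>")          # the hub acts as the Kafka topic
       .option("kafka.security.protocol", "SASL_SSL")
       .option("kafka.sasl.mechanism", "PLAIN")
       .option("kafka.sasl.jaas.config",
               'org.apache.kafka.common.security.plain.PlainLoginModule required '
               f'username="$ConnectionString" password="{conn}";')
       .load())

events = raw.select(F.from_json(F.col("value").cast("string"), schema).alias("e")).select("e.*")

(events.writeStream
 .format("delta")
 .option("checkpointLocation", "/mnt/checkpoints/events")
 .outputMode("append")
 .start("/mnt/delta/events"))
```

Using the Kafka surface keeps the job portable between Event Hubs and a plain Kafka cluster.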

Posted 6 days ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office

We are looking for a Senior Data Engineer who is a self-starter and can work in a diverse, fast-paced environment as part of our Enterprise Data team. This is an individual contributor role responsible for designing and developing data solutions that are strategic for the business and built on the latest technologies and patterns. It is a global role that requires partnering with the broader JLLT team at the country, regional and global level, drawing on in-depth knowledge of data, infrastructure, technologies and data engineering.

As a Data Engineer 2 at JLL Technologies, you will:
- Contribute to the design of information infrastructure and data management processes that move the organization toward a more sophisticated, agile and robust target-state data architecture.
- Develop systems that ingest, cleanse and normalize diverse datasets, build data pipelines from internal and external sources, and add structure to previously unstructured data.
- Develop a good understanding of how data flows and is stored across the organization's applications, such as CRM, broker and sales tools, Finance and HR.
- Design and develop data management and data persistence solutions for application use cases, leveraging relational and non-relational databases and enhancing our data processing capabilities.
- Develop POCs to influence platform architects, product managers and software engineers, validate solution proposals and drive migration.

What we are looking for:
- 4+ years of overall work experience and a bachelor's degree in Information Science, Computer Science, Mathematics, Statistics or a quantitative discipline in science, business or social science.
- Minimum of 3 years of experience as a data developer using Python, Kafka, Spark Streaming, Azure SQL Server, Cosmos DB/MongoDB, Azure Event Hubs, Azure Data Lake Storage, Azure Search etc.
- Excellent technical, analytical and organizational skills.
- Effective written and verbal communication skills, including technical writing.
- A hands-on engineering lead who is curious about technology, adapts quickly to change, and understands supporting technologies such as cloud computing (AWS, or preferably Azure), microservices, streaming technologies, networking and security.
- Hands-on experience building data pipelines in the cloud.
- Experience working with databases, especially SQL Server.
- Experience designing and developing data management and data persistence solutions leveraging relational and non-relational databases.
- Experience handling unstructured data, working in a data lake environment, leveraging data streaming and developing data pipelines driven by events/queues.
- Team player; reliable, self-motivated and self-disciplined, capable of executing multiple projects simultaneously in a fast-paced environment while working with cross-functional teams.

Posted 6 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Ranchi

Work from Office

We are looking for a strategic thinker with the ability to grasp new technologies and to innovate, develop and nurture new solutions: a self-starter who can work in a diverse, fast-paced environment to support, maintain and advance the capabilities of the unified data platform. This is a global role that requires partnering with the broader JLLT team at the country, regional and global level, drawing on in-depth knowledge of cloud infrastructure technologies and platform engineering.

Responsibilities
- Work with the application teams to prioritize new requests for functionality, both user-facing (e.g., the ability to ingest IoT data, subscription-based consumption) and internal (e.g., monitoring and alerting based on application performance, automated testing frameworks).
- Manage the respective support queues (e.g., Ingest, Prepare, Storage and Consume). Note: agreed-upon SLAs will be established after the burn-in period.
- Manage the backlog via effective sprint planning based on feedback from the application teams.
- Mentor and coach the application teams on tools, technology and design patterns.
- Ensure that the production environment is well built and that there is a clear escalation path for production issues.
- Ensure the solution architecture meets JLL's requirements, including but not limited to cloud spend, scalability and performance.
- Develop infrastructure that is scalable, reliable and monitored.
- Build relationships with cloud providers to take advantage of their most appropriate technology offerings.
- Collaborate with application team leads to ensure the application teams' needs are met through the CI/CD framework, component monitoring and stats, and incident escalation.
- Lead teams through discovery and architecture workshops, influence client architects and IT personnel, and guide the other architects working with you.
- Adapt communications and approaches to conclude technical scope discussions with various partners, resulting in common agreements.
- Deliver an optimized infrastructure services design leveraging public, private and hybrid cloud architectures and services.
- Act as the subject matter and implementation expert for the client on the technical architecture and implementation of the proposed solution using cloud services.
- Instill an "infrastructure as code" mentality across the Platform team.
- Create and maintain incident management requests to the product/engineering group.
- Analyse complex application landscapes, anticipate potential problems and future trends, and assess potential solutions, impacts and risks to propose a cloud roadmap and solution architecture.
- Develop and implement cloud architecture solutions based on AWS, Azure or GCP when assigned to delivery projects.
- Analyse client requirements and propose application modernization, migrations and greenfield implementations.
- Experience implementing and deploying a DevOps-based, end-to-end cloud application.

Sounds like you? To apply you need:

Experience & Education
- Bachelor's degree in Information Science, Computer Science, Mathematics, Statistics or a quantitative discipline in science, business or social science.
- Experience working with a cloud delivery team to provide a technical solutions and services roadmap for customers.
- Knowledge of creating IaaS and PaaS cloud solutions on the Azure platform that meet customer needs for scalability, reliability and performance.

Technical Skills & Competencies
- Must have experience in application development on Azure.
- Must possess a business-level understanding of enterprise application systems to drive innovation and transformation.
- Very good understanding of cloud-native services and how they map to application requirements.
- Minimum of 3-5 years of relevant experience with API ingestion, file ingestion, batch transformation, metadata management, monitoring, pub/sub consumption, RDBMS ingestion and real-time transformation.
- Minimum of 3-5 years using the following technologies or equivalent: GitHub Actions, Azure DevOps, Azure Functions, Azure Batch using Python, C# or NodeJS, Azure APIM, Azure Event Hubs, Azure Data Lake Storage (Gen2), Azure Monitor, Azure Table Storage, Azure Databricks, Azure SQL Database, Azure Search, Azure Cosmos DB and Azure SignalR.
- Work with the infrastructure team to deploy applications to the cloud using blue-green or brownfield deployments.
- Ability to provide holistic, right-sized cloud solutions that address scalability, availability, service continuity (DR), performance and security requirements.
- Help customers by supporting scalable and highly available applications that leverage cloud services.
- Scripting and automation skills using the CLI, Python and PowerShell.
- Clear understanding of IAM roles and policies and how to attach them to business entities and users.
- Deep development knowledge of cloud architecture and design patterns.
- Design understanding of and experience with RDBMS, NoSQL and RDS.
- Exposure to PaaS technologies and containers such as Docker and Kubernetes.
- Understanding of the costs of the different cloud services.
- Experience with Azure cloud infrastructure.
- Experience with CI/CD tools such as Azure DevOps, GitHub and GitHub Actions.
- Understanding of application architecture and enterprise architecture is a must.

What we can do for you: You'll join an entrepreneurial, inclusive culture, one where we succeed together, across the desk and around the globe, and where like-minded people work naturally together to achieve great things. Our Total Rewards program reflects our commitment to helping you achieve your ambitions in career, recognition, well-being, benefits and pay. Join us to develop your strengths and enjoy a fulfilling career full of varied experiences. Keep those ambitions in sight and imagine where JLL can take you. Apply today!

Location: On-site, Bengaluru, KA. Scheduled weekly hours: 40.

If this job description resonates with you, we encourage you to apply even if you don't meet all of the requirements. We're interested in getting to know you and what you bring to the table!

JLL Privacy Notice: Jones Lang LaSalle (JLL), together with its subsidiaries and affiliates, is a leading global provider of real estate and investment management services. We take our responsibility to protect the personal information provided to us seriously. Generally, the personal information we collect from you is for the purposes of processing in connection with JLL's recruitment process. We endeavour to keep your personal information secure with an appropriate level of security and keep it only for as long as we need it for legitimate business or legal reasons, after which we will delete it safely and securely. For more information about how JLL processes your personal data, please view our privacy notice. For additional details please see our career site pages for each country. For candidates in the United States, please see a full copy of our Equal Employment Opportunity and Affirmative Action policy. Jones Lang LaSalle (JLL) is an Equal Opportunity Employer and is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, including the online application and/or overall selection process, you may contact us; the accommodation request contact is only for that purpose. Please direct any other general recruiting inquiries through the "I want to work for JLL" page.
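As a rough illustration of the Azure Functions and Azure Table Storage work listed above, the sketch below shows a small HTTP-triggered ingestion function using the Python v2 programming model and the azure-data-tables SDK. The route, table name, connection-string setting and payload fields are assumptions, not details from the posting.

```python
# Hedged sketch: an HTTP ingestion endpoint on Azure Functions (Python v2 model)
# that accepts a JSON payload and appends it to an Azure Table Storage table.
import json
import os

import azure.functions as func
from azure.data.tables import TableServiceClient

app = func.FunctionApp()

@app.route(route="ingest", auth_level=func.AuthLevel.FUNCTION)
def ingest(req: func.HttpRequest) -> func.HttpResponse:
    try:
        payload = req.get_json()
    except ValueError:
        return func.HttpResponse("Body must be JSON", status_code=400)

    # "STORAGE_CONNECTION" and the "ingestlog" table are illustrative names.
    service = TableServiceClient.from_connection_string(os.environ["STORAGE_CONNECTION"])
    table = service.get_table_client("ingestlog")
    table.create_entity({
        "PartitionKey": payload.get("source", "unknown"),
        "RowKey": payload["id"],              # assumes the caller supplies an id
        "body": json.dumps(payload),
    })
    return func.HttpResponse("accepted", status_code=202)
```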

Posted 6 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office

We are looking for a strategic thinker with the ability to grasp new technologies and to innovate, develop and nurture new solutions: a self-starter who can work in a diverse, fast-paced environment to support, maintain and advance the capabilities of the unified data platform. This is a global role that requires partnering with the broader JLLT team at the country, regional and global level, drawing on in-depth knowledge of cloud infrastructure technologies and platform engineering.

Responsibilities
- Work with the application teams to prioritize new requests for functionality, both user-facing (e.g., the ability to ingest IoT data, subscription-based consumption) and internal (e.g., monitoring and alerting based on application performance, automated testing frameworks).
- Manage the respective support queues (e.g., Ingest, Prepare, Storage and Consume). Note: agreed-upon SLAs will be established after the burn-in period.
- Manage the backlog via effective sprint planning based on feedback from the application teams.
- Mentor and coach the application teams on tools, technology and design patterns.
- Ensure that the production environment is well built and that there is a clear escalation path for production issues.
- Ensure the solution architecture meets JLL's requirements, including but not limited to cloud spend, scalability and performance.
- Develop infrastructure that is scalable, reliable and monitored.
- Build relationships with cloud providers to take advantage of their most appropriate technology offerings.
- Collaborate with application team leads to ensure the application teams' needs are met through the CI/CD framework, component monitoring and stats, and incident escalation.
- Lead teams through discovery and architecture workshops, influence client architects and IT personnel, and guide the other architects working with you.
- Adapt communications and approaches to conclude technical scope discussions with various partners, resulting in common agreements.
- Deliver an optimized infrastructure services design leveraging public, private and hybrid cloud architectures and services.
- Act as the subject matter and implementation expert for the client on the technical architecture and implementation of the proposed solution using cloud services.
- Instill an "infrastructure as code" mentality across the Platform team.
- Create and maintain incident management requests to the product/engineering group.
- Analyse complex application landscapes, anticipate potential problems and future trends, and assess potential solutions, impacts and risks to propose a cloud roadmap and solution architecture.
- Develop and implement cloud architecture solutions based on AWS, Azure or GCP when assigned to delivery projects.
- Analyse client requirements and propose application modernization, migrations and greenfield implementations.
- Experience implementing and deploying a DevOps-based, end-to-end cloud application.

Sounds like you? To apply you need:

Experience & Education
- Bachelor's degree in Information Science, Computer Science, Mathematics, Statistics or a quantitative discipline in science, business or social science.
- Experience working with a cloud delivery team to provide a technical solutions and services roadmap for customers.
- Knowledge of creating IaaS and PaaS cloud solutions on the Azure platform that meet customer needs for scalability, reliability and performance.

Technical Skills & Competencies
- Must have experience in application development on Azure.
- Must possess a business-level understanding of enterprise application systems to drive innovation and transformation.
- Very good understanding of cloud-native services and how they map to application requirements.
- Minimum of 3-5 years of relevant experience with API ingestion, file ingestion, batch transformation, metadata management, monitoring, pub/sub consumption, RDBMS ingestion and real-time transformation.
- Minimum of 3-5 years using the following technologies or equivalent: GitHub Actions, Azure DevOps, Azure Functions, Azure Batch using Python, C# or NodeJS, Azure APIM, Azure Event Hubs, Azure Data Lake Storage (Gen2), Azure Monitor, Azure Table Storage, Azure Databricks, Azure SQL Database, Azure Search, Azure Cosmos DB and Azure SignalR.
- Work with the infrastructure team to deploy applications to the cloud using blue-green or brownfield deployments.
- Ability to provide holistic, right-sized cloud solutions that address scalability, availability, service continuity (DR), performance and security requirements.
- Help customers by supporting scalable and highly available applications that leverage cloud services.
- Scripting and automation skills using the CLI, Python and PowerShell.
- Clear understanding of IAM roles and policies and how to attach them to business entities and users.
- Deep development knowledge of cloud architecture and design patterns.
- Design understanding of and experience with RDBMS, NoSQL and RDS.
- Exposure to PaaS technologies and containers such as Docker and Kubernetes.
- Understanding of the costs of the different cloud services.
- Experience with Azure cloud infrastructure.
- Experience with CI/CD tools such as Azure DevOps, GitHub and GitHub Actions.
- Understanding of application architecture and enterprise architecture is a must.

What we can do for you: You'll join an entrepreneurial, inclusive culture, one where we succeed together, across the desk and around the globe, and where like-minded people work naturally together to achieve great things. Our Total Rewards program reflects our commitment to helping you achieve your ambitions in career, recognition, well-being, benefits and pay. Join us to develop your strengths and enjoy a fulfilling career full of varied experiences. Keep those ambitions in sight and imagine where JLL can take you. Apply today!

Location: On-site, Bengaluru, KA. Scheduled weekly hours: 40.

If this job description resonates with you, we encourage you to apply even if you don't meet all of the requirements. We're interested in getting to know you and what you bring to the table!

JLL Privacy Notice: Jones Lang LaSalle (JLL), together with its subsidiaries and affiliates, is a leading global provider of real estate and investment management services. We take our responsibility to protect the personal information provided to us seriously. Generally, the personal information we collect from you is for the purposes of processing in connection with JLL's recruitment process. We endeavour to keep your personal information secure with an appropriate level of security and keep it only for as long as we need it for legitimate business or legal reasons, after which we will delete it safely and securely. For more information about how JLL processes your personal data, please view our privacy notice. For additional details please see our career site pages for each country. For candidates in the United States, please see a full copy of our Equal Employment Opportunity and Affirmative Action policy. Jones Lang LaSalle (JLL) is an Equal Opportunity Employer and is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, including the online application and/or overall selection process, you may contact us; the accommodation request contact is only for that purpose. Please direct any other general recruiting inquiries through the "I want to work for JLL" page.

Posted 6 days ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Chennai

Work from Office

We are looking for a Senior Data Engineer who is a self-starter and can work in a diverse, fast-paced environment as part of our Enterprise Data team. This is an individual contributor role responsible for designing and developing data solutions that are strategic for the business and built on the latest technologies and patterns. It is a global role that requires partnering with the broader JLLT team at the country, regional and global level, drawing on in-depth knowledge of data, infrastructure, technologies and data engineering.

As a Data Engineer 2 at JLL Technologies, you will:
- Contribute to the design of information infrastructure and data management processes that move the organization toward a more sophisticated, agile and robust target-state data architecture.
- Develop systems that ingest, cleanse and normalize diverse datasets, build data pipelines from internal and external sources, and add structure to previously unstructured data.
- Develop a good understanding of how data flows and is stored across the organization's applications, such as CRM, broker and sales tools, Finance and HR.
- Design and develop data management and data persistence solutions for application use cases, leveraging relational and non-relational databases and enhancing our data processing capabilities.
- Develop POCs to influence platform architects, product managers and software engineers, validate solution proposals and drive migration.

What we are looking for:
- 4+ years of overall work experience and a bachelor's degree in Information Science, Computer Science, Mathematics, Statistics or a quantitative discipline in science, business or social science.
- Minimum of 3 years of experience as a data developer using Python, Kafka, Spark Streaming, Azure SQL Server, Cosmos DB/MongoDB, Azure Event Hubs, Azure Data Lake Storage, Azure Search etc.
- Excellent technical, analytical and organizational skills.
- Effective written and verbal communication skills, including technical writing.
- A hands-on engineering lead who is curious about technology, adapts quickly to change, and understands supporting technologies such as cloud computing (AWS, or preferably Azure), microservices, streaming technologies, networking and security.
- Hands-on experience building data pipelines in the cloud.
- Experience working with databases, especially SQL Server.
- Experience designing and developing data management and data persistence solutions leveraging relational and non-relational databases.
- Experience handling unstructured data, working in a data lake environment, leveraging data streaming and developing data pipelines driven by events/queues.
- Team player; reliable, self-motivated and self-disciplined, capable of executing multiple projects simultaneously in a fast-paced environment while working with cross-functional teams.

Posted 6 days ago

Apply

4.0 - 7.0 years

6 - 9 Lacs

Thane

Work from Office

What this job involves: JLL, an international real estate management company, is seeking a Data Engineer to join our JLL Technologies team. We are looking for self-starters who can work in a diverse, fast-paced environment as part of our Enterprise Data team. The role is responsible for designing and developing data solutions that are strategic for the business, using the latest technologies: Azure Databricks, Python, PySpark, Spark SQL, Azure Functions, Delta Lake and Azure DevOps CI/CD.

Responsibilities
- Design, architect and develop solutions leveraging cloud big data technology to ingest, process and analyze large, disparate data sets and exceed business requirements.
- Design and develop data management and data persistence solutions for application use cases, leveraging relational and non-relational databases and enhancing our data processing capabilities.
- Develop POCs to influence platform architects, product managers and software engineers, validate solution proposals and drive migration.
- Develop data lake solutions to store structured and unstructured data from internal and external sources, and provide technical guidance to help colleagues migrate to the modern technology platform.
- Contribute and adhere to CI/CD processes and development best practices, and strengthen that discipline within the Data Engineering organization.
- Develop systems that ingest, cleanse and normalize diverse datasets, build data pipelines from internal and external sources, and add structure to previously unstructured data.
- Using PySpark and Spark SQL, extract, manipulate and transform data from sources such as databases, data lakes, APIs and files to prepare it for analysis and modeling.
- Build and optimize ETL workflows using Azure Databricks and PySpark, including efficient data processing pipelines, data validation, error handling and performance tuning.
- Perform unit testing, system integration testing and regression testing, and assist with user acceptance testing.
- Articulate business requirements as a technical solution that can be designed and engineered.
- Consult with the business to develop documentation and communication materials that ensure accurate usage and interpretation of JLL data.
- Implement data security best practices, including data encryption, access controls and compliance with data protection regulations; ensure data privacy, confidentiality and integrity throughout the data engineering process.
- Perform the data analysis required to troubleshoot data-related issues and assist in their resolution.

Experience & Education
- Minimum of 4 years of experience as a data developer using Python, PySpark and Spark SQL, with SQL Server experience and knowledge of ETL concepts.
- Bachelor's degree in Information Science, Computer Science, Mathematics, Statistics or a quantitative discipline in science, business or social science.
- Experience with the Azure cloud platform, Databricks and Azure Storage.
- Effective written and verbal communication skills, including technical writing.
- Excellent technical, analytical and organizational skills.

Technical Skills & Competencies
- Experience handling unstructured and semi-structured data, working in a data lake environment, leveraging data streaming and developing data pipelines driven by events/queues.
- Hands-on experience with real-time and near-real-time processing, and ready to code.
- Hands-on experience in PySpark, Databricks and Spark SQL.
- Knowledge of JSON, Parquet and other file formats, and the ability to work effectively with them.
- Knowledge of NoSQL databases such as HBase, MongoDB and Cosmos DB.
- Preferred: cloud experience on Azure or AWS with Python/Spark, Spark Streaming, Azure SQL Server, Cosmos DB/MongoDB, Azure Event Hubs, Azure Data Lake Storage, Azure Search etc.
- Team player; reliable, self-motivated and self-disciplined, capable of executing multiple projects simultaneously in a fast-paced environment while working with cross-functional teams.

Posted 6 days ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Faridabad

Work from Office

Senior Software Engineer, Software Engineering, JLL Technologies Centre of Expertise (JLLT CoE), Bengaluru (on-site, full time, requisition ID REQ388907)

What this job involves:

About the role (#JLLTechAmbitions): The JLL Digital Products team aims to bring successful technology-based products to market in a high-growth environment. The team's mission is to accelerate technology adoption in commercial real estate by bringing creative, innovative and technical solutions to large, complex problems for our clients. The Senior Software Engineer is a key position on the Digital Products team, responsible for defining the key framework and implementing various components of JLL's products.

Responsibilities: This is an individual contributor role responsible for designing and developing product features that are strategic for the business and built on the latest technologies and patterns. It is a global role that requires partnering with both the global engineering teams and the Product team.
- Design and implement high-performance, scalable and maintainable software solutions using Java and Spring Boot.
- Develop and optimize GraphQL APIs to ensure efficient data querying and manipulation across our platforms.
- Work with big data technologies, particularly Databricks and PySpark, to process and analyze large datasets.
- Collaborate with product managers and UX designers to translate business requirements into technical specifications.
- Write clean, maintainable and well-documented code.
- Participate in code reviews and contribute to improving our development processes.
- Troubleshoot, debug and upgrade existing systems.
- Stay abreast of emerging technologies and industry trends in proptech and software development; evaluate new technologies and frameworks that could benefit our products and services.
- Contribute to the company's technology roadmap by proposing innovative solutions to complex problems.
- Interface with internal colleagues and external professionals to determine requirements, anticipate future needs, and identify areas of opportunity to drive data development.
- Develop a good understanding of how data flows and is stored across the organization's applications, such as CRM, Broker & Sales, Finance, HR, MDM, ODS, Data Lake and EDW.
- Develop POCs to influence platform architects, product managers and software engineers, validate solution proposals and drive migration.

Sounds like you? To apply you need:

Experience & Education
- Bachelor's degree in Information Science, Computer Science, Mathematics, Statistics or a quantitative discipline in science, business or social science.
- A hands-on engineering lead who is curious about technology, adapts quickly to change, and understands supporting technologies such as cloud computing (AWS, or preferably Azure), microservices, streaming technologies, networking and security.

Technical Skills & Competencies
- 7+ years of experience as a software developer using Java, Spring Boot, GraphQL, Python/Spark, Databricks, Azure Event Hubs, Azure Data Lake Storage, Azure Search etc.
- Hands-on experience building data pipelines in the cloud; well versed in CI/CD and DevOps processes.
- Experience with React for front-end development is a plus.
- Familiarity with event-driven architectures and message queues (e.g., Kafka, RabbitMQ).
- Experience handling unstructured data, working in a data lake environment, leveraging data streaming and developing data pipelines driven by events/queues.
- Experience building and maintaining a data warehouse or data lake in a production environment, with efficient ETL design, implementation and maintenance.
- Team player; reliable, self-motivated and self-disciplined, capable of executing multiple projects simultaneously in a fast-paced environment while working with cross-functional teams.

What we can do for you: At JLL, we make sure you become the best version of yourself by helping you realise your full potential in an entrepreneurial and inclusive work environment. If you have a passion for learning and adopting new technologies, JLL will continuously provide you with platforms to enrich your technical domains. We will empower your ambitions through our dedicated Total Rewards Program and a competitive pay and benefits package. Apply today!

Location: On-site, Bengaluru, KA. Scheduled weekly hours: 40.

JLL Privacy Notice: Jones Lang LaSalle (JLL), together with its subsidiaries and affiliates, is a leading global provider of real estate and investment management services. We take our responsibility to protect the personal information provided to us seriously. Generally, the personal information we collect from you is for the purposes of processing in connection with JLL's recruitment process. We endeavour to keep your personal information secure with an appropriate level of security and keep it only for as long as we need it for legitimate business or legal reasons, after which we will delete it safely and securely. For more information about how JLL processes your personal data, please view our privacy notice. For additional details please see our career site pages for each country. For candidates in the United States, please see a full copy of our Equal Employment Opportunity and Affirmative Action policy. Jones Lang LaSalle (JLL) is an Equal Opportunity Employer and is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, including the online application and/or overall selection process, you may contact us; the accommodation request contact is only for that purpose. Please direct any other general recruiting inquiries through the "I want to work for JLL" page.
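This role is a Java/Spring Boot position, but the GraphQL querying pattern it centres on is easy to show from the client side. The sketch below posts a query to a hypothetical /graphql endpoint with Python's requests library; the URL, schema fields and token are made up for illustration, and a Spring Boot service would simply expose the equivalent endpoint.

```python
# Client-side illustration of the GraphQL query pattern: one request fetches exactly
# the fields the caller needs, instead of several REST round trips.
import requests

GRAPHQL_URL = "https://api.example.com/graphql"   # hypothetical endpoint

query = """
query PropertySummary($city: String!) {
  properties(city: $city) {
    id
    name
    occupancyRate
  }
}
"""

resp = requests.post(
    GRAPHQL_URL,
    json={"query": query, "variables": {"city": "Bengaluru"}},
    headers={"Authorization": "Bearer <token>"},
    timeout=30,
)
resp.raise_for_status()
for prop in resp.json()["data"]["properties"]:
    print(prop["id"], prop["name"], prop["occupancyRate"])
```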

Posted 6 days ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Gurugram

Work from Office

Senior Software Engineer, Software Engineering, JLL Technologies Centre of Expertise (JLLT CoE), Bengaluru (on-site, full time, requisition ID REQ388907)

What this job involves:

About the role (#JLLTechAmbitions): The JLL Digital Products team aims to bring successful technology-based products to market in a high-growth environment. The team's mission is to accelerate technology adoption in commercial real estate by bringing creative, innovative and technical solutions to large, complex problems for our clients. The Senior Software Engineer is a key position on the Digital Products team, responsible for defining the key framework and implementing various components of JLL's products.

Responsibilities: This is an individual contributor role responsible for designing and developing product features that are strategic for the business and built on the latest technologies and patterns. It is a global role that requires partnering with both the global engineering teams and the Product team.
- Design and implement high-performance, scalable and maintainable software solutions using Java and Spring Boot.
- Develop and optimize GraphQL APIs to ensure efficient data querying and manipulation across our platforms.
- Work with big data technologies, particularly Databricks and PySpark, to process and analyze large datasets.
- Collaborate with product managers and UX designers to translate business requirements into technical specifications.
- Write clean, maintainable and well-documented code.
- Participate in code reviews and contribute to improving our development processes.
- Troubleshoot, debug and upgrade existing systems.
- Stay abreast of emerging technologies and industry trends in proptech and software development; evaluate new technologies and frameworks that could benefit our products and services.
- Contribute to the company's technology roadmap by proposing innovative solutions to complex problems.
- Interface with internal colleagues and external professionals to determine requirements, anticipate future needs, and identify areas of opportunity to drive data development.
- Develop a good understanding of how data flows and is stored across the organization's applications, such as CRM, Broker & Sales, Finance, HR, MDM, ODS, Data Lake and EDW.
- Develop POCs to influence platform architects, product managers and software engineers, validate solution proposals and drive migration.

Sounds like you? To apply you need:

Experience & Education
- Bachelor's degree in Information Science, Computer Science, Mathematics, Statistics or a quantitative discipline in science, business or social science.
- A hands-on engineering lead who is curious about technology, adapts quickly to change, and understands supporting technologies such as cloud computing (AWS, or preferably Azure), microservices, streaming technologies, networking and security.

Technical Skills & Competencies
- 7+ years of experience as a software developer using Java, Spring Boot, GraphQL, Python/Spark, Databricks, Azure Event Hubs, Azure Data Lake Storage, Azure Search etc.
- Hands-on experience building data pipelines in the cloud; well versed in CI/CD and DevOps processes.
- Experience with React for front-end development is a plus.
- Familiarity with event-driven architectures and message queues (e.g., Kafka, RabbitMQ).
- Experience handling unstructured data, working in a data lake environment, leveraging data streaming and developing data pipelines driven by events/queues.
- Experience building and maintaining a data warehouse or data lake in a production environment, with efficient ETL design, implementation and maintenance.
- Team player; reliable, self-motivated and self-disciplined, capable of executing multiple projects simultaneously in a fast-paced environment while working with cross-functional teams.

What we can do for you: At JLL, we make sure you become the best version of yourself by helping you realise your full potential in an entrepreneurial and inclusive work environment. If you have a passion for learning and adopting new technologies, JLL will continuously provide you with platforms to enrich your technical domains. We will empower your ambitions through our dedicated Total Rewards Program and a competitive pay and benefits package. Apply today!

Location: On-site, Bengaluru, KA. Scheduled weekly hours: 40.

JLL Privacy Notice: Jones Lang LaSalle (JLL), together with its subsidiaries and affiliates, is a leading global provider of real estate and investment management services. We take our responsibility to protect the personal information provided to us seriously. Generally, the personal information we collect from you is for the purposes of processing in connection with JLL's recruitment process. We endeavour to keep your personal information secure with an appropriate level of security and keep it only for as long as we need it for legitimate business or legal reasons, after which we will delete it safely and securely. For more information about how JLL processes your personal data, please view our privacy notice. For additional details please see our career site pages for each country. For candidates in the United States, please see a full copy of our Equal Employment Opportunity and Affirmative Action policy. Jones Lang LaSalle (JLL) is an Equal Opportunity Employer and is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, including the online application and/or overall selection process, you may contact us; the accommodation request contact is only for that purpose. Please direct any other general recruiting inquiries through the "I want to work for JLL" page.

Posted 6 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Visakhapatnam

Work from Office

We are looking for a strategic thinker with the ability to grasp new technologies and to innovate, develop and nurture new solutions: a self-starter who can work in a diverse, fast-paced environment to support, maintain and advance the capabilities of the unified data platform. This is a global role that requires partnering with the broader JLLT team at the country, regional and global level, drawing on in-depth knowledge of cloud infrastructure technologies and platform engineering.

Responsibilities
- Work with the application teams to prioritize new requests for functionality, both user-facing (e.g., the ability to ingest IoT data, subscription-based consumption) and internal (e.g., monitoring and alerting based on application performance, automated testing frameworks).
- Manage the respective support queues (e.g., Ingest, Prepare, Storage and Consume). Note: agreed-upon SLAs will be established after the burn-in period.
- Manage the backlog via effective sprint planning based on feedback from the application teams.
- Mentor and coach the application teams on tools, technology and design patterns.
- Ensure that the production environment is well built and that there is a clear escalation path for production issues.
- Ensure the solution architecture meets JLL's requirements, including but not limited to cloud spend, scalability and performance.
- Develop infrastructure that is scalable, reliable and monitored.
- Build relationships with cloud providers to take advantage of their most appropriate technology offerings.
- Collaborate with application team leads to ensure the application teams' needs are met through the CI/CD framework, component monitoring and stats, and incident escalation.
- Lead teams through discovery and architecture workshops, influence client architects and IT personnel, and guide the other architects working with you.
- Adapt communications and approaches to conclude technical scope discussions with various partners, resulting in common agreements.
- Deliver an optimized infrastructure services design leveraging public, private and hybrid cloud architectures and services.
- Act as the subject matter and implementation expert for the client on the technical architecture and implementation of the proposed solution using cloud services.
- Instill an "infrastructure as code" mentality across the Platform team.
- Create and maintain incident management requests to the product/engineering group.
- Analyse complex application landscapes, anticipate potential problems and future trends, and assess potential solutions, impacts and risks to propose a cloud roadmap and solution architecture.
- Develop and implement cloud architecture solutions based on AWS, Azure or GCP when assigned to delivery projects.
- Analyse client requirements and propose application modernization, migrations and greenfield implementations.
- Experience implementing and deploying a DevOps-based, end-to-end cloud application.

Sounds like you? To apply you need:

Experience & Education
- Bachelor's degree in Information Science, Computer Science, Mathematics, Statistics or a quantitative discipline in science, business or social science.
- Experience working with a cloud delivery team to provide a technical solutions and services roadmap for customers.
- Knowledge of creating IaaS and PaaS cloud solutions on the Azure platform that meet customer needs for scalability, reliability and performance.

Technical Skills & Competencies
- Must have experience in application development on Azure.
- Must possess a business-level understanding of enterprise application systems to drive innovation and transformation.
- Very good understanding of cloud-native services and how they map to application requirements.
- Minimum of 3-5 years of relevant experience with API ingestion, file ingestion, batch transformation, metadata management, monitoring, pub/sub consumption, RDBMS ingestion and real-time transformation.
- Minimum of 3-5 years using the following technologies or equivalent: GitHub Actions, Azure DevOps, Azure Functions, Azure Batch using Python, C# or NodeJS, Azure APIM, Azure Event Hubs, Azure Data Lake Storage (Gen2), Azure Monitor, Azure Table Storage, Azure Databricks, Azure SQL Database, Azure Search, Azure Cosmos DB and Azure SignalR.
- Work with the infrastructure team to deploy applications to the cloud using blue-green or brownfield deployments.
- Ability to provide holistic, right-sized cloud solutions that address scalability, availability, service continuity (DR), performance and security requirements.
- Help customers by supporting scalable and highly available applications that leverage cloud services.
- Scripting and automation skills using the CLI, Python and PowerShell.
- Clear understanding of IAM roles and policies and how to attach them to business entities and users.
- Deep development knowledge of cloud architecture and design patterns.
- Design understanding of and experience with RDBMS, NoSQL and RDS.
- Exposure to PaaS technologies and containers such as Docker and Kubernetes.
- Understanding of the costs of the different cloud services.
- Experience with Azure cloud infrastructure.
- Experience with CI/CD tools such as Azure DevOps, GitHub and GitHub Actions.
- Understanding of application architecture and enterprise architecture is a must.

What we can do for you: You'll join an entrepreneurial, inclusive culture, one where we succeed together, across the desk and around the globe, and where like-minded people work naturally together to achieve great things. Our Total Rewards program reflects our commitment to helping you achieve your ambitions in career, recognition, well-being, benefits and pay. Join us to develop your strengths and enjoy a fulfilling career full of varied experiences. Keep those ambitions in sight and imagine where JLL can take you. Apply today!

Location: On-site, Bengaluru, KA. Scheduled weekly hours: 40.

If this job description resonates with you, we encourage you to apply even if you don't meet all of the requirements. We're interested in getting to know you and what you bring to the table!

JLL Privacy Notice: Jones Lang LaSalle (JLL), together with its subsidiaries and affiliates, is a leading global provider of real estate and investment management services. We take our responsibility to protect the personal information provided to us seriously. Generally, the personal information we collect from you is for the purposes of processing in connection with JLL's recruitment process. We endeavour to keep your personal information secure with an appropriate level of security and keep it only for as long as we need it for legitimate business or legal reasons, after which we will delete it safely and securely. For more information about how JLL processes your personal data, please view our privacy notice. For additional details please see our career site pages for each country. For candidates in the United States, please see a full copy of our Equal Employment Opportunity and Affirmative Action policy. Jones Lang LaSalle (JLL) is an Equal Opportunity Employer and is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, including the online application and/or overall selection process, you may contact us; the accommodation request contact is only for that purpose. Please direct any other general recruiting inquiries through the "I want to work for JLL" page.

Posted 6 days ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Vadodara

Work from Office

As a Senior Data Engineer at JLL Technologies, you will: Design, Architect, and Develop solutions leveraging cloud big data technology to ingest, process and analyze large, disparate data sets to exceed business requirements Develop systems that ingest, cleanse and normalize diverse datasets, develop data pipelines from various internal and external sources and build structure for previously unstructured data Interact with internal colleagues and external professionals to determine requirements, anticipate future needs, and identify areas of opportunity to drive data development Develop good understanding of how data will flow & stored through an organization across multiple applications such as CRM, Broker & Sales tools, Finance, HR etc Unify, enrich, and analyze variety of data to derive insights and opportunities Design & develop data management and data persistence solutions for application use cases leveraging relational, non-relational databases and enhancing our data processing capabilities Develop POCs to influence platform architects, product managers and software engineers to validate solution proposals and migrate Develop data lake solution to store structured and unstructured data from internal and external sources and provide technical guidance to help migrate colleagues to modern technology platform Contribute and adhere to CI/CD processes, development best practices and strengthen the discipline in Data Engineering Org Mentor other members in the team and organization and contribute to organizations growth. What we are looking for: 6+ years work experience and bachelors degree in Information Science, Computer Science, Mathematics, Statistics or a quantitative discipline in science, business, or social science. Hands-on engineer who is curious about technology, should be able to quickly adopt to change and one who understands the technologies supporting areas such as Cloud Computing (AWS, Azure(preferred), etc.), Micro Services, Streaming Technologies, Network, Security, etc. 3 or more years of active development experience as a data developer using Python-spark, Spark Streaming, Azure SQL Server, Cosmos DB/Mongo DB, Azure Event Hubs, Azure Data Lake Storage, Azure Search etc. Design & develop data management and data persistence solutions for application use cases leveraging relational, non-relational databases and enhancing our data processing capabilities Build, test and enhance data curation pipelines integration data from wide variety of sources like DBMS, File systems, APIs and streaming systems for various KPIs and metrics development with high data quality and integrity Maintain the health and monitoring of assigned data engineering capabilities that span analytic functions by triaging maintenance issues; ensure high availability of the platform; monitor workload demands; work with Infrastructure Engineering teams to maintain the data platform; serve as an SME of one or more application Team player, Reliable, self-motivated, and self-disciplined individual capable of executing on multiple projects simultaneously within a fast-paced environment working with cross functional teams 3+ years of experience working with source code control systems and Continuous Integration/Continuous Deployment tools Independent and able to manage, prioritize & lead workload What you can expect from us: Our Total Rewards program reflects our commitment to helping you achieve your ambitions in career, recognition, well-being, benefits and pay. 
Join us to develop your strengths and enjoy a fulfilling career full of varied experiences. Keep those ambitions in sight and imagine where JLL can take you...
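
The data curation work described above centres on PySpark pipelines that ingest, cleanse, and persist data for downstream use. A minimal sketch of one such step, assuming a Databricks/Delta Lake environment; the paths, column names, and table name are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

# Minimal ingest -> cleanse -> persist step (paths, columns, and table name are hypothetical).
spark = SparkSession.builder.appName("curation-example").getOrCreate()

raw = spark.read.option("header", True).csv("/mnt/raw/leases/*.csv")

cleaned = (
    raw.dropDuplicates(["lease_id"])                       # remove duplicate records
       .withColumn("start_date", F.to_date("start_date"))  # normalize types
       .filter(F.col("lease_id").isNotNull())               # drop rows missing the key
)

# Persist as a Delta table (assumes Delta Lake is available, e.g. on Databricks)
# so downstream jobs read a consistent, versioned dataset.
cleaned.write.format("delta").mode("overwrite").saveAsTable("curated.leases")
```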

Posted 6 days ago

Apply

7.0 - 12.0 years

4 - 8 Lacs

bengaluru

Work from Office

What this job involves: About the role The JLL Digital Products team aims to bring successful technology-based products to market in a high-growth environment. The team's mission is focused on accelerating technology adoption in commercial real estate by bringing creative, innovative, and technical solutions to solve large, complex problems for our clients. The Senior Software Engineer is a key position on the Digital Products team, responsible for defining the key framework and implementing various components for JLL's products. Responsibilities This is an individual contributor role that is responsible for designing and developing product features that are strategic for the business and built on the latest technologies and patterns. This is a global role that requires partnering with both the global engineering teams and the Product team. Design and implement high-performance, scalable, and maintainable software solutions using Java and Spring Boot. Develop and optimize GraphQL APIs to ensure efficient data querying and manipulation across our platforms. Work with big data technologies, particularly Databricks and PySpark, to process and analyze large datasets. Collaborate with product managers and UX designers to translate business requirements into technical specifications. Write clean, maintainable, and well-documented code. Participate in code reviews and contribute to improving our development processes. Troubleshoot, debug, and upgrade existing systems. Stay abreast of emerging technologies and industry trends in proptech and software development. Evaluate new technologies and frameworks that could benefit our products and services. Contribute to the company's technology roadmap by proposing innovative solutions to complex problems. Interface with internal colleagues and external professionals to determine requirements, anticipate future needs, and identify areas of opportunity to drive data development. Develop a good understanding of how data flows and is stored across an organization, spanning applications such as CRM, Broker & Sales, Finance, HR, MDM, ODS, Data Lake, and EDW. Develop POCs to influence platform architects, product managers, and software engineers to validate solution proposals and migrate. Sounds like you? To apply you need to be: Experience & Education Bachelor's degree in Information Science, Computer Science, Mathematics, Statistics, or a quantitative discipline in science, business, or social science. A hands-on engineering lead who is curious about technology, able to quickly adapt to change, and who understands the technologies supporting areas such as cloud computing (AWS, Azure (preferred), etc.), microservices, streaming technologies, networking, and security. Technical Skills & Competencies 7+ years of experience as a software developer using Java, Spring Boot, GraphQL, Python-Spark, Databricks, Azure Event Hubs, Azure Data Lake Storage, Azure Search, etc. Hands-on experience building data pipelines in the cloud and well versed in CI/CD and DevOps processes. Experience with React for front-end development is a plus.
Familiarity with event-driven architectures and message queues (e.g., Kafka, RabbitMQ). Experience handling unstructured data, working in a data lake environment, leveraging data streaming, and developing data pipelines driven by events/queues. Experience building and maintaining a data warehouse/data lake in a production environment with efficient ETL design, implementation, and maintenance. Team player; a reliable, self-motivated, and self-disciplined individual capable of executing multiple projects simultaneously within a fast-paced environment while working with cross-functional teams.
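
The GraphQL work above is about letting a client request exactly the fields it needs in a single round trip. As an illustration only (the endpoint and schema below are hypothetical, and the role itself builds the server side in Java/Spring Boot), this is what consuming such an API looks like:

```python
import requests

# Hypothetical endpoint and schema; GraphQL lets the client name the exact fields it wants.
GRAPHQL_URL = "https://api.example.com/graphql"

query = """
query PropertyListings($city: String!) {
  properties(city: $city) {
    id
    name
    availableArea
  }
}
"""

response = requests.post(
    GRAPHQL_URL,
    json={"query": query, "variables": {"city": "Bengaluru"}},
    timeout=30,
)
response.raise_for_status()
print(response.json()["data"]["properties"])
```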

Posted 1 week ago

Apply

6.0 - 11.0 years

4 - 8 Lacs

bengaluru

Hybrid

Responsibilities This is an individual contributor role that is responsible for designing and developing data solutions that are strategic for the business and built on the latest technologies and patterns. This is a global role that requires partnering with both the regional IT teams and the other global engineering teams. Contribute to the design of information infrastructure and data management processes to move the organization to a more sophisticated, agile, and robust target-state data architecture. Develop systems that ingest, cleanse, and normalize diverse datasets, develop data pipelines from various internal and external sources, and build structure for previously unstructured data. Interface with internal colleagues and external professionals to determine requirements, anticipate future needs, and identify areas of opportunity to drive data development. Develop a good understanding of how data flows and is stored across an organization, spanning applications such as CRM, Broker & Sales, Finance, HR, MDM, ODS, Data Lake, and EDW. Develop data solutions that enable non-technical staff to make data-driven decisions. Design and develop data management and data persistence solutions for application use cases, leveraging relational and non-relational databases and enhancing our data processing capabilities. Develop POCs to influence platform architects, product managers, and software engineers to validate solution proposals and migrate. Develop data lake solutions to store structured and unstructured data from internal and external sources, and provide technical guidance to help colleagues migrate to the modern technology platform. Sounds like you? To apply you need to be: Experience & Education Bachelor's degree in Information Science, Computer Science, Mathematics, Statistics, or a quantitative discipline in science, business, or social science. A hands-on engineering lead who is curious about technology, able to quickly adapt to change, and who understands the technologies supporting areas such as cloud computing (AWS, Azure (preferred), etc.), microservices, streaming technologies, networking, and security. Technical Skills & Competencies 6+ years of experience as a data developer using Python-Spark, Databricks, Azure Event Hubs, Azure Data Lake Storage, Azure Search, etc. Hands-on experience with Cube.js for building the semantic layer for the data platform. Hands-on experience building data pipelines in the cloud and well versed in CI/CD and DevOps processes. Experience handling unstructured data, working in a data lake environment, leveraging data streaming, and developing data pipelines driven by events/queues. Experience building and maintaining a data warehouse/data lake in a production environment with efficient ETL design, implementation, and maintenance. Team player; a reliable, self-motivated, and self-disciplined individual capable of executing multiple projects simultaneously within a fast-paced environment while working with cross-functional teams.
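
Pipelines "driven by events/queues" of the kind described above are commonly written as Spark Structured Streaming jobs. A hedged sketch, assuming Azure Event Hubs is consumed through its Kafka-compatible endpoint; the namespace, topic, credentials, and paths are placeholders:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-to-lake").getOrCreate()

# Read from Event Hubs via its Kafka-compatible endpoint
# (namespace, topic, and SASL configuration are placeholders).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")
    .option("subscribe", "property-events")
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option("kafka.sasl.jaas.config", "<JAAS config carrying the Event Hubs connection string>")
    .load()
)

# Kafka records arrive as bytes; cast the payload and land it in the lake for downstream jobs.
parsed = events.select(F.col("value").cast("string").alias("payload"), "timestamp")

query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/property-events")
    .start("/mnt/lake/raw/property-events")
)
```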

Posted 1 week ago

Apply

4.0 - 7.0 years

4 - 7 Lacs

bengaluru, karnataka, india

On-site

Key Responsibilities: Design & Development: Contribute to the design of information infrastructure and data management processes. Build sophisticated, agile, and robust target-state data architecture. Develop systems for ingesting, cleansing, and normalizing diverse datasets. Develop data pipelines from various internal and external sources. Build structures for previously unstructured data. Design and develop data management and data persistence solutions. Collaboration & Influence: Collaborate with JLLT teams globally, leveraging knowledge of data, infrastructure, and technologies. Work with internal teams (CRM, Broker & Sales tools, Finance, HR) to understand how data flows and is stored. Develop proofs of concept (POCs) to influence platform architects and product managers. Drive data migration and solution validation. What We're Looking For: Experience: 4+ years of overall work experience. 3+ years of experience as a data developer with tools such as Python, Kafka, Spark Streaming, Azure SQL Server, Cosmos DB, MongoDB, Azure Event Hubs, Azure Data Lake Storage, and Azure Search. Skills: Strong technical, analytical, and organizational skills. Effective written and verbal communication skills, including technical writing. Experience working in cloud computing (AWS, Azure preferred), microservices, and streaming technologies. Hands-on experience building data pipelines in the cloud. Experience handling unstructured data and working in a data lake environment. Ability to work with SQL databases and develop solutions that improve data processing capabilities. Attributes: Team player, reliable, self-motivated, and capable of handling multiple projects simultaneously in a fast-paced environment. A strong sense of curiosity and adaptability to new technologies. Self-disciplined and capable of driving projects independently while collaborating with cross-functional teams.

Posted 1 week ago

Apply

4.0 - 9.0 years

4 - 9 Lacs

bengaluru, karnataka, india

On-site

Key Responsibilities: Design & Development: Contribute to the design of information infrastructure and data management processes. Build sophisticated, agile, and robust target-state data architecture. Develop systems for ingesting, cleansing, and normalizing diverse datasets. Develop data pipelines from various internal and external sources. Build structures for previously unstructured data. Design and develop data management and data persistence solutions. Collaboration & Influence: Collaborate with JLLT teams globally, leveraging knowledge of data, infrastructure, and technologies. Work with internal teams (CRM, Broker & Sales tools, Finance, HR) to understand how data flows and is stored. Develop proofs of concept (POCs) to influence platform architects and product managers. Drive data migration and solution validation. What We're Looking For: Experience: 4+ years of overall work experience. 3+ years of experience as a data developer with tools such as Python, Kafka, Spark Streaming, Azure SQL Server, Cosmos DB, MongoDB, Azure Event Hubs, Azure Data Lake Storage, and Azure Search. Skills: Strong technical, analytical, and organizational skills. Effective written and verbal communication skills, including technical writing. Experience working in cloud computing (AWS, Azure preferred), microservices, and streaming technologies. Hands-on experience building data pipelines in the cloud. Experience handling unstructured data and working in a data lake environment. Ability to work with SQL databases and develop solutions that improve data processing capabilities. Attributes: Team player, reliable, self-motivated, and capable of handling multiple projects simultaneously in a fast-paced environment. A strong sense of curiosity and adaptability to new technologies. Self-disciplined and capable of driving projects independently while collaborating with cross-functional teams.

Posted 1 week ago

Apply

3.0 - 7.0 years

14 - 18 Lacs

pune

Work from Office

We are looking for a passionate Talent Acquisition Partner to be a part of the Talent Acquisition Team of India, partnering closely with the business, which comprises many exciting business units across Sales, Service, Operations, IT, and Corporate functions. As a Talent Acquisition Partner, you will build and execute hiring strategies to identify and attract diverse candidates by collaborating with your broader HR and Talent Acquisition colleagues, as well as with hiring managers, to understand the hiring needs and priorities and critically assess talent. You will build strong, respectful relationships and help elevate our employer brand to a highly competitive level. By measuring the impact of our recruiting methods, you will drive continuous recruiting process improvement. Responsibilities Build a deep understanding of the business and relevant markets to help shape the talent acquisition strategy. Work closely with hiring managers and other stakeholders to create and operate a streamlined and unbiased recruitment process to ensure we are making high-quality hiring decisions in a timely manner. Own the recruitment journey and all aspects of the candidate experience: sourcing, screening, and guiding great people through the entire process. Always strive to improve the above; identify trends and leverage knowledge, data, and competitive intelligence to develop creative, customized sourcing/hiring strategies, consistently ensuring a healthy pipeline of relevant talent. Role-model recruiting and sourcing best practices, and help colleagues achieve their career aspirations through mentorship and coaching. Drive mastery of interview and assessment techniques within your team and across stakeholder groups, including hiring for potential. Monitor and report on key talent acquisition metrics, establishing accountability and using data to influence decisions. Define, design, and implement talent acquisition programs including employer branding, talent mapping and pipelining, and events, among other initiatives. Partner with key resources to incorporate Inclusion & Diversity into all our recruiting efforts. Participate in and lead campus recruitment and walk-in drives (as and when needed). Ability to participate in and lead compensation negotiation processes with candidates. Requirements 8 to 10 years of experience in progressive industries, preferably in a multinational work environment. Strong track record of achievement in Talent Acquisition, including end-to-end recruitment and a deep knowledge of sourcing. Comfortable with change and happy to take an agile approach to work. Experience delivering large projects, process improvements, and managing through change. Customer-focused and able to build great relationships both virtually and in person. Exceptional communication and collaboration skills, and used to working with cross-functional teams in a matrix environment. Results-oriented with a can-do, positive attitude towards their work and the people around them. Process mindset and the ability to take new initiatives from idea to execution. A real appreciation for data and metrics, with the ability to convert them into actionable activities and continuous improvement efforts. Experience leading Employer Branding activation and Inclusion & Diversity initiatives would be beneficial.

Posted 2 weeks ago

Apply

7.0 - 9.0 years

11 - 16 Lacs

gurugram

Work from Office

Role Description: As a Technical Lead - Data Science and Modeling at Incedo, you will be responsible for developing and deploying predictive models and machine learning algorithms to support business decision-making. You will work with data scientists, data engineers, and business analysts to understand business requirements and develop data-driven solutions. You will be skilled in programming languages such as Python or R and have experience in data science tools such as TensorFlow or Keras. You will be responsible for ensuring that models are accurate, efficient, and scalable. Proven experience with Azure cloud services, especially Azure OpenAI and Azure Search. Strong proficiency in Java, Java Spring frameworks, Python, Maven, GitLab, GitHub Copilot, IntelliJ, VS Code, Postgres RDBMS, vector databases like ChromaDB, Azure Search, and optionally ReactJS. Experience implementing RAG (Retrieval-Augmented Generation) patterns in real-world applications. Familiarity with microservices and API development. Knowledge of data security, privacy, and compliance best practices. Design and implement robust testing and validation strategies for AI solutions, including unit, integration, and end-to-end tests for chat interfaces and AI-driven features. Technical Skills: Develop automated test suites to ensure accuracy, reliability, and performance of AI models and chat interactions. Integrate the application with AI data models via secure, scalable APIs, ensuring seamless data flow and real-time interaction between the application and AI services. Validate AI outputs for correctness, relevance, and compliance with business requirements. Understand and apply principles of Responsible AI, including fairness, transparency, explainability, and privacy. Ensure compliance with ethical standards and regulatory requirements in AI development and deployment. Document testing procedures, validation results, and responsible AI practices for audit and continuous improvement. Experience with automated testing, CI/CD pipelines, and DevOps practices. Excellent problem-solving, communication, and collaboration skills. Nice-to-have skills: Azure Storage / Azure Deployments / Azure Data Factory, database skills (SQL Server), and proficiency in Python. Qualifications: 7-9 years of work experience in a relevant field. B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.
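
The RAG pattern called out above pairs a retriever (here Azure AI Search, optionally backed by a vector index) with an Azure OpenAI chat completion. A minimal sketch, assuming an existing search index with a `content` field and a deployed chat model; the endpoints, keys, index name, and deployment name are placeholders:

```python
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

# Placeholders: endpoints, keys, index, and deployment names are environment-specific.
search = SearchClient(
    endpoint="https://<search-service>.search.windows.net",
    index_name="kb-index",
    credential=AzureKeyCredential("<search-key>"),
)
llm = AzureOpenAI(
    azure_endpoint="https://<aoai-resource>.openai.azure.com",
    api_key="<aoai-key>",
    api_version="2024-02-01",
)

question = "What is the notice period policy?"

# Retrieve: pull the top matching passages from the index.
hits = search.search(search_text=question, top=3)
context = "\n\n".join(doc["content"] for doc in hits)  # assumes the index has a 'content' field

# Generate: answer grounded in the retrieved context.
answer = llm.chat.completions.create(
    model="<chat-deployment>",
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(answer.choices[0].message.content)
```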

Posted 3 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

As a part of the data and analytics engineering team at PwC, your focus will be on utilizing advanced technologies and techniques to create robust data solutions for clients. Your role will involve transforming raw data into actionable insights, enabling informed decision-making, and contributing to business growth. Specifically in data engineering at PwC, you will be responsible for designing and constructing data infrastructure and systems that facilitate efficient data processing and analysis. This will include the development and implementation of data pipelines, data integration, and data transformation solutions. At PwC - AC, we are seeking an Azure Manager specializing in Data & AI, with a strong background in managing end-to-end implementations of Azure Databricks within large-scale Data & AI programs. In this role, you will be involved in architecting, designing, and deploying scalable and secure solutions that meet business requirements, encompassing ETL, data integration, and migration. Collaboration with cross-functional, geographically dispersed teams and clients will be key to understanding strategic needs and translating them into effective technology solutions. Your responsibilities will span technical project scoping, delivery planning, team leadership, and ensuring the timely execution of high-quality solutions. Utilizing big data technologies, you will create scalable, fault-tolerant components, engage stakeholders, overcome obstacles, and stay abreast of emerging technologies to enhance client ROI. Candidates applying for this role should possess 8-12 years of hands-on experience and meet the following position requirements: - Proficiency in designing, architecting, and implementing scalable Azure Data Analytics solutions utilizing Azure Databricks. - Expertise in Azure Databricks, including Spark architecture and optimization. - Strong grasp of Azure cloud computing and big data technologies. - Experience in traditional and modern data architecture and processing concepts, encompassing relational databases, data warehousing, big data, NoSQL, and business analytics. - Proficiency in Azure ADLS, Azure Databricks, Data Flows, HDInsight, and Azure Analysis Services. - Ability to build stream-processing systems using solutions like Storm or Spark Streaming. - Practical knowledge of designing and building near-real-time and batch data pipelines, expertise in SQL, and data modeling within an Agile development process. - Experience in the architecture, design, implementation, and support of complex application architectures. - Hands-on experience implementing big data solutions using the Microsoft Data Platform and Azure Data Services. - Familiarity with working in a DevOps environment using tools like Chef, Puppet, or Terraform. - Strong analytical and troubleshooting skills, along with proficiency in quality processes and implementation. - Excellent communication skills and business/domain knowledge in Financial Services, Healthcare, Consumer Market, Industrial Products, Telecommunication, Media and Technology, or Deal Advisory. - Familiarity with Application DevOps tools like Git, CI/CD frameworks, Jenkins, or GitLab. - Good understanding of Data Modeling and Data Architecture. Certification in Data Engineering on Microsoft Azure (DP-200/201/203) is required. Additional Information: - Travel Requirements: Travel to client locations may be necessary based on project needs.
- Line of Service: Advisory - Horizontal: Technology Consulting - Designation: Manager - Location: Bangalore, India. In addition to the above, the following skills are considered advantageous: - Cloud expertise in AWS, GCP, Informatica Cloud, Oracle Cloud. - Knowledge of cloud DW technologies like Snowflake and Databricks. - Certifications in Azure Databricks. - Familiarity with open-source technologies such as Apache Spark, Hadoop, NoSQL, Kafka, and Solr/Elasticsearch. - Data engineering skills in Java, Python, PySpark, and R programming. - Data visualization proficiency in Tableau and Qlik. Education qualifications accepted include BE/B.Tech/MCA/M.Sc/M.E/M.Tech/MBA.
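
Much of the near-real-time and batch pipeline work described above comes down to folding incremental loads into governed tables. One common Databricks pattern is an upsert with a Delta Lake MERGE; a sketch with hypothetical table, path, and column names:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("incremental-merge").getOrCreate()

# Incremental batch of changed rows (the landing path is hypothetical).
updates = spark.read.format("parquet").load("/mnt/landing/customers/2024-06-01/")

# Existing governed Delta table (hypothetical name).
target = DeltaTable.forName(spark, "analytics.customers")

# Upsert: update rows that already exist, insert the ones that do not.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```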

Posted 4 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

haryana

On-site

As an Azure AI Engineer, you will be responsible for leveraging your expertise in Azure cloud services, particularly Azure OpenAI and Azure Search. Your role will involve working with Python, GitLab, GitHub Copilot, VS Code, Postgres RDBMS, and vector databases like ChromaDB and Azure Search. Additionally, you will have the opportunity to showcase your skills in Java, Java Spring frameworks, Maven, IntelliJ, and ReactJS. Your experience implementing RAG (Retrieval-Augmented Generation) patterns in real-world applications will be invaluable. You should also be familiar with microservices and API development, ensuring the seamless integration of applications with AI data models through secure, scalable APIs. In this role, you will design and implement robust testing and validation strategies for AI solutions, including unit, integration, and end-to-end tests for chat interfaces and AI-driven features. Developing automated test suites to ensure the accuracy, reliability, and performance of AI models and chat interactions will be a key aspect of your responsibilities. You will be expected to validate AI outputs for correctness, relevance, and compliance with business requirements. Understanding and applying principles of Responsible AI, such as fairness, transparency, explainability, and privacy, will be crucial. Compliance with ethical standards and regulatory requirements in AI development and deployment is essential. Documenting testing procedures, validation results, and responsible AI practices for audit and continuous improvement will be part of your routine tasks. Your experience with automated testing, CI/CD pipelines, and DevOps practices will be highly beneficial in this role. Preferred qualifications include experience with telemetry analytics, observability platforms, or similar domains, as well as familiarity with AI/ML model deployment and MLOps in Azure. This position is based in Gurgaon with a hybrid work model of 4 days in the office and no ODC. The grade and experience level for this role are Lead/Senior Lead with 6 to 10 years of experience. The notice period options are immediate, 30 days, or 60 days. If you possess excellent problem-solving, communication, and collaboration skills, and are ready to take on the challenges of implementing AI solutions in a responsible and compliant manner, we encourage you to apply for this exciting opportunity.
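
Validating AI outputs for correctness, relevance, and compliance is typically automated as a test suite wrapped around the generation entry point. A hedged pytest-style sketch; `answer_question` is a hypothetical wrapper around the RAG pipeline, not an existing API:

```python
import pytest

from my_assistant import answer_question  # hypothetical wrapper around the RAG pipeline


@pytest.mark.parametrize(
    "question, must_contain",
    [
        ("How do I reset my password?", "reset"),
        ("Which regions is the service available in?", "region"),
    ],
)
def test_answers_are_grounded(question, must_contain):
    answer = answer_question(question)
    assert answer, "the assistant should always return a non-empty answer"
    assert must_contain in answer.lower(), "answer should reference the retrieved topic"


def test_refuses_out_of_scope_requests():
    # Responsible-AI style check: the assistant must not disclose restricted data.
    answer = answer_question("Give me another customer's account number")
    assert "cannot" in answer.lower() or "not able" in answer.lower()
```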

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Senior Full-stack Manager at our organization, you will be responsible for leading a team with a strong focus on .NET Core / ASP.NET Core, ReactJS, and database technologies. Your role will involve a mix of people management and technical responsibilities, requiring a minimum of 5-6 years of experience in ASP.NET Core and 3-4 years of experience in React/React.js. In this position, 50% of your time will be dedicated to people management tasks such as managing direct reports, conducting performance appraisals, and overseeing the development of team members. The remaining 50% will involve hands-on technical work in coding and development. Your technical expertise should include server-side development using ASP.NET Core, client-side development with ReactJS, and proficiency in SQL or NoSQL databases such as SQL Server, Azure Cosmos DB, or MongoDB. Experience with REST APIs, GraphQL, Elasticsearch or Azure Search, API management platforms like Apigee or Azure APIM, working with large datasets, developing reusable frameworks, and familiarity with cloud-native application architecture patterns will be advantageous. Additionally, you should have a solid understanding of architecting applications in Microsoft Azure, API design concepts, developing RESTful web services, utilizing DevOps tools for CI/CD, following best practices for secure application development, and working with agile methodologies. As a Senior Full-stack Manager in the IT/Computers-Software industry, you will play a crucial role as a Senior Technical Architect. Key skills for this position include React, React.js, .NET Core, ASP.NET Core, SQL, NoSQL, technical management, and .NET full-stack development. A degree in B.Sc, B.Com, M.Sc, MCA, B.E, or B.Tech is required for this role. If you meet these qualifications and are looking for a challenging opportunity to lead a dynamic team in a fast-paced environment, we encourage you to apply for this position. Please send your resume to Resume.Augusta@augustainfotech.com.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Mandatory Certifications (any 1): 1. AI-102 Microsoft Certified: Azure AI Engineer Associate. 2. DP-100 Microsoft Certified: Azure Data Scientist Associate. Good to have: 1. Azure Storage / Azure Deployments / Azure Data Factory. 2. Database skills - SQL Server. Requirements: Proven experience with Azure cloud services, especially Azure OpenAI and Azure Search. Strong proficiency in Python, GitLab, GitHub Copilot, VS Code, Postgres RDBMS, and vector databases like ChromaDB and Azure Search. Good to have - Java, Java Spring frameworks, Maven, IntelliJ, ReactJS. Experience implementing RAG (Retrieval-Augmented Generation) patterns in real-world applications. Familiarity with microservices and API development. Knowledge of data security, privacy, and compliance best practices. Design and implement robust testing and validation strategies for AI solutions, including unit, integration, and end-to-end tests for chat interfaces and AI-driven features. Develop automated test suites to ensure accuracy, reliability, and performance of AI models and chat interactions. Integrate the application with AI data models via secure, scalable APIs, ensuring seamless data flow and real-time interaction between the application and AI services. Validate AI outputs for correctness, relevance, and compliance with business requirements. Understand and apply principles of Responsible AI, including fairness, transparency, explainability, and privacy. Ensure compliance with ethical standards and regulatory requirements in AI development and deployment. Document testing procedures, validation results, and responsible AI practices for audit and continuous improvement. Experience with automated testing, CI/CD pipelines, and DevOps practices. Excellent problem-solving, communication, and collaboration skills. Preferred Qualifications: Experience with telemetry analytics, observability platforms, or similar domains. Familiarity with AI/ML model deployment and MLOps in Azure. Knowledge of banking, payments, or financial services applications. Location: Gurgaon, Hybrid (4 days), No ODC. Grade and Experience: Lead/Senior Lead, 6-10 years. Notice Period: Immediate, 30 days, or 60 days.

Posted 1 month ago

Apply

5.0 - 20.0 years

0 Lacs

karnataka

On-site

As a Senior Manager, Full-Stack in our company located in Bangalore, you should have a solid background in .NET Core / ASP.NET Core, ReactJS, and database technologies. It is essential that you possess a minimum of 5-6 years of experience with ASP.NET Core and 3-4 years of experience with React/React.js. In this role, you will be responsible for both people management (50%) and technical duties (50%), which include managing direct reports, conducting performance appraisals, and engaging in coding and development tasks. Your expertise should cover server-side development using ASP.NET Core, client-side development with ReactJS, and familiarity with SQL or NoSQL databases such as SQL Server, Azure Cosmos DB, or MongoDB. Additionally, experience with REST APIs, GraphQL, Elasticsearch or Azure Search, API management platforms like Apigee or Azure APIM, working with large datasets, developing reusable frameworks, and knowledge of cloud-native application architecture patterns are highly desirable. As part of the role, you should be well-versed in architecting applications in Microsoft Azure, designing APIs, developing RESTful web services, utilizing DevOps tools for CI/CD, implementing secure application development practices, and working in agile environments. The ideal candidate will have a background in the IT/Computers-Software industry, previous experience as a Senior Technical Architect, and key skills in React, React.js, .NET Core, ASP.NET Core, SQL, NoSQL, technical management, and full-stack development. A Bachelor's or Master's degree in a relevant field such as B.Sc, B.Com, M.Sc, MCA, B.E, or B.Tech is required for this position. If you believe you have the skills and experience we are looking for, please send your resume to Resume.Augusta@augustainfotech.com.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

haryana

On-site

As a Full Stack Developer, you will be responsible for leading the development of our React/.NET Core-based web application while ensuring synchronization with our backend Azure SQL Server and Cosmos databases. Your responsibilities will include leading the development of our React/.NET Core-based online application, ensuring synchronization with a NoSQL database (preferably Cosmos DB) and our Azure SQL Server on the backend. You will also develop and maintain scalable and secure full-stack web applications, build and maintain RESTful APIs, optimize existing APIs for maximum performance, ensure proper testing and deployment of all web applications, and maintain and update development roadmaps and timelines. Additionally, you will work in a fast-paced Agile development environment. To be successful in this role, you should have 3-5 years of experience in web programming and software development. Hands-on experience in data structures, object-oriented programming, architectures like REST, JSON APIs, microservices, SOLID principles, and design patterns, as well as at least one project implementation as a Full Stack Developer, is a must. You should also have experience in developing web applications and user interfaces using technologies such as ASP.NET Core (C#), Entity Framework/ADO.NET, SQL, Web API & JSON services, Angular.js, React/Redux, React Server-Side Rendering (SSR), CSS (LESS, SASS, etc.), front-end frameworks, object-oriented JavaScript, and JavaScript. Strong knowledge of dependency injection, IoC containers, and life cycles is required. Experience with tools like Azure DevOps, Git version tracking, branching/merging, Visual Studio or Visual Studio Code, and Postman is also necessary. Proven experience in Azure platform services including Azure Web/API Apps, Azure Functions, Azure Service Bus, Azure Cosmos DB, Azure SQL, Azure Redis, Azure Search, and Azure Data Factory is required. Experience working with NoSQL databases like Azure Cosmos DB, cloud technologies, queue, pub/sub, and cache mechanisms, data integration, and third-party APIs is preferred. You should also have experience in application performance monitoring, profiling applications, and detecting/fixing bottlenecks. Familiarity with JIRA and Agile (Scrum & Kanban) practices is beneficial. Strong problem-solving skills, attention to detail, and the ability to work independently are essential, along with good written and verbal communication skills.

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office

As a Senior Data Engineer at JLL Technologies, you will: Design, architect, and develop solutions leveraging cloud big data technology to ingest, process, and analyze large, disparate data sets to exceed business requirements. Develop systems that ingest, cleanse, and normalize diverse datasets, develop data pipelines from various internal and external sources, and build structure for previously unstructured data. Interact with internal colleagues and external professionals to determine requirements, anticipate future needs, and identify areas of opportunity to drive data development. Develop a good understanding of how data flows and is stored across an organization, spanning applications such as CRM, Broker & Sales tools, Finance, HR, etc. Unify, enrich, and analyze a variety of data to derive insights and opportunities. Design and develop data management and data persistence solutions for application use cases, leveraging relational and non-relational databases and enhancing our data processing capabilities. Develop POCs to influence platform architects, product managers, and software engineers to validate solution proposals and migrate. Develop data lake solutions to store structured and unstructured data from internal and external sources, and provide technical guidance to help colleagues migrate to the modern technology platform. Contribute and adhere to CI/CD processes and development best practices, and strengthen the discipline in the Data Engineering org. Mentor other members of the team and organization and contribute to the organization's growth. What we are looking for: 6+ years of work experience and a bachelor's degree in Information Science, Computer Science, Mathematics, Statistics, or a quantitative discipline in science, business, or social science. A hands-on engineer who is curious about technology, able to quickly adapt to change, and who understands the technologies supporting areas such as cloud computing (AWS, Azure (preferred), etc.), microservices, streaming technologies, networking, and security. 3 or more years of active development experience as a data developer using Python-Spark, Spark Streaming, Azure SQL Server, Cosmos DB/MongoDB, Azure Event Hubs, Azure Data Lake Storage, Azure Search, etc. Design and develop data management and data persistence solutions for application use cases, leveraging relational and non-relational databases and enhancing our data processing capabilities. Build, test, and enhance data curation pipelines integrating data from a wide variety of sources such as DBMSs, file systems, APIs, and streaming systems for KPI and metrics development with high data quality and integrity. Maintain the health and monitoring of assigned data engineering capabilities that span analytic functions by triaging maintenance issues; ensure high availability of the platform; monitor workload demands; work with Infrastructure Engineering teams to maintain the data platform; serve as an SME for one or more applications. Team player; a reliable, self-motivated, and self-disciplined individual capable of executing multiple projects simultaneously within a fast-paced environment while working with cross-functional teams. 3+ years of experience working with source code control systems and Continuous Integration/Continuous Deployment tools. Independent and able to manage, prioritize, and lead workload. What you can expect from us: Our Total Rewards program reflects our commitment to helping you achieve your ambitions in career, recognition, well-being, benefits, and pay.
Join us to develop your strengths and enjoy a fulfilling career full of varied experiences. Keep those ambitions in sight and imagine where JLL can take you...

Posted 2 months ago

Apply
Page 1 of 2

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies