
3315 Big Data Jobs - Page 40

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

About The Role

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Informatica Data Quality
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with team members to improve data workflows and contribute to the overall efficiency of the organization's data management practices.

Roles & Responsibilities:
- Work independently and grow into a subject-matter expert (SME).
- Participate actively in team discussions.
- Contribute solutions to work-related problems.
- Assist in the design and implementation of data architecture to support data initiatives.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-have: proficiency in Informatica Data Quality.
- Strong understanding of data integration techniques and ETL processes.
- Experience with data-quality assessment and improvement methodologies.
- Familiarity with data governance principles and best practices.
- Ability to work with large datasets and perform data cleansing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica Data Quality.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
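The data-quality duties the listing describes (assessing records, cleansing, and rejecting bad rows before load) can be sketched in plain Python. This is an illustrative stand-in, not Informatica Data Quality itself, and the field names and rules are hypothetical:

```python
import re

# Hypothetical per-field validation rules, applied before loading downstream.
RULES = {
    "email": lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
    "age": lambda v: isinstance(v, int) and 0 < v < 120,
}

def cleanse(records):
    """Split records into (valid, rejected); rejected rows carry the failed fields."""
    valid, rejected = [], []
    for rec in records:
        failures = [f for f, ok in RULES.items() if not ok(rec.get(f))]
        (rejected if failures else valid).append((rec, failures))
    return [r for r, _ in valid], rejected

good, bad = cleanse([
    {"email": "a@example.com", "age": 31},
    {"email": "not-an-email", "age": 200},
])
```

Real data-quality platforms add profiling, standardization, and lineage on top of checks like these, but the accept/reject split is the core pattern.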

Posted 3 weeks ago

Apply

2.0 - 7.0 years

20 - 25 Lacs

Bengaluru

Work from Office

The Group You'll Be A Part Of: The Global Operations Group brings information systems, facilities, supply chain, logistics, and high-volume manufacturing together to drive the engine of our global business operations. We help Lam deliver industry-leading solutions with speed and efficiency, while actively supporting the resilient and profitable growth of Lam's business.

The Impact You'll Make: Join Lam as a Data Scientist, where you'll design, develop, and program methods to analyze unstructured and diverse big data into actionable insights. You'll develop algorithms and automated processes to evaluate large data sets from disparate sources. Your expertise in generating, interpreting, and communicating actionable insights enables Lam to make informed, data-driven decisions.

Who We're Looking For: Typically requires a Bachelor's degree and a minimum of 2 years of related experience; or an advanced degree without experience; or equivalent work experience.

Our Commitment: We believe it is important for every person to feel valued, included, and empowered to achieve their full potential. By bringing unique individuals and viewpoints together, we achieve extraordinary results. Lam Research ("Lam" or the "Company") is an equal opportunity employer. Lam is committed to, and reaffirms its support of, equal opportunity in employment and non-discrimination in employment policies, practices, and procedures on the basis of race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, marital status, sex (including pregnancy, childbirth, and related medical conditions), gender, gender identity, gender expression, age, sexual orientation, or military and veteran status, or any other category protected by applicable federal, state, or local laws. It is the Company's intention to comply with all applicable laws and regulations. Company policy prohibits unlawful discrimination against applicants or employees.

Work Location: Lam offers a variety of work-location models based on the needs of each role. Our hybrid roles combine the benefits of on-site collaboration with colleagues and the flexibility to work remotely, and fall into two categories: On-site Flex and Virtual Flex. On-site Flex: you'll work 3+ days per week on-site at a Lam or customer/supplier location, with the opportunity to work remotely for the balance of the week. Virtual Flex: you'll work 1-2 days per week on-site at a Lam or customer/supplier location, and remotely the rest of the time.

Posted 3 weeks ago

Apply

1.0 - 3.0 years

3 - 6 Lacs

Mumbai, Mangaluru

Hybrid

Requirements:
- 6 months to 3 years of IT experience.
- Knowledge of BigQuery, SQL, or similar tools.
- Awareness of ETL and data-warehouse concepts.
- Good oral and written communication skills.
- Great team player, able to work efficiently with minimal supervision.
- Good knowledge of Java or Python for data cleansing.

Preferred:
- Good communication and problem-solving skills.
- Experience with Spring Boot is an added advantage.
- Apache Beam development experience with Google Cloud Bigtable and Google BigQuery is desirable.
- Experience with Google Cloud Platform (GCP).
- Skill in writing batch and stream-processing jobs using the Apache Beam framework (Dataflow).
- Knowledge of microservices, Pub/Sub, Cloud Run, and Cloud Functions.

Roles and Responsibilities:
- Develop high-performance, scalable solutions on GCP that extract, transform, and load big data.
- Design and build production-grade data solutions, from ingestion to consumption, using Java or Python.
- Design and optimize data models on GCP using data stores such as BigQuery.
- Optimize data pipelines for performance and cost in large-scale data lakes.
- Write complex, highly optimized queries across large data sets and create data-processing layers.
- Interact closely with data engineers to identify the right tools for product features by performing POCs.
- Collaborate with business stakeholders, BAs, and other data/ML engineers.
- Research new use cases for existing data.
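The ingestion-to-consumption flow this role describes can be sketched as three composable stages in plain Python. In the job itself these stages would be Apache Beam transforms running on Dataflow with BigQuery as the sink; here the source and sink are simulated in memory, and all record fields are hypothetical:

```python
def extract(rows):
    # Stand-in for ingestion (e.g., files or Pub/Sub messages).
    yield from rows

def transform(rows):
    # Cleanse: drop malformed rows, normalize amounts on the rest.
    for row in rows:
        if "user_id" in row and row.get("amount") is not None:
            yield {"user_id": row["user_id"], "amount": round(float(row["amount"]), 2)}

def load(rows, table):
    # Stand-in for a warehouse sink such as a BigQuery table.
    table.extend(rows)
    return table

warehouse = []
load(transform(extract([
    {"user_id": "u1", "amount": "19.999"},
    {"amount": "3.5"},  # malformed: no user_id, dropped by transform
])), warehouse)
```

Chaining generators like this mirrors how a Beam pipeline composes PTransforms: each stage consumes the previous one lazily, so the same logic serves both batch and streaming inputs.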

Posted 4 weeks ago

Apply

8.0 - 12.0 years

6 - 10 Lacs

Gurugram, Bengaluru

Work from Office

Our teams are driven by the purpose of providing an exceptional travel experience for our customers. We have continuously stayed ahead of the curve by developing our technology and products to meet the ever-changing demands of the rapidly evolving travel ecosystem. We are currently solving these challenging problems: How do we leverage big data to provide a truly personalized experience to each of our users? How do we leverage AI to innovate our products and deliver a best-in-class end-to-end experience to all our users? How do we bring the next 100 million users to our platform? If this excites you, join us for a rewarding, fulfilling, and enriching career.

What you'll be doing:
- Accurately estimate and implement machine learning models.
- Build pipelines for feature engineering by integrating data from different data sources.
- Develop and deploy production data pipelines (batch and real-time).
- Architect and design scalable data services consumed by millions of users.
- Performance-engineer big data systems.

What you'll bring to the table:
- Experience with distributed-systems software development.
- Demonstrated production experience in big data infrastructure and data modeling.
- Experience with performance optimization for both data loading and data retrieval.
- Know-how of Spark/Kafka.
- Knowledge of AWS deployment, Docker, and Kubernetes.

Posted 4 weeks ago

Apply

8.0 - 10.0 years

22 - 27 Lacs

Bengaluru

Work from Office

Role: Data Architect
Location: Remote
Experience: 8-10+ years
Employment Type: Full-Time, Permanent

We are seeking a highly experienced and skilled Data Architect to join our dynamic team. In this pivotal role, you will work directly with our enterprise customers, leveraging your expertise in the Adobe Experience Platform (AEP) to design and implement robust data solutions. The ideal candidate will possess a deep understanding of data modeling, SQL, ETL processes, and customer data architecture, with a proven track record of delivering impactful results in large-scale environments.

About the Role: As a Data Architect specializing in AEP, you will be instrumental in transforming complex customer data into actionable insights. You will serve as a technical expert, guiding clients through the intricacies of data integration, standardization, and activation within the Adobe ecosystem. This is an exciting opportunity to lead projects, innovate solutions, and make a significant impact on our clients' data strategies.

Key Responsibilities:
- Interface directly with Adobe enterprise customers to gather requirements, design tailored data solutions, and provide expert architectural recommendations.
- Collaborate with both onshore and offshore engineering teams to ensure efficient, high-quality solution delivery.
- Produce comprehensive, clear technical specification documents for implementing data solutions.
- Design and structure sophisticated data models to enable advanced customer-level analytics and reporting.
- Develop and implement robust customer ID mapping processes to create a unified, holistic customer view across diverse data sources.
- Automate data movement, cleansing, and transformation using various scripting languages and tools.
- Lead client calls and proactively manage project deliverables, ensuring timelines and expectations are met.
- Continuously identify and propose innovative solutions to address complex customer data challenges and optimize data workflows.

Requirements:
- 10+ years of demonstrable experience in data transformation and ETL processes across large, complex datasets.
- 5+ years of hands-on experience in data modeling (relational, dimensional, and big data paradigms).
- Deep expertise with SQL/NoSQL databases and a strong understanding of data-warehouse concepts.
- Proficiency with industry-leading ETL tools (e.g., Informatica).
- Proven experience with reporting and business intelligence tools such as Tableau and Power BI.
- In-depth knowledge of customer-centric datasets (e.g., CRM, call center, marketing automation platforms, web analytics).
- Exceptional communication, time-management, and multitasking skills, with the ability to articulate complex technical concepts clearly to diverse audiences.
- Bachelor's or Master's degree in Computer Science, Data Science, or a closely related quantitative field.
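The "customer ID mapping" responsibility above — stitching CRM, web, and call-center identifiers into one unified customer view — is commonly implemented as identity resolution over linked ID pairs. A minimal union-find sketch (the identifiers are hypothetical, and this is not AEP's actual identity-graph API):

```python
def unify(pairs):
    """Cluster identifiers that are linked, directly or transitively."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving keeps trees shallow
            x = parent[x]
        return x

    for a, b in pairs:            # union: merge the two roots
        parent[find(a)] = find(b)

    groups = {}
    for x in list(parent):        # collect members under their final root
        groups.setdefault(find(x), set()).add(x)
    return list(groups.values())

# A CRM record and a web cookie share an email; a call-center ID links
# to the CRM record by phone lookup — all four collapse into one customer.
clusters = unify([("crm:42", "email:a@x.com"),
                  ("web:c9", "email:a@x.com"),
                  ("call:7", "crm:42")])
```

The transitive merge is the important property: `web:c9` and `call:7` never appear in the same pair, yet they end up in the same cluster because both link through `crm:42`/`email:a@x.com`.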

Posted 4 weeks ago

Apply

8.0 - 10.0 years

22 - 27 Lacs

Jaipur

Work from Office


Posted 4 weeks ago

Apply

8.0 - 10.0 years

13 - 17 Lacs

Kanpur

Remote


Posted 4 weeks ago

Apply

8.0 - 10.0 years

13 - 17 Lacs

Chennai

Remote


Posted 4 weeks ago

Apply

8.0 - 10.0 years

13 - 17 Lacs

Ludhiana

Remote


Posted 4 weeks ago

Apply

8.0 - 10.0 years

13 - 17 Lacs

Nashik

Remote


Posted 4 weeks ago

Apply

8.0 - 10.0 years

13 - 17 Lacs

Pune

Remote


Posted 4 weeks ago

Apply

8.0 - 10.0 years

13 - 17 Lacs

Thane

Remote


Posted 4 weeks ago

Apply

8.0 - 10.0 years

13 - 17 Lacs

Mumbai

Remote


Posted 4 weeks ago

Apply

8.0 - 10.0 years

13 - 17 Lacs

Ahmedabad

Remote


Posted 4 weeks ago

Apply

1.0 - 3.0 years

13 - 17 Lacs

Lucknow

Remote


Posted 4 weeks ago

Apply

1.0 - 3.0 years

13 - 17 Lacs

Visakhapatnam

Remote


Posted 4 weeks ago

Apply

2.0 - 5.0 years

13 - 17 Lacs

Hyderabad

Remote


Posted 4 weeks ago

Apply

3.0 - 8.0 years

0 - 3 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Need Immediate Joiners

Responsibilities:
- Define, design, develop, and test software components/applications using AWS (Databricks on AWS, AWS Glue, Amazon S3, AWS Lambda, Amazon Redshift, AWS Secrets Manager).
- Strong SQL skills, including data modeling and advanced SQL techniques.
- Experience handling structured and unstructured datasets.
- Experience implementing AWS Glue, Airflow, or another data orchestration tool using current technologies and techniques.
- Good exposure to application development; the candidate should work independently with minimal supervision.

Must Have:
- Hands-on experience with distributed computing frameworks such as Databricks and the Spark ecosystem (Spark Core, PySpark, Spark Streaming, Spark SQL).
- Willingness to work with product teams to best optimize product features/functions.
- Experience with batch workloads and real-time streaming at high data volume and frequency.
- Performance optimization of Spark workloads.
- Environment setup, user management, authentication, and cluster management on Databricks.
- Professional curiosity and the ability to ramp up on new technologies and tasks.
- Good understanding of SQL and a good grasp of relational and analytical database management theory and practice.

Key Skills:
- Python, SQL, and PySpark
- Big data ecosystem (Hadoop, Hive, Sqoop, HDFS, HBase)
- Spark ecosystem (Spark Core, Spark Streaming, Spark SQL) / Databricks
- AWS (AWS Glue, Databricks on AWS, Lambda, Amazon Redshift, Amazon S3, AWS Secrets Manager)
- Data modeling, ETL methodology
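To illustrate the kind of batch transform such a pipeline performs, here is a minimal sketch in plain Python: drop malformed rows, normalize types, and derive a partition key. In a real Glue/Databricks job this logic would be expressed as PySpark DataFrame operations; the field names (`order_id`, `amount`, `ts`) are assumptions for the example:

```python
from datetime import datetime

def clean_orders(rows):
    """Glue/Spark-style cleansing step sketched over plain dicts:
    discard rows that fail type checks, round amounts, and derive
    a date-based partition key (as used for S3 path layouts)."""
    out = []
    for r in rows:
        try:
            amount = float(r["amount"])
            ts = datetime.strptime(r["ts"], "%Y-%m-%d")
        except (KeyError, ValueError):
            continue  # in production, route to a quarantine location instead
        out.append({
            "order_id": r["order_id"],
            "amount": round(amount, 2),
            "dt": ts.strftime("%Y/%m/%d"),  # partition key, e.g. s3://bucket/dt=.../
        })
    return out
```

The same shape (parse, validate, derive partition columns) carries over directly to a PySpark `withColumn`/`filter` pipeline once data volumes require distributed execution.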

Posted 4 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

Position Summary

Drives the execution of multiple business plans and projects by identifying customer and operational needs; developing and communicating business plans and priorities; removing barriers and obstacles that impact performance; providing resources; identifying performance standards; measuring progress and adjusting performance accordingly; developing contingency plans; and demonstrating adaptability and supporting continuous learning.

Provides supervision and development opportunities for associates by selecting, training, and mentoring; assigning duties; building a team-based work environment; establishing performance expectations and conducting regular performance evaluations; providing recognition and rewards; coaching for success and improvement; and ensuring diversity awareness.

Promotes and supports company policies, procedures, mission, values, and standards of ethics and integrity by training and providing direction to others in their use and application; ensuring compliance with them; and utilizing and supporting the Open Door Policy.

Ensures business needs are being met by evaluating the ongoing effectiveness of current plans, programs, and initiatives; consulting with business partners, managers, co-workers, and other key stakeholders; soliciting, evaluating, and applying suggestions for improving efficiency and cost-effectiveness; and participating in and supporting community outreach events.

What you'll do

About the Team: Ever wondered what a convergence of online and offline advertising systems looks like? How we can bridge the gap between sponsored search, display, and video ad formats? How we can write our own ad servers that serve billions of requests in near real time? Our Advertising Technology team is building an end-to-end advertising platform that is key to Walmart's overall growth strategy. We use cutting-edge machine learning, data mining, and optimization algorithms to ingest, model, and analyze Walmart's proprietary online and in-store data, encompassing 95% of American households. Importantly, we build smart data systems that deliver relevant retail ads and experiences that connect our customers with the brands and products they love.

Your Opportunity: We are looking for a versatile principal data scientist with strong expertise in machine learning and deep learning, good software engineering skills, and significant exposure to building ML solutions (including Gen AI solutions) from scratch and leading data science engagements. With this role you will:
- As a seasoned SME in ML engineering, take the lead in scaling and deploying our most challenging data science solutions (including, but not limited to, Gen AI solutions) across a broad spectrum of the advertising domain.
- Influence the best practices we follow as we scale and deploy our solutions across a diverse set of products.
- Train and mentor our pool of data scientists in data science and ML engineering skills.
- Contribute to the tech org via patents, publications, and open-source contributions.

What You Will Do:
- Design large-scale AI/ML products/systems impacting millions of customers.
- Develop highly scalable, timely, performant, instrumented, and accurate data pipelines.
- Drive and ensure that MLOps practices are followed in solutions.
- Enable data governance practices and processes by being a passionate adopter and ambassador.
- Drive data pipeline efficiency, data quality, efficient feature engineering, and maintenance of different databases (vector DBs, graph DBs, feature stores, caching mechanisms).
- Lead and inspire a team of scientists and engineers solving AI/ML problems through R&D while pushing the state of the art.
- Lead the team to develop production-level code for AI/ML solutions using best practices to handle high-scale, low-latency requirements.
- Deploy batch and real-time ML solutions, model results consumption, and integration pipelines.
- Work with a customer-centric mindset to deliver high-quality, business-driven analytic solutions.
- Drive proactive optimization of code and deployments, improving efficiency, cost, and resource usage.
- Design model architecture, choose the optimal tech stack and models, integrate with the larger engineering ecosystem, and drive best practices for model integration, working closely with software engineering leaders.
- Consult with business stakeholders regarding algorithm-based recommendations and be a thought leader in deploying them to drive business actions.
- Partner closely with the senior managers and director of data science, engineering, and product counterparts to drive data science adoption in the domain.
- Collaborate with multiple stakeholders to drive innovation at scale.
- Build a strong external presence by publishing your team's work in top-tier AI/ML conferences and developing partnerships with academic institutions.
- Adhere to Walmart's policies, procedures, mission, values, and standards of ethics and integrity; adopt Walmart's quality standards and develop/recommend process standards and best practices across the retail industry.

What You Will Bring:
- Bachelor's with 13+ years, Master's with 12+ years, or Ph.D. with 10+ years of relevant experience; educational qualifications should be in engineering or data sciences.
- Strong experience applying state-of-the-art supervised and unsupervised machine learning algorithms to real-world problems.
- Experience architecting solutions with continuous integration and continuous delivery in mind.
- Strong experience in real-time ML solution deployment and with deployment patterns for distributed systems.
- Strong Python coding and package development skills.
- Experience with big data and analytics in general, leveraging technologies like Hadoop, Spark, and MapReduce.
- Ability to work in a big data ecosystem; expert in SQL/Hive/Spark.

About Walmart Global Tech: Imagine working in an environment where one line of code can make life easier for hundreds of millions of people and put a smile on their face. That's what we do at Walmart Global Tech. We're a team of 15,000+ software engineers, data scientists, and service professionals within Walmart, the world's largest retailer, delivering innovations that improve how our customers shop and empower our 2.3 million associates. To others, innovation looks like an app, service, or some code, but Walmart has always been about people. People are why we innovate, and people power our innovations. Being human-led is our true disruption.

Flexible, hybrid work: We use a hybrid way of working that is primarily in office, coupled with virtual when not onsite. Our campuses serve as a hub to enhance collaboration, bring us together for purpose, and deliver on business needs. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits: Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Equal Opportunity Employer: Walmart, Inc. is an Equal Opportunity Employer by choice. We believe we are best equipped to help our associates, customers, and the communities we serve live better when we really know them. That means understanding, respecting, and valuing diversity (unique styles, experiences, identities, ideas, and opinions) while being inclusive of all people.

Minimum Qualifications (one of the following):
- Option 1: Bachelor's degree in Statistics, Economics, Analytics, Mathematics, Computer Science, Information Technology, or a related field, and 5 years' experience in an analytics-related field.
- Option 2: Master's degree in one of the above fields and 3 years' experience in an analytics-related field.
- Option 3: 7 years' experience in an analytics or related field.

Preferred Qualifications: If none are listed, there are no preferred qualifications.

Primary Location: G, 1, 3, 4, 5 Floor, Building 11, SEZ, Cessna Business Park, Kadubeesanahalli Village, Varthur Hobli, India. R-1925040

Posted 4 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

Department: IT
Project Location(s): Bangalore, Karnataka
Job Type: Full Time
Education: Bachelor in Engineering / Technology

Senior DevOps Engineer
Experience: 8+ years

Responsibilities:
- Manage the successful outcomes of all deliverables within the resource and time constraints defined by the DevOps Manager and Delivery Lead.
- Participate in project stand-ups, effectively communicating status and impediments to progress.
- Attend iteration planning sessions and participate in sizing the testing effort required to complete stories, ensuring external dependencies are considered.
- Work closely and communicate effectively with the development team and product stakeholders to ensure project outcomes adhere to agreed quality standards.

Required skills and experience:
- Extensive cloud experience with AWS (VPC, EC2, Route 53, IAM, STS, RDS, API Gateway), with a strong emphasis on security and well-architected solutions.
- Sound grasp of security best practices in cloud architectures.
- Familiarity with configuration management (experience with Ansible, Puppet, or Chef is a bonus).
- Competence configuring and administering a Linux system, including knowledge of and experience with Bash.
- Solid coding/scripting skills in at least one major language, with a strong preference for Python, Go, or JavaScript.
- Strong experience with infrastructure-as-code tools (AWS CloudFormation preferred).
- Solid experience building and maintaining Docker images.
- Understanding of and hands-on experience with CI/CD tools such as AWS CodePipeline (preferred), GitHub Actions, TeamCity, Bamboo, GoCD, or Concourse CI.
- Solid understanding of TCP/IP, networking, and routing protocols.
- Experience implementing CDN/DDoS/WAF technologies.
- Strong understanding of and experience with identity and access management solutions.
- Experience in web and API development and architectures (event-driven, microservices, SOA).

Bonus points for:
- AWS Associate and/or Professional level certifications.
- Experience working in Agile software and product development teams.
- Experience with AWS's machine learning/AI suite, including SageMaker, Rekognition, Transcribe, etc.
- Strong skills in web and API development and architectures (event-driven, SOA, microservices).
- Experience with AWS CloudFront and WAF, or Akamai / Cloudflare.
- Experience with Docker clustering and management.
- Experience with AWS's big data and analytics services, such as Glue and Redshift.
- Cloud migration and delivery exposure.
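As a small illustration of the infrastructure-as-code and cloud-security skills listed above, the following generates a minimal CloudFormation template, as a Python dict, for a private, encrypted S3 bucket. The logical resource ID (`AppBucket`) and bucket name are placeholders; this is a sketch, not a production template:

```python
def s3_bucket_template(bucket_name):
    """Minimal CloudFormation template for an S3 bucket that blocks
    public access and enables default server-side encryption."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "AppBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {
                    "BucketName": bucket_name,
                    # Security best practice: block all public access paths.
                    "PublicAccessBlockConfiguration": {
                        "BlockPublicAcls": True,
                        "BlockPublicPolicy": True,
                        "IgnorePublicAcls": True,
                        "RestrictPublicBuckets": True,
                    },
                    # Encrypt objects at rest by default (SSE-S3).
                    "BucketEncryption": {
                        "ServerSideEncryptionConfiguration": [
                            {"ServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
                        ]
                    },
                },
            }
        },
    }
```

Serialized with `json.dumps`, the result can be deployed with `aws cloudformation deploy`, or the same shape can be managed by a CI/CD pipeline such as CodePipeline.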

Posted 4 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should be strong in data structures and algorithms and have experience working on a large-scale consumer product. Experience with distributed and microservice architectures is essential, along with a solid understanding of scale, performance, and memory-optimization fundamentals.

Requirements and skills:
- BS/MS/BTech/MTech degree in Computer Science, Engineering, or a related field.
- A minimum of 4-8 years of experience in Java/J2EE technologies.
- Experience designing open APIs and implementing OAuth2.
- Proficiency in Kafka, JMS, RabbitMQ, and AWS Elastic Queue.
- Hands-on production experience with Spring, Hibernate, Tomcat, Jetty, and Undertow.
- Familiarity with JUnit and Mockito for unit test cases, and with MySQL or another RDBMS.
- Proven experience in software development and Java development.
- Hands-on experience designing and developing applications using Java EE platforms.
- Knowledge of object-oriented analysis and design using common design patterns.

Preferred:
- Experience handling high-traffic applications.
- Familiarity with MongoDB, Redis, CouchDB, DynamoDB, and Riak.
- Experience with asynchronous programming (actor-model concurrency, RxJava, the Executor framework).
- Knowledge of Lucene, Elasticsearch, Solr, Jenkins, and Docker.
- Experience with other languages/technologies such as Scala, Node.js, or PHP.
- Experience with AWS, Google Cloud, or Azure for managing, monitoring, and hosting servers.
- Experience handling big data; knowledge of WebSocket and backend servers for WebSocket.

Posted 4 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Bhopal, Madhya Pradesh

On-site

You are invited to apply for the position of Assistant / Associate Professor in the Computer Science & Engineering Department at Sagar Institute of Science & Technology. The ideal candidate should hold a B.Tech, M.Tech, and Ph.D. in disciplines such as CSE, IT, AIML, Image Processing, Machine Learning, Neural Networks, Deep Learning, or Data Science from a reputable institute and a recognized university.

To excel in this role, you should have a strong command of programming languages including C, C++, Java, and SQL. You should also have expertise in subjects such as Basic Computer Engineering, Object-Oriented Programming, Data Structures, DBMS, Artificial Intelligence, Machine Learning, Neural Networks, Deep Learning, Software Engineering, Security & Privacy, Python, Big Data, Cloud Computing, and Natural Language Processing.

This position is based at the SISTec Gandhi Nagar Campus of Sagar Institute of Science & Technology, located opposite the International Airport in Bhopal, Madhya Pradesh - 462036. If you meet the above qualifications and are passionate about teaching in the field of Computer Science & Engineering, please send your updated CV to hrd@sistec.ac.in. For more information about career opportunities at SISTec, please visit our careers page at https://www.sistec.ac.in/careers.

Posted 4 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You are an experienced Python developer who will write and test scalable code, develop back-end components, and integrate user-facing elements in collaboration with front-end developers. You should have worked with Python frameworks such as Django and have good knowledge of databases such as MySQL and MongoDB, as well as Hadoop, big data, and Docker. Additionally, you should have a good understanding of API development in XML and JSON, experience with at least one cloud provider (Google Cloud or AWS), and experience with Google API integration, RESTful APIs, and Git. Familiarity with SOA (microservices, message buses, etc.) is important, along with the ability to come up with innovative ideas and transform requirements into scalable, smart code.

You will be responsible for unit design and coding; implementing and following standards and guidelines with coding best practices in mind; ensuring bug-free, timely delivery of allocated development tasks by working with developers and architects; conducting proper unit testing; and taking ownership of a product end to end.

Requirements:
- A degree in Computer Science or a related field.
- Critical thinking and problem-solving skills.
- The ability to contribute individually.
- Good time-management skills.
- Great interpersonal skills.
- 1-3 years of sound experience in the skill sets mentioned above.

This is a full-time, day-shift job that requires in-person work at the designated location.
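To illustrate the JSON API work this role involves, here is a minimal sketch of a payload validator in the style of a Django REST Framework serializer's `validate` step. The field names and rules are hypothetical, chosen only for the example:

```python
def validate_user_payload(payload):
    """Validate a JSON user payload, returning (cleaned_data, errors).
    Mirrors the cleaned-data/errors split a DRF serializer produces."""
    errors = {}
    cleaned = {}

    # Required, non-blank name; surrounding whitespace is stripped.
    name = payload.get("name", "").strip()
    if not name:
        errors["name"] = "required"
    else:
        cleaned["name"] = name

    # Very loose email check for illustration; real code would use a
    # proper validator (e.g. Django's EmailValidator).
    email = payload.get("email", "")
    if "@" not in email:
        errors["email"] = "invalid"
    else:
        cleaned["email"] = email.lower()

    return cleaned, errors
```

A view would return the errors dict with a 400 status when it is non-empty, and persist the cleaned data otherwise.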

Posted 4 weeks ago

Apply

6.0 - 14.0 years

0 Lacs

coimbatore, tamil nadu

On-site

You should have 6 to 14 years of solid experience in data modeling/ER modeling. As a candidate for this position, you should know relational databases and data architecture, including SQL. A bachelor's degree in computer science, data science, information technology, or a related field is preferred.

In this role, you will be expected to have a good understanding of ER modeling, big data, enterprise data, and physical data models. Experience with data modeling software such as SAP PowerDesigner, Microsoft Visio, or Erwin Data Modeler would be beneficial.

The job is located in Coimbatore, with a walk-in interview scheduled on 12th April. If you meet the requirements and are passionate about data modeling and ER modeling, we encourage you to apply for this exciting opportunity.
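The step from an ER model to a physical data model that this role centers on can be shown with a minimal sketch. The two-entity schema below (Customer has many Orders) is entirely hypothetical, using Python's stdlib sqlite3 so it runs anywhere:

```python
import sqlite3

# Hypothetical ER model realised as a physical model:
# one Customer row relates to many Orders rows via a foreign key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
    amount      REAL NOT NULL
);
""")
conn.execute("INSERT INTO customer VALUES (1, 'Asha')")
conn.execute("INSERT INTO orders VALUES (10, 1, 250.0)")
conn.execute("INSERT INTO orders VALUES (11, 1, 100.0)")

# The one-to-many relationship lets us aggregate a customer's orders.
total = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE customer_id = 1"
).fetchone()[0]
print(total)  # 350.0
```

Tools like PowerDesigner or Erwin generate DDL of this shape from a diagrammed ER model; the modeling decisions (keys, cardinality, constraints) are the same either way.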

Posted 4 weeks ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Pune

Hybrid

Software Engineer - Specialist

What you'll do
- Demonstrate a deep understanding of cloud-native, distributed microservice-based architectures.
- Deliver solutions for complex business problems through standard Software Development Life Cycle (SDLC) practices.
- Build strong relationships with both internal and external stakeholders, including product, business, and sales partners.
- Communicate clearly, simplifying complex problems while being able to dive deeper when needed.
- Lead strong technical teams that deliver complex software solutions that scale.
- Work across teams to integrate our systems with existing internal systems.
- Participate in a tight-knit, globally distributed engineering team.
- Provide deep troubleshooting skills, leading and resolving production and customer issues under pressure.
- Leverage strong experience in full-stack software development and public cloud platforms like GCP and AWS.
- Mentor, coach, and develop junior and senior software, quality, and reliability engineers.
- Ensure compliance with secure software development guidelines and best practices.
- Define, maintain, and report SLAs, SLOs, and SLIs meeting EFX engineering standards in partnership with the product, engineering, and architecture teams.
- Collaborate with architects, SRE leads, and other technical leadership on strategic technical direction, guidelines, and best practices.
- Drive up-to-date technical documentation, including support and end-user documentation and runbooks.
- Own implementation architecture decisions associated with product features/stories and refactoring work.
- Create and deliver technical presentations to internal and external technical and non-technical stakeholders, communicating with clarity and precision and presenting complex information in a concise, audience-appropriate format.

What experience you need
- Bachelor's degree or equivalent experience.
- 5+ years of software engineering experience.
- 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java and Spring Boot.
- 5+ years of experience with cloud technology: GCP, AWS, or Azure.
- 5+ years of experience designing and developing cloud-native solutions.
- 5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes.
- 5+ years of experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others.
- 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs.

What could set you apart
- A self-starter who identifies and responds to priority shifts with minimal supervision.
- Strong communication and presentation skills.
- Strong leadership qualities.
- Demonstrated problem-solving skills and the ability to resolve conflicts.
- Experience creating and maintaining product and software roadmaps.
- Experience working in a highly regulated environment.
- Experience on GCP with big data and distributed systems: Dataflow, Apache Beam, Pub/Sub, Bigtable, BigQuery, GCS.
- Experience with backend technologies such as Java/J2EE, Spring Boot, Golang, gRPC, SOA, and microservices.
- Source code control management systems (e.g., SVN/Git, GitHub, GitLab), build tools like Maven and Gradle, and CI/CD tools like Jenkins or GitLab.
- Agile environments (e.g., Scrum, XP).
- Relational databases (e.g., SQL Server, MySQL).
- Atlassian tooling (e.g., Jira, Confluence, and GitHub).
- Developing with a modern JDK (v1.7+).
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI.
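The big data processing requirement above boils down to the map -> group -> aggregate pattern that Beam/Dataflow formalise as ParDo, GroupByKey, and Combine. As a dependency-free sketch (plain Python standing in for the Beam SDK, with a made-up word-count input):

```python
from collections import defaultdict

def word_count(lines):
    """Sketch of the map -> group -> aggregate pattern behind Beam pipelines."""
    # Map (ParDo): emit one (word, 1) pair per token.
    pairs = [(w.lower(), 1) for line in lines for w in line.split()]
    # Group (GroupByKey): collect all values sharing a key.
    grouped = defaultdict(list)
    for word, n in pairs:
        grouped[word].append(n)
    # Aggregate (Combine): reduce each key's values to a single result.
    return {word: sum(ns) for word, ns in grouped.items()}

print(word_count(["big data big plans"]))  # {'big': 2, 'data': 1, 'plans': 1}
```

In a real Dataflow job the same three stages run distributed across workers, with the shuffle step handling the grouping at scale.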

Posted 4 weeks ago

Apply
