7.0 - 12.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Scala
Good-to-have skills: Java Enterprise Edition, Java Full Stack Development, .NET Full Stack Development
Minimum 2 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements in Mumbai. You will collaborate with teams to ensure successful project delivery and contribute to key decisions.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the development and implementation of scalable applications
- Conduct code reviews and provide technical guidance to team members
- Stay updated on industry trends and technologies to enhance application development

Professional & Technical Skills:
- Must-have skills: Proficiency in Scala
- Good-to-have skills: Experience with Java Enterprise Edition
- Strong understanding of software development principles
- Hands-on experience building and optimizing applications
- Knowledge of database management systems
- Familiarity with agile methodologies

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Scala
- This position is based at our Mumbai office
- 15 years of full-time education is required
Posted 6 days ago
2.0 - 5.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: PySpark
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging to ensure the applications function as intended, while continuously seeking ways to improve application efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation and contribution in team discussions
- Contribute to solutions for work-related problems
- Assist in documenting application specifications and user guides
- Engage in code reviews to ensure adherence to best practices and standards

Professional & Technical Skills:
- Must-have skills: Proficiency in the Databricks Unified Data Analytics Platform
- Strong understanding of data integration and ETL processes
- Experience with cloud computing platforms and services
- Familiarity with programming languages such as Python or Scala
- Knowledge of data visualization techniques and tools

Additional Information:
- The candidate should have a minimum of 2 years of experience with the Databricks Unified Data Analytics Platform
- This position is based at our Chennai office
- 15 years of full-time education is required
- The candidate should be ready to work in rotational shifts
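The ETL work this posting centers on (extract records, transform them, apply data-quality checks, then load) can be sketched in plain Python. This is an illustrative, stdlib-only sketch with hypothetical field names; production work on Databricks would use PySpark DataFrames rather than Python lists:

```python
def transform(record):
    """Normalize one raw record: trim the id, cast the amount to float."""
    return {
        "id": record["id"].strip(),
        "amount": float(record["amount"]),
    }

def validate(record):
    """Basic data-quality gate: non-empty id and non-negative amount."""
    return bool(record["id"]) and record["amount"] >= 0

def run_pipeline(raw_records):
    """Extract -> transform -> validate -> load (here: collect in memory)."""
    clean, rejected = [], []
    for raw in raw_records:
        row = transform(raw)
        (clean if validate(row) else rejected).append(row)
    return clean, rejected

clean, rejected = run_pipeline([
    {"id": " a1 ", "amount": "10.5"},   # valid after trimming/casting
    {"id": "", "amount": "3.0"},        # rejected: empty id
])
```

The same transform/validate split carries over directly to Spark, where `transform` becomes a column expression and `validate` a filter predicate.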
Posted 6 days ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Scala, Python (Programming Language), Microsoft Azure Data Services
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also engage in testing and debugging to improve application performance and user experience, while continuously seeking opportunities for improvement and innovation in application design and functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation and contribution in team discussions
- Contribute to solutions for work-related problems
- Assist in documenting application processes and workflows
- Engage in code reviews to ensure quality and adherence to best practices

Professional & Technical Skills:
- Must-have skills: Proficiency in Scala, Microsoft Azure Data Services, and Python
- Strong understanding of application development methodologies
- Experience with cloud-based application deployment and management
- Familiarity with version control systems such as Git
- Knowledge of database management and data integration techniques

Additional Information:
- The candidate should have a minimum of 3 years of experience in Scala
- This position is based at our Bengaluru office
- 15 years of full-time education is required
Posted 6 days ago
4.0 - 9.0 years
10 - 14 Lacs
Pune
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Microsoft Azure Data Services
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that the applications developed meet both user needs and technical requirements. Your role will be pivotal in fostering a collaborative environment that encourages innovation and problem-solving among team members.

Roles & Responsibilities:
- Minimum of 4 years of experience in data engineering or similar roles
- Proven expertise with Databricks and data processing frameworks
- Technical skills: SQL, Spark, PySpark, Databricks, Python, Scala, Spark SQL
- Strong understanding of data warehousing, ETL processes, and data pipeline design
- Experience with SQL, Python, and Spark
- Excellent problem-solving and analytical skills
- Effective communication and teamwork abilities

Professional & Technical Skills:
- Experience and knowledge of Azure SQL Database, Azure Data Factory, and ADLS

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services
- This position is based in Pune
- 15 years of full-time education is required
Posted 6 days ago
5.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Scala, PySpark
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process. Your role will be pivotal in shaping the direction of application projects and ensuring that they meet the needs of the organization and its clients.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Mentor junior team members to enhance their skills and knowledge
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders

Professional & Technical Skills:
- Must-have skills: Proficiency in the Databricks Unified Data Analytics Platform
- Good-to-have skills: Experience with PySpark and Scala
- Strong understanding of data engineering principles and practices
- Experience with cloud-based data solutions and architectures
- Familiarity with data governance and compliance standards

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform
- This position is based at our Bengaluru office
- 15 years of full-time education is required
Posted 6 days ago
0 years
4 - 9 Lacs
Bengaluru
Remote
Your opportunity
Do you love the transformative impact data can have on a business? Are you motivated to push for results and overcome all obstacles? Then we have a role for you.

What you'll do
- Lead the building of scalable, fault-tolerant pipelines with built-in data quality checks that transform, load, and curate data from various internal and external systems
- Provide leadership to cross-functional initiatives and projects; influence architecture design and decisions
- Build cross-functional relationships with Data Scientists, Product Managers, and Software Engineers to understand data needs and deliver on them
- Improve engineering processes and cross-team collaboration
- Ruthlessly prioritize work to align with company priorities
- Provide thought leadership to grow and evolve the data engineering function, and implement SDLC best practices in building internal-facing data products by staying up to date with industry trends, emerging technologies, and best practices in data engineering

This role requires
- Experience in BI and data warehousing
- Strong experience with dbt, Airflow, and Snowflake
- Experience with Apache Iceberg tables
- Experience and knowledge of building data lakes in AWS (e.g., Spark/Glue, Athena), including data modeling, data quality best practices, and self-service tooling
- Experience mentoring data professionals from junior to senior levels
- Demonstrated success leading cross-functional initiatives
- Passion for data quality, code quality, SLAs, and continuous improvement
- Deep understanding of data system architecture and of ETL/ELT patterns
- Development experience in at least one object-oriented language (Python, R, Scala, etc.)
- Comfort with SQL and related tooling

Bonus points if you have
- Experience with observability

Please note that visa sponsorship is not available for this position. Fostering a diverse, welcoming, and inclusive environment is important to us.
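The first responsibility above, pipelines with built-in data quality checks, can be illustrated with a minimal sketch: a registry of named checks applied to each batch before it is loaded downstream. The check names and predicates here are hypothetical illustrations, not this employer's implementation (which the posting says uses dbt and Airflow):

```python
# Hypothetical data-quality gate: each check is a named predicate over a batch.
CHECKS = {
    "no_null_ids": lambda rows: all(r.get("id") is not None for r in rows),
    "unique_ids": lambda rows: len({r["id"] for r in rows}) == len(rows),
}

def quality_report(rows):
    """Run every registered check on the batch; report pass/fail per check."""
    return {name: check(rows) for name, check in CHECKS.items()}

# A batch with a duplicate id fails the uniqueness check.
report = quality_report([{"id": 1}, {"id": 2}, {"id": 2}])
```

In an orchestrated pipeline, a failing report would typically block the load step and page the on-call, rather than let bad data reach consumers.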
We work hard to make everyone feel comfortable bringing their best, most authentic selves to work every day. We celebrate our talented Relics' different backgrounds and abilities, and recognize the different paths they took to reach us, including nontraditional ones. Their experiences and perspectives inspire us to make our products and company the best they can be. We're looking for people who feel connected to our mission and values, not just candidates who check off all the boxes. If you require a reasonable accommodation to complete any part of the application or recruiting process, please reach out to resume@newrelic.com.

We believe in empowering all Relics to achieve professional and business success through a flexible workforce model. This model allows us to work in a variety of workplaces that best support our success, including fully office-based, fully remote, or hybrid.

Our hiring process
In compliance with applicable law, all persons hired will be required to verify identity and eligibility to work and to complete employment eligibility verification. Note: our stewardship of the data of thousands of customers means that a criminal background check is required to join New Relic. We will consider qualified applicants with arrest and conviction records based on individual circumstances and in accordance with applicable law, including, but not limited to, the San Francisco Fair Chance Ordinance. Headhunters and recruitment agencies may not submit resumes/CVs through this website or directly to managers. New Relic does not accept unsolicited headhunter and agency resumes, and will not pay fees to any third-party agency or company that does not have a signed agreement with New Relic.
Candidates are evaluated based on qualifications, regardless of race, religion, ethnicity, national origin, sex, sexual orientation, gender expression or identity, age, disability, neurodiversity, veteran or marital status, political viewpoint, or other legally protected characteristics. Review our Applicant Privacy Notice at https://newrelic.com/termsandconditions/applicant-privacy-policy
Posted 6 days ago
7.0 years
0 Lacs
Noida
On-site
Senior Software Engineer II - Metrics Platform
Location: Noida, Uttar Pradesh, India

Sumo Logic is a cloud-native SaaS data analytics platform solving complex observability and security problems. Customers choose our product because it allows them to easily monitor, optimize, and secure their applications, systems, and infrastructure. Our microservices architecture, hosted on AWS, ingests petabytes of data daily across many geographic regions, and millions of queries a day analyze hundreds of petabytes of data.

The Metrics Platform team is responsible for building scalable, cost-efficient, and highly available backend services for the Sumo Logic Infrastructure Monitoring solution. You will join a team of engineers responsible for the ingest, query, and storage components of a distributed time-series database engine.

What you will do:
- Develop features using algorithms that work optimally on very large data sets. These features run on backend systems that handle enormous amounts of customer data, operate with high tolerance for errors, and scale up as needed; these systems manage and process petabytes of data.
- Follow test-driven development: write robust, reusable code and demonstrate its robustness through automated tests.
- Own a small set of microservices; analyze and improve their efficiency, scalability, reliability, and cost.
- Help the team respond quickly and effectively to business needs.
- Act as a primary on-call for owned services, responding to service degradations in a timely manner.

What you will have:
- BSc or MSc in Computer Science or a related discipline
- 7-8 years of industry experience with a proven track record of ownership
- Object-oriented experience, for example in Java, Scala, Ruby, or C++
- Understanding of the performance characteristics of commonly used data structures (maps, lists, trees, etc.)
- Desire to learn Scala, a JVM language (scala-lang.org)
- Experience in multi-threaded programming
- Experience running large, scalable distributed services following a microservice architecture

Desirable skills:
- Experience in big data and/or 24x7 commercial services is highly desirable
- Comfort working with Unix (Linux, OS X)
- Agile software development experience (test-driven development, iterative and incremental development) is a plus

About Us
Sumo Logic, Inc. empowers the people who power modern, digital business. Sumo Logic enables customers to deliver reliable and secure cloud-native applications through its Sumo Logic SaaS Analytics Log Platform, which helps practitioners and developers ensure application reliability, secure and protect against modern security threats, and gain insights into their cloud infrastructures. Customers worldwide rely on Sumo Logic for powerful real-time analytics and insights across observability and security solutions for their cloud-native applications. For more information, visit www.sumologic.com. Sumo Logic Privacy Policy. Employees will be responsible for complying with applicable federal privacy laws and regulations, as well as organizational policies related to data protection.
Posted 6 days ago
7.0 years
0 Lacs
Noida
On-site
Our Company
Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences. We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and to transform how companies interact with customers across every screen. We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

Digital Experience (DX) (https://www.adobe.com/experience-cloud.html) is a USD 3B+ business serving the needs of enterprise businesses, including 95%+ of Fortune 500 organizations. Adobe Journey Optimizer (AJO) within DX provides a platform for designing cross-channel customer experiences and an environment for visual campaign orchestration, real-time interaction management, and cross-channel execution. It is built natively on the Adobe Experience Platform and combines a unified, real-time customer profile, an API-first open framework, centralized offer decisioning, and artificial intelligence (AI) and machine learning (ML) for personalization and optimization.

Beyond the usual responsibility of designing, developing, documenting, and thoroughly testing code, Computer Scientists at Adobe own features of varying complexity, which may require understanding interactions with other parts of the system, moderately sophisticated algorithms, and good design judgment. We are looking for strong, passionate engineers to join our team as we scale the business by building next-generation products and contributing to our existing offerings.

What you'll do
This is an individual contributor position. Expectations are along the following lines:
- Be responsible for the design and architecture of new products.
- Work in full DevOps mode and own all phases of engineering: early specs, design/architecture, technology choice, development, unit-testing/integration automation, and deployment.
- Collaborate with architects, product management, and other engineering teams to build the technical vision and roadmap for the team.
- Build technical specifications, prototypes, and presentations to communicate your ideas.
- Stay well versed in emerging industry technologies and trends, communicate that knowledge to the team, and use it to influence product direction.
- Work with the team to develop a product or parts of a large product.

Requirements
- B.Tech/M.Tech degree in Computer Science from a premier institute
- 7-9.5 years of relevant experience in software development
- Excellent computer science fundamentals and a good understanding of the design and performance of algorithms
- Proficiency in Java/Scala programming
- Proficiency in writing code that is reliable, maintainable, secure, and performant
- Knowledge of Azure services and/or AWS
Internal Opportunities We’re glad that you’re pursuing career development opportunities at Adobe. Here’s what you’ll need to do: Apply with your complete LinkedIn profile or resume/CV. Schedule a Check-in meeting with your manager to discuss this internal opportunity and your career aspirations. Check-ins should include ongoing discussions about expectations, feedback and career development. Learn more about Check-in here. Learn more about the internal career opportunities process in this FAQ. If you’re contacted for an interview, here are some tips. At Adobe, you will be immersed in an exceptional work environment that is recognized throughout the world on Best Companies lists. You will also be surrounded by colleagues who are committed to helping each other grow through our unique Check-In approach where ongoing feedback flows freely. If you’re looking to make an impact, Adobe's the place for you. Discover what our employees are saying about their career experiences on the Adobe Life blog and explore the meaningful benefits we offer. Adobe is an equal opportunity employer. We welcome and encourage diversity in the workplace regardless of gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, or veteran status. Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.
Posted 6 days ago
10.0 years
0 Lacs
India
On-site
The Impact
The Director, Data Engineering will lead the development and implementation of a comprehensive data strategy that aligns with the organization’s business goals and enables data-driven decision making.

You will:
- Build and manage a team of talented data managers and engineers with the ability not only to keep up with, but to pioneer in, this space
- Collaborate with and influence leadership to directly impact company strategy and direction
- Develop new techniques and data pipelines that enable insights for internal and external customers
- Develop deep partnerships with client implementation teams, engineering, and product teams to deliver on major cross-functional measurement and testing efforts
- Communicate effectively at all levels of the organization, including executives
- Partner successfully with teams of dramatically varying backgrounds, from the highly technical to the highly creative
- Design a data engineering roadmap and execute the vision behind it
- Hire, lead, and mentor a world-class data team
- Partner with other business areas to co-author and co-drive strategies on our shared roadmap
- Oversee the movement of large amounts of data into our data lake
- Establish a customer-centric approach and synthesize customer needs
- Own end-to-end pipelines and destinations for the transfer and storage of all data
- Manage third-party resources and critical data integration vendors
- Promote a culture that drives autonomy, responsibility, perfection, and mastery
- Maintain and optimize software and cloud expenses to meet the company's financial goals
- Provide technical leadership in the design and architecture of data products, and drive change across process, practices, and technology within the organization
- Work with engineering managers and functional leads to set direction and ambitious goals for the Engineering department
- Ensure data quality, security, and accessibility across the organization

About you:
- 10+ years of experience in data engineering
- 5+ years of experience leading data teams of 30+ people, including selecting talent and planning/allocating resources across multiple geographies and functions
- 5+ years of experience with GCP tools and technologies, specifically Google BigQuery, Cloud Composer, Dataflow, Dataform, etc.
- Experience creating large-scale data engineering pipelines and data-based decision-making and quantitative analysis tools and software
- Hands-on experience with version control systems (Git)
- Experience with CI/CD, data architectures, pipelines, quality, and code management
- Experience with complex, high-volume, multi-dimensional data based on unstructured, structured, and streaming datasets
- Experience with SQL and NoSQL databases
- Experience creating, testing, and supporting production software and systems
- Proven track record of identifying and resolving performance bottlenecks in production systems
- Experience designing and developing data lake, data warehouse, ETL, and task orchestration systems
- Strong leadership, communication, time management, and interpersonal skills
- Proven architectural skills in data engineering
- Experience leading teams developing production-grade data pipelines on large datasets
- Experience designing a large data lake and lakehouse, managing data flows that integrate information from various sources into a common pool, and implementing data pipelines based on the ETL model
- Experience with common data languages (e.g., Python, Scala) and data warehouses (e.g., Redshift, BigQuery, Snowflake, Databricks)
- Extensive experience with cloud tools and technologies, GCP preferred
- Experience managing real-time data pipelines
- Successful track record with demonstrated thought leadership, cross-functional influence, and partnership within an agile/waterfall development environment
- Experience in regulated industries or with compliance frameworks (e.g., SOC 2, ISO 27001)

Write to sanish@careerxperts.com to get connected!
Posted 6 days ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Role Overview:
As a Senior AI/ML Engineer, you will lead the design, development, and deployment of advanced machine learning models and AI solutions. Your expertise will drive innovation across applications ranging from natural language processing (NLP) to deep learning and recommendation systems. You will collaborate with cross-functional teams to integrate AI capabilities into products and services, ensuring scalability, efficiency, and ethical standards.

Key Responsibilities:
Model Development & Deployment:
- Lead the end-to-end lifecycle of machine learning models, including data collection, preprocessing, model selection, training, evaluation, and deployment
- Implement deep learning architectures (e.g., CNNs, RNNs, Transformers) and NLP models to solve complex problems
- Optimize models for performance, scalability, and real-time inference
Collaboration & Leadership:
- Mentor and guide junior engineers and data scientists, fostering a culture of continuous learning and innovation
- Collaborate with product managers, software engineers, and data engineers to align AI solutions with business objectives
- Communicate complex technical concepts to both technical and non-technical stakeholders
Research & Innovation:
- Stay abreast of the latest advancements in AI/ML research and industry trends
- Evaluate and integrate emerging technologies and methodologies to enhance model performance and capabilities
- Contribute to the development of AI strategies and roadmaps
Ethics & Compliance:
- Ensure AI solutions adhere to ethical guidelines, data privacy regulations, and organizational standards
- Implement fairness, accountability, and transparency in AI models to mitigate bias and ensure equitable outcomes

Technical Skills & Qualifications:
- Programming languages: Proficiency in Python, with experience in libraries such as NumPy, Pandas, and scikit-learn; familiarity with other languages like Java, Scala, or C++ is a plus
- Machine learning frameworks: Expertise in frameworks such as TensorFlow, PyTorch, Keras, and scikit-learn; experience with MLOps tools and practices for model versioning, deployment, and monitoring
- Data engineering: Strong understanding of data pipelines, ETL processes, and working with large-scale datasets; experience with SQL and NoSQL databases, as well as cloud platforms like AWS, GCP, or Azure
- Soft skills: Excellent problem-solving abilities and analytical thinking; strong communication and interpersonal skills; ability to work effectively in a collaborative, fast-paced environment
- Educational background: Bachelor's or master's degree in computer science, data science, machine learning, or a related field; a Ph.D. is advantageous but not required

Preferred Experience:
- 8+ years in AI/ML engineering or related roles, with a proven track record of deploying production-grade models
- Experience in specific domains such as healthcare, finance, or e-commerce is a plus

Why Join Us:
- Opportunity to work on cutting-edge AI projects with a talented team
- Collaborative and inclusive work culture
- Competitive compensation and benefits package
- Continuous learning and professional development opportunities
Posted 6 days ago
10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
✅ Job Title: Data Engineer – Apache Spark, Scala, GCP & Azure
📍 Location: Gurugram (Hybrid – 3 days/week in office)
🕒 Experience: 5–10 Years
🧑💻 Type: Full-time
📩 Apply: Share your resume with the details listed below to vijay.s@xebia.com
🕐 Availability: Immediate joiners or a maximum 2 weeks' notice period only

🚀 About the Role
Xebia is looking for a skilled Data Engineer to join our fast-paced team in Gurugram. You will build and optimize scalable data pipelines, process large datasets using Apache Spark and Scala, and deploy on cloud platforms such as GCP and Azure. If you're passionate about clean architecture, high-quality data flow, and performance tuning, this is the opportunity for you.

🔧 Key Responsibilities
- Design and develop robust ETL pipelines using Apache Spark
- Write clean and efficient data processing code in Scala
- Handle large-scale data movement, transformation, and storage
- Build solutions on Google Cloud Platform (GCP) and Microsoft Azure
- Collaborate with teams to define data strategies and ensure data quality
- Optimize jobs for performance and cost on distributed systems
- Document technical designs and ETL flows clearly for the team

✅ Must-Have Skills
- Apache Spark
- Scala
- ETL design & development
- Cloud platforms: GCP & Azure
- Strong understanding of data engineering best practices
- Solid communication and collaboration skills

🌟 Good-to-Have Skills
- Apache tools (Kafka, Beam, Airflow, etc.)
- Knowledge of data lake and data warehouse concepts
- CI/CD for data pipelines
- Exposure to modern data monitoring and observability tools

💼 Why Xebia?
At Xebia, you'll be part of a forward-thinking, tech-savvy team working on high-impact, global data projects. We prioritize clean code, scalable solutions, and continuous learning. Join us to build real-time, cloud-native data platforms that power business intelligence across industries.
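The pipeline style this role describes (filter, transform, and aggregate over large record sets) mirrors Spark's functional API. As a stdlib-only illustration in plain Python with made-up event data (a real Spark job in Scala would express the same shape with RDD/Dataset operations distributed across a cluster):

```python
from functools import reduce

# Toy input: per-event byte counts, the kind of record a Spark ETL job reads.
events = [
    {"user": "u1", "bytes": 120},
    {"user": "u2", "bytes": 0},    # dropped by the filter stage below
    {"user": "u1", "bytes": 80},
]

# Filter stage: keep only events that carried data.
valid = filter(lambda e: e["bytes"] > 0, events)

# Aggregate stage: sum bytes per user (Spark's reduceByKey, in miniature).
def add(totals, e):
    totals[e["user"]] = totals.get(e["user"], 0) + e["bytes"]
    return totals

bytes_per_user = reduce(add, valid, {})
```

In Spark the aggregation would be a shuffle across partitions, which is exactly where the "optimize jobs for performance and cost" responsibility comes in.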
📤 To Apply Please share your updated resume and include the following details in your email to vijay.s@xebia.com: Full Name: Total Experience: Current CTC: Expected CTC: Current Location: Preferred Xebia Location: Gurugram Notice Period / Last Working Day (if serving): Primary Skills: LinkedIn Profile URL: Note: Only candidates who can join immediately or within 2 weeks will be considered. Build intelligent, scalable data solutions with Xebia – let’s shape the future of data together. 📊🚀
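The pipelines this role centres on follow the classic extract-transform-load pattern. As a rough illustration only (the posting's actual stack is Apache Spark with Scala on GCP/Azure), here is a stdlib-only Python sketch of the three stages; all data, field names, and function names are hypothetical:

```python
import csv
import io

# Minimal sketch of the extract-transform-load pattern. A production pipeline
# would use Spark DataFrames and cloud storage; this stdlib-only version just
# illustrates the three stages and a simple data-quality rule.

RAW_CSV = """order_id,amount,currency
1,100.50,inr
2,,inr
3,250.00,usd
"""

def extract(text):
    """Extract: parse raw CSV text into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows missing an amount, cast types, normalise currency."""
    out = []
    for row in rows:
        if not row["amount"]:
            continue  # data-quality rule: skip incomplete records
        out.append({
            "order_id": int(row["order_id"]),
            "amount": float(row["amount"]),
            "currency": row["currency"].upper(),
        })
    return out

def load(rows, sink):
    """Load: append clean rows to a sink (a list standing in for a warehouse table)."""
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract(RAW_CSV)), warehouse)
```

The same three-stage shape maps onto Spark as read (source connector), a chain of DataFrame transformations, and write (sink connector), with the transform stage being where most of the tuning work described above happens.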
Posted 6 days ago
1.0 years
0 Lacs
Chandigarh, Chandigarh
On-site
Role: Big Data Engineer (Fresher) Experience: 0–1 Years Location: Chandigarh Responsibilities As an entry-level Big Data Engineer, you will work closely with experienced team members to help design, build, and maintain high-performance data solutions. You will assist in developing scalable pipelines, Spark-based processing jobs, and contribute to RESTful services that support data-driven products. This is a hands-on learning opportunity where you will be mentored and exposed to real-world Big Data technologies, DevOps practices, and collaborative agile teams. Your key responsibilities will include: Assisting in the design and development of data pipelines and streaming applications. Learning to work with distributed systems and Big Data frameworks. Supporting senior engineers in writing and testing code for data processing. Participating in code reviews, team discussions, and product planning sessions. Collaborating with cross-functional teams including product managers and QA. Qualifications and Skills Bachelor's degree in Computer Science, Engineering, or related field. Good understanding of core programming concepts (Java, Python, or Scala preferred). Familiarity with SQL and NoSQL databases. Basic knowledge of Big Data tools such as Spark, Hadoop, Kafka (academic/project exposure acceptable). Exposure to Linux/Unix environments. Awareness of Agile methodologies (Scrum, Kanban) and DevOps tools like Git. Curiosity to learn cloud platforms like AWS or GCP (certifications a plus). Willingness to learn about system security (Kerberos, TLS, etc.). Nice to Have (Not Mandatory): Internships, academic projects, or certifications related to Big Data. Contributions to open-source or personal GitHub projects. Familiarity with containerization (Docker, Kubernetes) or CI/CD tools. 
Job Types: Full-time, Permanent Pay: Up to ₹331,000.00 per year Benefits: Health insurance Provident Fund Schedule: Day shift Rotational shift Supplemental Pay: Performance bonus Work Location: In person
Posted 6 days ago
12.0 - 17.0 years
12 - 17 Lacs
Pune
Work from Office
Role Overview: The Technical Architect specializing in Data Governance and Master Data Management (MDM) designs, implements, and optimizes enterprise data solutions. The role requires expertise in tools such as Collibra, Informatica, InfoSphere, Reltio, and other MDM platforms, ensuring data quality, compliance, and governance across the organization. Responsibilities: Design and implement data governance frameworks and MDM solutions using tools like Collibra, Informatica, InfoSphere, and Reltio. Architect and optimize strategies for data quality, metadata management, and data stewardship. Collaborate with cross-functional teams to integrate MDM solutions with existing systems. Establish best practices for data governance, security, and compliance. Monitor and troubleshoot MDM environments for performance and reliability. Provide technical leadership and guidance to data teams. Stay updated on advancements in data governance and MDM technologies.
Key Technical Skills & Responsibilities Overall 12+ years of experience, with 10+ years working on DG/MDM projects Strong on Data Governance concepts Hands-on experience with different DG tools/services Hands-on experience with Reference Data and Taxonomy Strong understanding of Data Governance, Data Quality, Data Profiling, Data Standards, Regulations, Security Match and merge strategy Design and implement the MDM architecture and data models Usage of Spark capabilities Statistics to deduce meaning from vast enterprise-level data Different data visualization means of analyzing huge data sets Good to have knowledge of Python/R/Scala languages Experience with DG on-premise and on-cloud Understanding of MDM, Customer, Product, Vendor domains and related artifacts Experience working on proposals, customer workshops, assessments, etc. is preferred Must have good communication and presentation skills Technology Stack - Collibra, IBM MDM, Reltio, InfoSphere Eligibility Criteria: Bachelor's degree in Computer Science, Data Management, or a related field. Proven experience as a Technical Architect in Data Governance and MDM. Certifications in relevant MDM tools (e.g., Collibra Data Governance; Informatica, InfoSphere, or Reltio MDM). Experience with cloud platforms like AWS, Azure, or GCP. Proficiency in tools like Collibra, Informatica, InfoSphere, Reltio, and similar platforms. Strong understanding of data modeling, ETL/ELT processes, and cloud integration. Excellent problem-solving and communication skills.
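The "match and merge strategy" skill above refers to consolidating duplicate source records into a single golden record. Below is a minimal sketch of the idea, assuming a deterministic match key and a latest-non-empty-value-wins survivorship rule; both are hypothetical simplifications, as platforms such as Reltio, Informatica, or IBM MDM use configurable fuzzy-match and survivorship rules:

```python
# Illustrative "match and merge" step: records that share a normalised match
# key are grouped, then merged into a golden record where the most recently
# updated non-empty value wins per field. Field names are hypothetical.

def match_key(record):
    """Deterministic match rule: normalised name + email domain."""
    name = record["name"].strip().lower()
    domain = record["email"].split("@")[-1].lower()
    return (name, domain)

def merge(records):
    """Survivorship: latest non-empty value wins per field."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value:
                golden[field] = value
    return golden

def match_and_merge(records):
    groups = {}
    for rec in records:
        groups.setdefault(match_key(rec), []).append(rec)
    return [merge(group) for group in groups.values()]

customers = [
    {"name": "Acme Corp", "email": "ops@acme.com", "phone": "", "updated": 1},
    {"name": "acme corp", "email": "billing@acme.com", "phone": "555-0101", "updated": 2},
    {"name": "Globex", "email": "info@globex.com", "phone": "555-0199", "updated": 1},
]
golden_records = match_and_merge(customers)
```

The two Acme records collapse into one golden record (the later email and phone survive), while Globex passes through unchanged.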
Posted 6 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At Flipkart, SDE-2s are engineers who create features based on product requirements. You’re expected to design and code multiple tech components related to your functional area. You’re required to learn best practices, design principles, and patterns to make the code-base maintainable and extensible. You must also develop a deep understanding of non-functional requirements, such as reliability, availability, scale, horizontal scalability, etc. over time and make tech stack decisions accordingly. We are looking for engineers who are well rounded - quality conscious, product thinkers, business cognizant and smart - not mere coders. Engineers get to significantly amplify their impact with the scale that Flipkart operates at. Responsibilities: Design components by translating product requirements, break down projects into tasks, and provide accurate estimates. Independently come up with different solutions and extensible low-level designs. Write modular, extensible, readable and performant code. Choose the right data structures, tools and tech stacks, and be able to do high-level design with guidance. Build, develop, mentor and coach junior team members. Collaborate with teams by contributing to the shared vision and working closely with cross-functional stakeholders. What you’ll need: B.Tech or M.Tech or equivalent with at least 3 years of experience. Ability to build abstractions and contracts with separation of concerns for a larger scope. Extensive programming experience in any one programming language like Java, Ruby, Clojure, Scala, C or C++, plus SQL, etc. Strong object-oriented programming skills. Experience with multi-threading and concurrency programming. Ability to work with complex business flows and deal with huge amounts of data. Prior work experience in an agile environment or with continuous integration and continuous delivery (CI/CD). Experience building robust and scalable web applications is good to have.
Posted 6 days ago
12.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Qualification BTech degree in computer science, engineering or related field of study, or 12+ years of related work experience 7+ years design & implementation experience with large-scale data-centric distributed applications Professional experience architecting and operating cloud-based solutions with a good understanding of core disciplines like compute, networking, storage, security, databases etc. Good understanding of data engineering concepts like storage, governance, cataloging, data quality, data modeling etc. Good understanding of various architecture patterns like data lake, data lakehouse, data mesh etc. Good understanding of Data Warehousing concepts, hands-on experience working with tools like Hive, Redshift, Snowflake, Teradata etc. Experience migrating or transforming legacy customer solutions to the cloud. Experience working with services like AWS EMR, Glue, DMS, Kinesis, RDS, Redshift, DynamoDB, DocumentDB, SNS, SQS, Lambda, EKS, DataZone etc. Thorough understanding of Big Data ecosystem technologies like Hadoop, Spark, Hive, HBase etc. and other comparable tools and technologies Experience designing analytical solutions leveraging AWS cognitive services like Textract, Comprehend, Rekognition etc. in combination with SageMaker is good to have. Experience working with modern development workflows, such as git, continuous integration/continuous deployment pipelines, static code analysis tooling, infrastructure-as-code, and more. Experience with a programming or scripting language – Python/Java/Scala AWS Professional/Specialty certification or relevant cloud expertise Role Drive innovation within the Data Engineering domain by designing reusable and reliable accelerators, blueprints, and libraries. Capable of leading a technology team, inculcating an innovative mindset and enabling fast-paced deliveries. Able to adapt to new technologies, learn quickly, and manage high ambiguity.
Ability to work with business stakeholders, attend/drive various architectural, design and status calls with multiple stakeholders. Exhibit good presentation skills with a high degree of comfort speaking with executives, IT Management, and developers. Drive technology/software sales or pre-sales consulting discussions Ensure end-to-end ownership of all tasks being aligned. Ensure high quality software development with complete documentation and traceability. Fulfil organizational responsibilities (sharing knowledge & experience with other teams / groups) Conduct technical training(s)/session(s), write whitepapers/ case studies / blogs etc. Experience 10 to 18 years Job Reference Number 12895
Posted 6 days ago
8.0 - 10.0 years
30 - 32 Lacs
Hyderabad
Work from Office
Candidate Specifications: Candidates should have 9+ years of experience in Python and PySpark. Candidates should have strong experience in AWS and PL/SQL. Candidates should be strong in data management, including data governance, data streaming, data lakes, and data warehouses. Candidates should also have exposure to team handling and stakeholder management. Candidates should have excellent written and verbal communication skills. Contact Person: Sheena Rakesh
Posted 6 days ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Avion manufactures Full Flight Simulators for the Airbus A320 Family and Boeing 737 NG & MAX. We operate Flight Training Centres at London Luton Airport and Mumbai, India. Gen24 Flybiz offers comprehensive services for aspiring pilots, airlines, and training organisations. In 2025, the Avion Flight Training Centre Mumbai - operated by Gen24 Flybiz - will be opened. At our facility, pilots can train on state-of-the-art Full Flight Simulators (FFS) and Flight Navigation Procedures Trainer (FNPTII) devices. Currently, our centre operates two Airbus A320neo Full Flight Simulators from Avion and an A320 FNPTII for APS MCC training, built by Simnest. Over the coming years, we will expand to six to eight Full Flight Simulators, including additional Airbus A320s and Boeing 737 MAX devices, providing comprehensive training solutions for airlines and individual pilots. Tasks Gen24 is looking for a Core Software Engineer to help develop core software for Full Flight Simulators. The core software facilitates the distributed real-time simulation of all models required for the simulation. It allows the user to interact with the simulation via the Instructor Operating System and generates the simulated graphics for the cockpit displays. It also consists of several Graphical User Interfaces (GUIs) used by developers and simulator maintenance personnel. Requirements Responsibilities Design and develop supporting tools for the core framework: Real-time monitoring Graphical User Interfaces Graphics Generator Editor Diagnostic Tools Mobile and Web Applications Maintain and upgrade key components of the core framework: Real-time scheduling Shared memory Multi-node syncing Graphics Generator Mobile and Web Applications Required Skills and Experience High analytical skills. Ability to translate high-level functional requirements and technical specifications into working products. Demonstrated experience with software development in C++, Scala, Java or a related language. 
Experience with software development for Windows, Linux and/or mobile platforms. Experience with GUI development, preferably in JavaFX or QT. Good verbal and written communication skills in English. Strong work ethic: comfortable in a fast-paced, entrepreneurial company environment. Ability to learn and adapt quickly to maximise productivity. Desirable Skills and Experience Affinity with Real-time simulation, distributed computing and multithreading. Understanding of data structures in memory and network protocols such as UDP and TCP. Understanding of Object-Oriented Programming and Design Patterns. Knowledge of the Scala (or Java) programming language. Knowledge of OpenGL. Familiarity with reverse engineering of code and troubleshooting. Experience in full-stack web development (MEAN, MERN, and/or others) is considered a big plus. Experience with Python and JavaScript. Experience with Scala and Svelte. Experience with markup languages (HTML, XML, LaTeX) and web application design. Experience with developing mobile applications, front- and backend. Location This job position is based at the Avion Flight Training Centre (operated by Gen24) in Mumbai, India. Benefits Become a part of Gen24 Working at Gen24 means having a challenging job in a successful and entrepreneurial environment where initiative and a high degree of freedom in acting are basic principles. Working together within and between teams is essential for our success. Likewise, we cooperate closely with our partners and customers to achieve the best results. You will have significant influence and responsibility for the outcome of technically challenging projects. Gen24 will create the conditions that enable you to truly grow as a (technical) specialist. We will do so by providing support, training, and opportunities to further develop your talents in a stimulating and inspiring environment. Gen24 is an equal-opportunity employer. 
We celebrate our inclusive work environment and encourage people of all backgrounds and perspectives to apply. At Gen24, we are committed to having an inclusive and transparent environment where every voice is heard and acknowledged. We embrace our differences and know that our diverse team is a strength that drives our success. Do you think you meet the criteria, and are you up for a new challenge? We look forward to hearing from you! You can apply using the Join.com webpage. Please include your motivation letter and resume.
Posted 6 days ago
10.0 years
0 Lacs
India
Remote
Job Description EMPLOYMENT TYPE: Full-Time, Permanent LOCATION: Remote (Pan India) SHIFT TIMINGS: 2.00 pm-11:00 pm IST Budget- As per company standards REPORTING: This position will report to our CEO or any other Lead as assigned by Management. The Senior Data Engineer will be responsible for building and extending our data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys working with big data and building systems from the ground up. You will collaborate with our software engineers, database architects, data analysts, and data scientists to ensure our data delivery architecture is consistent throughout the platform. You must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives. What You’ll Be Doing: ● Design and build parts of our data pipeline architecture for extraction, transformation, and loading of data from a wide variety of data sources using the latest Big Data technologies. ● Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. ● Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs. ● Work with machine learning, data, and analytics experts to drive innovation, accuracy and greater functionality in our data system. Qualifications: ● Bachelor's degree in Engineering, Computer Science, or relevant field. ● 10+ years of relevant and recent experience in a Data Engineer role. ● 5+ years recent experience with Apache Spark and solid understanding of the fundamentals. 
● Deep understanding of Big Data concepts and distributed systems. ● Strong coding skills with Scala, Python, Java and/or other languages and the ability to quickly switch between them with ease. ● Advanced working SQL knowledge and experience working with a variety of relational databases such as Postgres and/or MySQL. ● Cloud experience with Databricks. ● Experience working with data stored in many formats including Delta Tables, Parquet, CSV and JSON. ● Comfortable working in a Linux shell environment and writing scripts as needed. ● Comfortable working in an Agile environment. ● Machine Learning knowledge is a plus. ● Must be capable of working independently and delivering stable, efficient and reliable software. ● Excellent written and verbal communication skills in English. ● Experience supporting and working with cross-functional teams in a dynamic environment.
Posted 6 days ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Summary We are looking for a Data Engineer with strong experience in cloud platforms (AWS & Azure) , Scala programming , and a solid understanding of data architecture and governance frameworks . You will play a key role in building, optimizing, and maintaining scalable data pipelines and systems while ensuring data quality, security, and compliance across the organization. Key Responsibilities Data Engineering & Development Design and develop reliable, scalable ETL/ELT data pipelines using Scala , SQL , and orchestration tools. Integrate and process structured, semi-structured, and unstructured data from various sources (APIs, databases, flat files, etc.). Develop solutions on AWS (e.g., S3, Glue, Redshift, EMR) and Azure (e.g., Data Factory, Synapse, Blob Storage). Cloud & Infrastructure Build cloud-native data solutions that align with enterprise architecture standards. Leverage IaC tools (Terraform, CloudFormation, ARM templates) to deploy and manage infrastructure. Monitor performance, cost, and security posture of data environments in both AWS and Azure. Data Architecture & Governance Collaborate with data architects to define and implement logical and physical data models. Apply data governance principles including data cataloging , lineage tracking , data privacy , and compliance (e.g., GDPR) . Support the enforcement of data policies and data quality standards across data domains. Collaboration & Communication Work cross-functionally with data analysts, scientists, architects, and business stakeholders to support data needs. Participate in Agile ceremonies and contribute to sprint planning and reviews. Maintain clear documentation of pipelines, data models, and data flows. Required Qualifications Bachelor's degree in Computer Science, Engineering, or a related field. 3–6 years of experience in data engineering or data platform development. Hands-on experience with AWS and Azure data services. 
Proficient in Scala for data processing (e.g., Spark, Kafka Streams). Strong SQL skills and familiarity with distributed systems. Experience with orchestration tools (e.g., Apache Airflow, Azure Data Factory).
Posted 6 days ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Roles & Responsibilities: Professional Skills: Business Analysis, Analytical Thinking, Problem Solving, Decision Making, Leadership, Managerial, Time Management, Domain Knowledge Work simplification - methods that maximize output while minimizing expenditure and cost. Analytics with Data - interprets data and turns it into information which can offer ways to improve a business Communication - good verbal communication and interpersonal skills are essential for collaborating with customers Technical Skills: Python/NumPy, Seaborn, Pandas, Selenium, Beautiful Soup (basic), Spotfire, ML libraries, RPA, R, IronPython, HTML/CSS, JavaScript, SQL, HQL, Git/GitLab, Spark, Scala, Web services, Spotfire/Tableau, JIRA Tool Skills: Project management tools, Documentation tools, Modeling [wireframe] tools Database Skills: MS SQL, Postgres, MS Access, MongoDB Rigorous - the ability to analyse qualitative data quickly and rigorously Adaptability - being able to adapt to changing environments and work processes Experience: Minimum 8 years of experience in Data Science, preferably in the Automobile Engineering domain What we look for in a candidate with matching skills below. Key Skills: DAO - AI/ML, Automation, Python, BigQuery, Project Management, Agile, SDLC. Location: Chennai - Mahindra World City Campus. Work Mode: Hybrid
Posted 6 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
What you should expect to do in this role? ● You will be closely working with a small hybrid team of artists, engineers, product managers and user experience designers, in a very fast-paced environment. ● You will be developing software which not only works but also entertains your customers. Thus customer experience matters most. You are someone who takes deep pride in developing such experiences. ● You will be developing new game systems as well as enhancing existing game systems. You will often find yourself aggressively optimizing code to deliver better user experience. Analyzing and improving efficiency & stability of various systems will be critical. ● You are someone who takes full ownership and full accountability of things you build and deliver (from conception, speccing and design through testing and deployment and post-release enhancements). “Must Have” Past Experience: Candidates must have these three experiences: ● You are a full stack engineer with 3+ years of experience in building user-facing web and mobile applications. ● You have strong command of any one or more of the following languages and frameworks: Java/Scala, JS/TypeScript+NodeJS, Golang or C++. ● You have experience working with databases, key-value stores and queuing systems. “Preferred” Past Experience: Candidates with the following experience will be preferred over others: ● Exposure to game development and game engines like Unity/Cocos. ● Exposure to developing for scale would be a plus. ● A stint in startups or similar fast-paced environments would be preferred. ● Experience working with cloud computing platforms like AWS, GCP or Azure would be a huge plus.
Posted 6 days ago
5.5 years
0 Lacs
Mumbai, Maharashtra, India
On-site
About the Company KPMG in India is a leading professional services firm established in August 1993. The firm offers a wide range of services, including audit, tax, and advisory, to national and international clients across various sectors. KPMG operates from offices in 14 cities, including Mumbai, Bengaluru, Chennai, and Delhi. KPMG India is known for its rapid, performance-based, industry-focused, and technology-enabled services. The firm leverages its global network to provide informed and timely business advice, helping clients mitigate risks and seize opportunities. KPMG India is committed to quality and excellence, fostering a culture of growth, innovation, and collaboration. About the job: Spark/Scala Developer Experience: 5.5 to 9 years Location: Mumbai We are seeking a skilled Spark/Scala Developer with 5.5 - 9 years of experience in Big Data engineering. The ideal candidate will have strong expertise in Scala programming, SQL, and data processing using Apache Spark within Hadoop ecosystems. Key Responsibilities: Design, develop, and implement data ingestion and processing solutions for batch and streaming workloads using Scala and Apache Spark. Optimize and debug Spark jobs for performance and reliability. Translate functional requirements and user stories into scalable technical solutions. Develop and troubleshoot complex SQL queries to extract business-critical insights. Required Skills: 2+ years of hands-on experience in Scala programming and SQL. Proven experience with Hadoop Data Lake and Big Data tools. Strong understanding of Spark job optimization and performance tuning. Ability to work collaboratively in an Agile environment. Equal Opportunity Statement KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. 
KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.
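One concrete example of the "Spark job optimization and performance tuning" this role calls for is salting a skewed aggregation key: spreading a hot key across several sub-keys, aggregating the partials in parallel, then recombining. The following is a stdlib-only Python sketch of the idea (in actual Spark this would be an added salt column before a groupBy; the names here are hypothetical):

```python
import random

# Skew mitigation by key salting: a hot key is split across SALT_BUCKETS
# sub-keys so no single partition receives all its records, then the partial
# aggregates are recombined. Stdlib-only illustration of the technique.

SALT_BUCKETS = 4

def salted_key(key):
    """Spread one hot key across SALT_BUCKETS sub-keys."""
    return (key, random.randrange(SALT_BUCKETS))

def partial_counts(events):
    """Stage 1: count per (key, salt) - the parallel, skew-free stage."""
    partials = {}
    for key in events:
        sk = salted_key(key)
        partials[sk] = partials.get(sk, 0) + 1
    return partials

def final_counts(partials):
    """Stage 2: strip the salt and sum the partial counts per original key."""
    totals = {}
    for (key, _salt), count in partials.items():
        totals[key] = totals.get(key, 0) + count
    return totals

events = ["hot"] * 1000 + ["cold"] * 10  # heavily skewed key distribution
totals = final_counts(partial_counts(events))
```

The salts are random, so the intermediate partials differ between runs, but the recombined totals are always the same; in Spark, newer versions also handle some of this automatically via adaptive skew-join optimization.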
Posted 6 days ago
2.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Line of Service Advisory Industry/Sector FS X-Sector Specialism Risk Management Level Associate Job Description & Summary In-depth knowledge of application development processes and at least one programming and one scripting language (e.g., Java, Scala, C#, JavaScript, Angular, ReactJs, Ruby, Perl, Python, Shell). •Knowledge of OS security (Windows, Unix/Linux systems, Mac OS, VMware), network security and cloud security. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Job Description & Summary: We are seeking a professional to join our Cybersecurity and Privacy services team, where you will have the opportunity to help clients implement effective cybersecurity programs that protect against threats.
Responsibilities: L1 - Minimum 2 years of relevant experience in SOC/Incident Management/Incident Response /Threat Detection Engineering/ Vulnerability Management/ SOC platform management/ Automation/Asset Integration/ Threat Intel Management /Threat Hunting. L2 - Minimum 4 years of relevant experience in SOC/Incident Management/Incident Response /Threat Detection Engineering/Vulnerability Management/ SOC platform management/ Automation/ Asset Integration/ Threat Intel Management/Threat Hunting. · Round the clock threat monitoring & detection · Analysis of any suspicious, malicious, and abnormal behavior. · Alert triage, Initial assessment, incident validation, its severity & urgency · Prioritization of security alerts and creating Incidents as per SOPs. · Reporting & escalation to stakeholders · Post-incident Analysis · Consistent incident triage & recommendations using playbooks. · Develop & maintain incident management and incident response policies and procedures. · Preservation of security alerts and security incidents artefacts for forensic purpose. · Adherence to Service Level Agreements (SLA) and KPIs. · Reduction in Mean Time to Detection and Response (MTTD & MTTR). Mandatory skill sets: Certified SOC Analyst (EC-Council), Computer Hacking Forensic Investigator (EC-Council), Certified Ethical Hacker (EC-Council), CompTIA Security+, CompTIA CySA+ (Cybersecurity Analyst), GIAC Certified Incident Handler (GCIH) or equivalent. Product Certifications (Preferred): - Product Certifications on SOC Security Tools such as SIEM/Vulnerability Management/ DAM/UBA/ SOAR/NBA etc. 
Preferred skill sets: SOC - Splunk Years of experience required: 2-5 Years Education qualification: B.Tech/MCA/MBA with IT background, or Bachelor’s degree in Information Technology, Cybersecurity, or Computer Science Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills SOC Operations Optional Skills SoCs Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
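Two of the SOC responsibilities listed above, alert triage prioritization and the MTTD/MTTR metrics, can be sketched in a few lines. The scoring weights and record fields below are hypothetical illustrations, not any particular SIEM's schema:

```python
from datetime import datetime, timedelta

# Sketch of alert triage scoring and the MTTD/MTTR metrics named in the
# posting. A real SOC would take severity, asset criticality, and incident
# timestamps from its SIEM/SOAR platform; these weights are assumptions.

SEVERITY_WEIGHT = {"low": 1, "medium": 2, "high": 3, "critical": 4}

def triage_priority(alert):
    """Priority = severity weight x asset criticality (1-5)."""
    return SEVERITY_WEIGHT[alert["severity"]] * alert["asset_criticality"]

def mttd(incidents):
    """Mean Time To Detect: average of (detected - occurred)."""
    deltas = [i["detected"] - i["occurred"] for i in incidents]
    return sum(deltas, timedelta()) / len(deltas)

def mttr(incidents):
    """Mean Time To Respond: average of (resolved - detected)."""
    deltas = [i["resolved"] - i["detected"] for i in incidents]
    return sum(deltas, timedelta()) / len(deltas)

t0 = datetime(2024, 1, 1, 9, 0)
incidents = [
    {"occurred": t0, "detected": t0 + timedelta(minutes=10),
     "resolved": t0 + timedelta(minutes=40)},
    {"occurred": t0, "detected": t0 + timedelta(minutes=20),
     "resolved": t0 + timedelta(minutes=80)},
]
alerts = [
    {"severity": "critical", "asset_criticality": 5},
    {"severity": "low", "asset_criticality": 2},
]
queue = sorted(alerts, key=triage_priority, reverse=True)  # highest priority first
```

Reducing the two averages over time is exactly the MTTD/MTTR reduction goal the responsibilities list calls out.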
Posted 6 days ago
4.0 - 9.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About Mindera At Mindera , we craft software with people we love. We're a collaborative, global team of engineers who value open communication, great code, and building impactful products. We're looking for a talented C#/.NET Developer to join our growing team in Gurugram and help us build scalable, high-quality software systems. Requirements What You'll Do Build, maintain, and scale robust C#/.NET applications in a fast-paced Agile environment. Work closely with product owners and designers to bring features to life. Write clean, maintainable code following SOLID and OOP principles. Work with SQL/NoSQL databases, optimizing queries and schema designs. Collaborate in a Scrum or Kanban environment with engineers around the world. Use Git for version control and participate in code reviews. Contribute to our CI/CD pipelines and automated testing workflows. Must-Have Skills What We're Looking For 4-9 years of hands-on experience with C# and .NET technologies. Solid understanding of Object-Oriented Programming (OOP) and clean code principles. Proven experience working with databases (SQL or NoSQL). Experience in an Agile team (Scrum/Kanban). Familiarity with Git and collaborative development practices. Exposure to CI/CD pipelines and test automation. Nice-to-Have Skills Experience with Rust (even hobbyist experience is valued). Background working with Python or Scala for Spark-based applications. Hands-on with Docker and container-based architecture. Familiarity with Kubernetes for orchestration. Experience working with Apache Airflow for data workflows. Cloud experience with Google Cloud Platform (GCP) or Microsoft Azure. Benefits We Offer Flexible working hours (self-managed) Competitive salary Annual bonus, subject to company performance Access to Udemy online training and opportunities to learn and grow within the role About Mindera At Mindera we use technology to build products we are proud of, with people we love. 
Software Engineering Applications, including Web and Mobile, are at the core of what we do at Mindera. We partner with our clients to understand their product and deliver high-performance, resilient and scalable software systems that create an impact for their users and businesses across the world. You get to work with a bunch of great people, where the whole team owns the project together. Our culture reflects our lean and self-organisation attitude. We encourage our colleagues to take risks, make decisions, work in a collaborative way and talk to everyone to enhance communication. We are proud of our work and we love to learn all and everything while navigating through an Agile, Lean and collaborative environment. Follow our LinkedIn page - https://tinyurl.com/minderaindia Check out our Blog: http://mindera.com/ and our Handbook: http://bit.ly/MinderaHandbook Our offices are located: Aveiro, Portugal | Porto, Portugal | Leicester, UK | San Diego, USA | San Francisco, USA | Chennai, India | Bengaluru, India
Posted 6 days ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Pro-Manage
Pro-Manage is a pioneering tech- and AI-based Marketing-as-a-Service platform developed to help multi-location enterprise clients increase, manage and measure digital engagement opportunities with prospects and customers, leveraging several valuable business and technology partnerships with global leaders such as Google GMB/Chat, Microsoft Bing, Meta WhatsApp/Instagram, Knowlarity Cloud telephony and others.

Pro-Manage is a powerful and sophisticated marketing SaaS platform that has attracted highly regarded brands as its clients: Indian Oil, Shriram Finance, Apollo Pharmacy, Naturals, Sterling Resorts, Muthoot Finance, Aadhaar Housing Finance, Shriram Transport Finance, City Union Bank, O2 Spa, Page 3 Salons, Urban Nomads, CADD Centre and more.

Pro-Manage is developed using advanced technologies, design thinking, continuous discovery, agile development and product management methodologies. The cross-functional Pro-Manage team uses integrated customer-centric, product-led marketing and growth paradigms designed to attract and retain targeted customers, maximizing lifetime value and minimizing customer acquisition efforts and costs.

Pro-Manage is developed and offered by Sulekha, one of India's largest digital business and consumer brands, which has transformed the local services ecosystem through an AI-based need-fulfillment and monetization platform that generates millions of qualified, parameterized service requests to local service SMBs every day in 40 cities. Pro-Manage aspires to be the dominant, industry-leading and technologically most sophisticated marketing SaaS platform in India, with 500+ enterprise customers in the next two years. Pro-Manage and Sulekha have three of the most well-regarded firms as their investors: Norwest Venture Partners (Palo Alto, US), Mitsui (Tokyo), and GIC (sovereign wealth fund of Singapore).

Roles & Responsibilities:
- Design, architect, and develop software solutions based on suitable design patterns.
- Adopt new and emerging technologies to provide solutions.
- Collaborate with cross-functional teams to understand business requirements and translate them into machine learning problems.
- Work alongside data scientists and software engineers to integrate machine learning models into production systems.
- Identify technical issues and provide resolutions.
- Design, develop, and implement machine learning models for various business applications, including predictive analytics, recommendation systems, natural language processing, and computer vision.
- Stay up to date with the latest advancements in AI and machine learning techniques.
- Ensure that machine learning pipelines and systems are scalable, efficient, and robust.
- Excellent communication and interpersonal skills.

Qualifications
- BE / Master's in computer science or any degree.

Critical Skills
- Proven experience as a Machine Learning Engineer, AI Engineer, or similar role (typically 2+ years).
- Experience writing software in Python, Scala, R, or similar.
- Familiarity with cloud platforms such as AWS, Google Cloud or Microsoft Azure for deploying models at scale.
- Solid understanding of machine learning algorithms (supervised and unsupervised learning, deep learning, reinforcement learning, etc.).
- Experience with data structures, algorithms, and software design.
- Experience using LLM models in real-time projects.
- Proven ability to work as part of cross-functional delivery teams in an Agile delivery environment.
- Ability to work in a dynamic, collaborative, non-hierarchical environment where talent is valued over job title or years of experience.
- Strong problem-solving skills with the ability to think critically and creatively.
- Ability to break down complex business problems into machine learning solutions.

Desired Skills
- Experience with machine learning libraries and frameworks.
- Expertise in data manipulation and analysis.
- Experience with big data technologies.
- Exposure to STS (Speech-to-Text & Text-to-Speech) and ASR (Automated Speech Recognition) engines.
- Strong OOP skills, including solid design-patterns knowledge.
- Familiarity with relational and NoSQL databases.
- Out-of-the-box thinking and in-depth problem-solving.

Personal Skills
- Result- and outcome-orientation
- High degree of enthusiasm, dynamism, and drive to succeed
- Capacity for hard work and perseverance
- Maturity to collaborate with peers to achieve overall success
- Capability to nurture and motivate direct reports to high success
Posted 6 days ago