0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Job Description: Databricks Platform Administrator
Experience Level: 8-12 years
Job Title: Databricks Platform Administrator

Roles and Responsibilities
A Databricks Platform Administrator is responsible for the effective design, implementation, maintenance, and optimization of the Databricks Lakehouse Platform within an organization. This individual ensures the platform is scalable, performant, secure, and aligned with business objectives, providing essential support to data engineers, data scientists, and analysts.

Job Summary:
The Databricks Platform Administrator is a key member of our data and analytics team, responsible for the overall administration, configuration, and optimization of the Databricks Lakehouse Platform. This role ensures the platform's stability, security, and performance, enabling data engineering, data science, and machine learning initiatives. The administrator will work closely with cross-functional teams to understand requirements, provide technical solutions, and maintain best practices for the Databricks environment.

Key Responsibilities:
1. Provision and configure Databricks workspaces, clusters, pools, and jobs across environments.
2. Create catalogs, schemas, access controls, and lineage configurations.
3. Implement identity and access management using account groups, workspace-level permissions, and data-level governance.
4. Monitor platform health, cluster utilization, job performance, and cost using Databricks admin tools and observability dashboards.
5. Automate workspace onboarding, schema creation, user/group assignments, and external location setup using Terraform, APIs, or the CLI.
6. Integrate with Azure services such as ADLS Gen2, Azure Key Vault, Azure Data Factory, and Azure Synapse.
7. Support model serving, feature store, and MLflow lifecycle management for Data Science/ML teams.
8. Manage secrets, tokens, and credentials securely using Databricks Secrets and integration with Azure Key Vault.
9. Define and enforce tagging policies, data masking, and row-level access control using Unity Catalog and attribute-based access control (ABAC).
10. Ensure compliance with enterprise policies, security standards, and audit requirements.
11. Coordinate with the Ops Architect and Cloud DevOps teams for network, authentication (e.g., SSO), and VNet setup.
12. Troubleshoot workspace, job, cluster, or permission issues for end users and data teams.

Preferred Qualifications:
· Databricks Certified Associate Platform Administrator or other relevant Databricks certifications.
· Experience with Apache Spark and data engineering concepts.
· Knowledge of monitoring tools (e.g., Splunk, Grafana, cloud-native monitoring).
· Familiarity with data warehousing and data lake concepts.
· Experience with other big data technologies (e.g., Hadoop, Kafka).
· Previous experience leading or mentoring junior administrators.
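Responsibility 9 (enforcing tagging policies) is often implemented as a pre-flight check inside the workspace automation mentioned in responsibility 5. A minimal sketch, assuming a hypothetical required-tag set and plain cluster-spec dicts rather than any real Databricks API:

```python
# Hypothetical pre-flight check for a cluster-tagging policy.
# The required tag keys and the cluster-spec shape are illustrative,
# not taken from any real Databricks workspace.

REQUIRED_TAGS = {"cost_center", "environment", "owner"}

def missing_tags(cluster_spec: dict) -> set:
    """Return the required tag keys absent from the spec's custom_tags."""
    tags = cluster_spec.get("custom_tags", {})
    return REQUIRED_TAGS - set(tags)

def validate_cluster(cluster_spec: dict) -> None:
    """Raise if the spec violates the tagging policy."""
    missing = missing_tags(cluster_spec)
    if missing:
        raise ValueError(f"cluster missing required tags: {sorted(missing)}")

spec = {
    "cluster_name": "etl-dev",
    "custom_tags": {"cost_center": "1234", "environment": "dev"},
}
print(sorted(missing_tags(spec)))  # ['owner']
```

Running such a check before a Terraform apply or API call keeps untagged compute from ever being provisioned, which is what makes the cost monitoring in responsibility 4 workable.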
Posted 4 weeks ago
14.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Location: Bangalore - Indraprastha, India

Thales people architect identity management and data protection solutions at the heart of digital security. Businesses and governments rely on us to bring trust to the billions of digital interactions they have with people. Our technologies and services help banks exchange funds, people cross borders, energy become smarter and much more. More than 30,000 organizations already rely on us to verify the identities of people and things, grant access to digital services, analyze vast quantities of information and encrypt data to make the connected world more secure.

Present in India since 1953, Thales is headquartered in Noida, Uttar Pradesh, and has operational offices and sites spread across Bengaluru, Delhi, Gurugram, Hyderabad, Mumbai and Pune, among others. Over 1,800 employees are working with Thales and its joint ventures in India. Since the beginning, Thales has played an essential role in India's growth story by sharing its technologies and expertise in the Defence, Transport, Aerospace and Digital Identity and Security markets.

We are seeking a skilled and experienced DevOps Engineer Tech Lead to join our team, responsible for the management and optimization of our cloud environments across AWS and GCP. This role will primarily focus on building and maintaining cloud infrastructure for Cloud WAF development teams. The ideal candidate will possess deep knowledge of various cloud platforms and will be responsible for ensuring smooth operation and scalability of development and production environments.
Responsibilities:
· Work with development teams to ensure that applications have scalability and reliability built in from day one. Agile is second nature to you, and you're excited to work in scrum teams and represent the DevOps perspective.
· Participate in software design discussions to improve scalability, service reliability, cost, and performance. You've helped create services that are critical to their customers' success.
· Deploy automation for provisioning and operating infrastructure at large scale. You are experienced in Infrastructure as Code concepts and have put them into production.
· Partner with teams to improve CI/CD processes and technology. Helping teams deliver value early is what you strive for.
· Mentor members of the team on large-scale cloud deployments. You're an expert in deploying in the cloud and can bring a teaching mindset to help others benefit from your experience.
· Drive the adoption of observability practices and a data-driven mindset. You love metrics, graphs, and gaining a deep understanding of why things happen in a system, helping others gain visibility into the things they build.
· Set up processes like on-call rotations and runbooks to continue supporting the microservices owned by the development teams, while finding ways to reduce time to resolution and improve the reliability of services.

Required Qualifications:
· Bachelor's/Master's degree in Computer Science
· 14+ years of industry experience in engineering
· 7+ years of working with microservices architectures on Kubernetes
· Hands-on experience with container-native tools like Helm, Istio, and Vault running in Kubernetes
· Experience with the AWS public cloud at medium to large scale
· Proficient in CI/CD platforms like GitLab CI, Jenkins, etc.
· Drive enhancement of observability by implementing distributed tracing, logging standards, dashboard standardization, profiling, and other relevant practices to meet our Service Level Objectives (SLOs)
· Hands-on experience with monitoring tools such as Prometheus and Grafana
· Expertise in designing, analyzing, and troubleshooting large-scale distributed systems
· Experience with Kafka, Postgres, and MySQL tuning and performance is a plus

Desired Qualifications:
· Fluent scripting skills, preferably Python or Bash
· Experience with the GCP public cloud at medium to large scale
· Knowledge of operating systems (processes, threads, concurrency, etc.)
· Experience working with Unix/Linux systems from kernel to shell and beyond

At Thales we provide careers, not only jobs. With Thales employing 80,000 employees in 68 countries, our mobility policy enables thousands of employees each year to develop their careers at home and abroad, in their existing areas of expertise or by branching out into new fields. Together we believe that embracing flexibility is a smarter way of working. Great journeys start here. Apply now!
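The observability work above is ultimately about meeting Service Level Objectives, and an SLO target translates directly into an error budget. A small sketch of that arithmetic (the 99.9% target and 30-day window are example values, not from the posting):

```python
# Illustrative error-budget arithmetic for an availability SLO.
# Example target and window only; real SLOs come from the service owners.

def error_budget_minutes(slo: float, window_days: int = 30) -> float:
    """Minutes of allowed downtime in the window for a given SLO (e.g. 0.999)."""
    total_minutes = window_days * 24 * 60
    return total_minutes * (1 - slo)

def budget_remaining(slo: float, downtime_minutes: float, window_days: int = 30) -> float:
    """Fraction of the error budget still unspent (negative means blown)."""
    budget = error_budget_minutes(slo, window_days)
    return (budget - downtime_minutes) / budget

print(round(error_budget_minutes(0.999), 1))  # 43.2 minutes per 30 days
```

Tracking the remaining fraction on a dashboard is one common way to decide when a team should trade feature work for reliability work.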
Posted 4 weeks ago
8.0 years
0 Lacs
India
Remote
Azure DevOps Engineer
Remote | 6 Months Contract

The DevOps Engineer works as a front liner for the system. The role defines strategy and leads the implementation of DevSecOps pipelines, orchestrates build and release pipelines, and ensures seamless application promotion for all the digital squads. It provides necessary support to the Agile coach, scrum master and development squads, and automates complete rollouts, non-production and production alike, for all applications, including APIs and database promotions. It also ensures container platform configuration, setup and availability in the Azure cloud environment.

• Uses his/her technical expertise and experience to contribute to all sprint events (planning, refinements, retrospectives, demos)
• Consults with the team about what is needed to fulfil the functional and non-functional requirements of the IT product to be built and released
• Defines and orchestrates DevOps for the IT product, enables automated unit tests in line with the customer's wishes and the IT area's internal ambitions, and reviews colleagues' IT products
• Defines, designs and enables automated builds and automated tests for IT products (functional, performance, resilience and security tests)
• Performs life cycle management (including decommissioning) for IT products under management
• Defines and improves the Continuous Delivery process
• Sets up the IT environment, deploys the IT product on the IT infrastructure and implements the required changes
• Sets up monitoring of IT product usage by the customer

Operating Environment, Framework and Boundaries, Working Relationships
• Works within a multidisciplinary team or in an environment in which multidisciplinary teamwork is carried out.
• Is primarily responsible for the automated non-production and production rollouts (or technical configuration) of software applications.
• The range of tasks includes the following:
o Analysis and design of the DevOps solution for any application (or the technical configuration)
o Coding and reviewing pipelines and/or package integration in programming languages, scripting languages and frameworks: Azure/AWS/Cloud Pak DevOps services; pipeline creation using templates and enhancement of existing templates based on needs; integration with various DevOps tools like SonarQube, Veracode, Twistlock, Ansible, Terraform and HashiCorp Vault; Azure Test Plans setup and configuration with pipelines; cloud-based deployments for Spring Boot Java, React.js, Node.js and .NET Core using native Kubernetes and AKS/EKS/OpenShift; setting up Kubernetes clusters with ingress controllers (NGINX and NGINX Plus); Python and shell scripting; logging and monitoring using Splunk, EFK and ELK; middleware on-premises automated deployments for WebSphere, JBoss, BPM, IIS and IIB; OS expertise in RHEL, CentOS and Ubuntu; Liquibase/Flyway for DB automation
o Basic application development knowledge for cloud-native and traditional apps
o API Gateway and API deployments; database systems, with knowledge of SQL and NoSQL stores (e.g. MySQL, Oracle, MongoDB, Couchbase)
o Continuous Delivery (compile, build, package, deploy)
o Test-Driven Development (TDD) and test automation (e.g. regression, functional and integration tests); debugging and profiling
o Software configuration management and version control
o Work in an agile/scrum environment, meeting sprint commitments and contributing to the agile process
o Maintain traceability of testing activities

Requirements:
• 8 to 10 years of overall experience and 6 to 7 years' experience as a DevOps Engineer defining solutions and implementing them on common on-premises and cloud platforms, with scripting languages and frameworks expertise
• Expert in Azure DevOps services
• Hands-on in pipeline creation using templates and enhancement of existing templates based on needs
• Able to perform integration by coding templates with various DevOps tools like SonarQube, Veracode, Twistlock, Ansible, Terraform, HashiCorp Vault and UCD
• Implement Azure Test Plans setup and configuration with pipelines
• Automate cloud-based deployments for Spring Boot Java, React.js, Node.js and .NET Core using native Kubernetes and AKS/EKS/OpenShift
• Experience in setting up Kubernetes clusters with ingress controllers (NGINX and NGINX Plus)
• Expert in Python/shell scripting
• Expert in DB automation tools (Flyway/Liquibase)
• Experience in Azure Files and sync solution implementation
• Experience in logging and monitoring using Splunk, EFK and ELK
• Experience in middleware on-premises automated deployments for WebSphere, JBoss, BPM, IIS and IIB
• Expert in OS: RHEL, CentOS, Ubuntu
• Knowledge of IBM Cloud Pak on Red Hat OpenShift
• Basic application development knowledge for cloud-native and traditional apps
• Experience in API Gateway and API deployments
• Nice to have: knowledge of immutable infrastructure, infrastructure automation and provisioning tools
• Strong understanding of Agile methodologies
• Strong communication skills with the ability to communicate complex technical concepts and align the organization on decisions
• Sound problem-solving skills with the ability to quickly process complex information and present it clearly and simply
• Utilizes team collaboration to create innovative solutions efficiently
• Passionate about technology and excited about the impact of emerging/disruptive technologies
• Believes in a culture of brutal transparency and trust
• Open to learning new ideas outside own scope or knowledge
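The role repeatedly stresses "pipeline creation using templates and enhancing existing templates". One way to think about template enhancement is as a recursive merge of a shared base pipeline definition with team-specific overrides. A sketch, with a structure that only loosely mirrors Azure Pipelines YAML and keys that are illustrative:

```python
# Sketch: "enhance an existing template" as a recursive merge of a base
# pipeline definition with team-specific overrides. Keys are illustrative,
# not a real Azure Pipelines schema.

def merge(base: dict, override: dict) -> dict:
    """Recursively merge override into base, returning a new dict."""
    out = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(out.get(key), dict):
            out[key] = merge(out[key], value)
        else:
            out[key] = value
    return out

base_pipeline = {
    "trigger": ["main"],
    "pool": {"vmImage": "ubuntu-latest"},
    "steps": [{"script": "mvn verify"}],
}
team_override = {
    "pool": {"vmImage": "windows-latest"},
    "variables": {"buildConfiguration": "Release"},
}
pipeline = merge(base_pipeline, team_override)
print(pipeline["pool"]["vmImage"])  # windows-latest
```

In real Azure DevOps, `extends` templates and parameters serve this purpose declaratively; the merge above is only a mental model of what the template engine does.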
Posted 4 weeks ago
8.0 years
0 Lacs
India
Remote
Job Title: Senior Configurator Lead (Veeva PromoMats)
Experience: 8+ Years
Work Mode: Remote
Industry: IT / Life Sciences / Pharmaceutical
Notice Period: Immediate Joiner / 15 Days
Location: Hyderabad

Overview:
The Senior Configurator Lead (Veeva PromoMats) is responsible for configuring and managing the Veeva Vault PromoMats platform to support the creation, review, and approval of promotional materials. This role involves working closely with business stakeholders to understand requirements, configure the system, and ensure seamless integration with other systems.

Key Responsibilities:
· Configuration Management: Configure Veeva Vault PromoMats to meet business requirements, including setting up workflows, document types, and metadata fields.
· Requirement Gathering: Collaborate with business stakeholders to gather and document requirements, translating them into technical specifications.
· System Integration: Ensure seamless integration of Veeva PromoMats with other enterprise systems, including CRM and DAM systems.
· Testing and Validation: Conduct thorough testing of configurations and workflows to ensure they meet business needs and comply with regulatory requirements.
· User Training and Support: Provide training and support to end users, ensuring they understand how to use the system effectively.
· Documentation: Maintain detailed documentation of configurations, workflows, and system changes.
· Continuous Improvement: Identify opportunities for system enhancements and process improvements to optimize the use of Veeva PromoMats.
· Strategic Planning: Define and execute configuration strategies aligned with client goals and compliance standards; create roadmaps and milestones to ensure timely and high-quality deliverables.
· Team Mentorship: Guide junior configurators on Veeva Vault best practices and PromoMats configurations; conduct regular training and knowledge-sharing sessions to enhance team skills.
· Cross-Functional Collaboration: Act as a liaison between business, QA, and technical teams for smooth project execution; facilitate Agile ceremonies like daily stand-ups and sprint planning.
· Quality & Compliance: Ensure configurations meet FDA 21 CFR Part 11 and other regulatory requirements; introduce peer reviews and audits to maintain configuration quality.
· Conflict Resolution & Team Morale: Address team conflicts constructively and promote open communication; recognize team achievements to boost morale and motivation.
· Agile Delivery Management: Lead Agile processes and track progress using tools like JIRA and Veeva Vault QMS; balance workload distribution to optimize team performance.
· Stakeholder Communication: Provide regular updates to leadership and clients on progress, risks, and mitigation plans; translate complex requirements into actionable tasks for the team.

Requirements:
· Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
· Experience: Proven experience in configuring and managing Veeva Vault PromoMats or similar content management systems.
· Skills: Strong understanding of content management and regulatory compliance processes, excellent problem-solving skills, and the ability to work collaboratively with cross-functional teams.
· Technical Proficiency: Familiarity with system integration, workflow configuration, and metadata management.
· Communication: Excellent verbal and written communication skills, with the ability to explain technical concepts to non-technical stakeholders.

Preferred Qualifications:
· Experience with other Veeva Vault applications.
· Knowledge of pharmaceutical or life sciences industry regulations.
· Certification in Veeva Vault PromoMats.
Posted 4 weeks ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About us
Bain & Company is a global consultancy that helps the world's most ambitious change makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry.

In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network), with its nodes across various geographies. BCN is an integral part, and the largest unit, of Expert Client Delivery (ECD). ECD plays a critical role as it adds value to Bain's case teams globally by supporting them with analytics and research solutioning across all industries, specific domains for corporate cases, client development, private equity diligence or Bain intellectual property. The BCN comprises Consulting Services, Knowledge Services and Shared Services.

Who you will work with
The Consumer Products Center of Expertise collaborates with Bain's global Consumer Products Practice leadership, client-facing Bain leadership and teams, and end clients on the development and delivery of Bain's proprietary CP products and solutions. These solutions aim to answer strategic questions of Bain's CP clients relating to brand strategy (consumer needs, assortment, pricing, distribution), revenue growth management (pricing strategy, promotions, profit pools, trade terms), negotiation strategy with key retailers, optimization of COGS, etc. You will work as part of the team in the CP CoE, comprising a mix of Directors, Managers, Project Leads, Associates and Analysts working to implement cloud-based end-to-end advanced analytics solutions. Delivery models on projects vary from working as part of the CP Center of Expertise, a broader global Bain case team within the CP ringfence, or other industry CoEs such as FS / Retail / TMT / Energy / CME with the BCN on a need basis.

The AS is expected to have a knack for seeking out challenging problems and coming up with their own ideas, which they will be encouraged to brainstorm with their peers and managers. They should be willing to learn new techniques and be open to solving problems with an interdisciplinary approach. They must have excellent coding skills and should demonstrate a willingness to write modular, reusable, and functional code.

What you'll do
· Collaborate with data scientists working with Python, LLMs, NLP, and Generative AI to design, fine-tune, and deploy intelligent agents and chains-based applications.
· Develop and maintain front-end interfaces for AI and data science applications using React.js / Angular / Next.js and/or Streamlit / Dash, enhancing user interaction with complex machine learning and NLP-driven systems.
· Build and integrate Python-based machine learning models with backend systems via RESTful APIs using frameworks like FastAPI, Flask or Django.
· Translate complex business problems into scalable technical solutions, integrating AI capabilities with robust backend and frontend systems.
· Assist in the design and implementation of scalable data pipelines and ETL workflows using DBT, PySpark, and SQL, supporting both analytics and generative AI solutions.
· Leverage containerization tools like Docker and utilize Git for version control, ensuring code modularity, maintainability, and collaborative development.
· Deploy ML-powered and data-driven applications on cloud platforms such as AWS or Azure, optimizing for performance, scalability, and cost-efficiency.
· Contribute to internal AI/ML Ops platforms and tools, streamlining model deployment, monitoring, and lifecycle management.
· Create dashboards, visualizations, and presentations using tools like Tableau / Power BI, Plotly, and Seaborn to drive business insights.
· Demonstrate proficiency with Excel and PowerPoint, and strong business communication in stakeholder interactions.

About you
· A Master's degree or higher in Computer Science, Data Science, Engineering, or related fields; Bachelor's candidates with relevant industry experience will also be considered.
· Proven experience (2 years for Master's; 3+ years for Bachelor's) in AI/ML, software development, and data engineering.
· Solid understanding of LLMs, NLP, Generative AI, chains, agents, and model fine-tuning methodologies.
· Proficiency in Python, with experience using libraries such as Pandas, NumPy, Plotly, and Seaborn for data manipulation and visualization.
· Experience working with modern Python frameworks such as FastAPI for backend API development.
· Frontend development skills using HTML, CSS, JavaScript/TypeScript, and modern frameworks like React.js; Streamlit knowledge is a plus.
· Strong grasp of data engineering concepts, including ETL pipelines, batch processing using DBT and PySpark, and working with relational databases like PostgreSQL, Snowflake, etc.
· Good working knowledge of cloud infrastructure (AWS and/or Azure) and deployment best practices.
· Familiarity with MLOps/AI Ops tools and workflows, including CI/CD pipelines, monitoring, and container orchestration (with Docker and Kubernetes).
· Good to have: experience with BI tools such as Tableau or Power BI.
· Good to have: prior exposure to consulting projects or the CP (Consumer Products) business domain.

What makes us a great place to work
We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor's Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years.
We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.
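The role above asks for "modular, reusable, and functional code" in data pipelines. A small sketch of what that can mean in practice: composing independent transform steps into one pipeline (the cleaning steps themselves are invented examples, not from the posting):

```python
from functools import reduce

# Illustrative composition of small, reusable transform steps into one
# pipeline, in the spirit of "modular, reusable, and functional code".
# The specific cleaning steps are invented for the example.

def drop_nulls(rows):
    """Discard rows containing any None value."""
    return [r for r in rows if all(v is not None for v in r.values())]

def normalize_names(rows):
    """Trim and title-case the 'name' field of every row."""
    return [{**r, "name": r["name"].strip().title()} for r in rows]

def compose(*steps):
    """Chain transform functions left to right into one callable."""
    return lambda rows: reduce(lambda acc, step: step(acc), steps, rows)

pipeline = compose(drop_nulls, normalize_names)
rows = [{"name": "  ada lovelace "}, {"name": None}]
print(pipeline(rows))  # [{'name': 'Ada Lovelace'}]
```

Each step stays independently testable, and the same `compose` helper can assemble different pipelines for different datasets, which is the reuse the posting is after.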
Posted 4 weeks ago
0 years
0 Lacs
Bangalore Urban, Karnataka, India
On-site
TCS Hiring!
Role: EDW, Data Vault Modelling, SQL, Informatica or other ETL tools
Experience: 10+ years
Desired Experience Range: Minimum 10+ years' experience in data architecture and Data Vault modelling
Please read the job description before applying.

NOTE: If the skills/profile match and you are interested, please reply to this email attaching your latest updated CV along with the following details:
Name:
Contact Number:
Email ID:
Highest Qualification: (e.g. B.Tech/B.E./M.Tech/MCA/M.Sc./MS/BCA/B.Sc./etc.)
Current Organization Name:
Total IT Experience (10+):
Current CTC:
Expected CTC:
Notice Period:
Whether worked with TCS: Y/N
Location: Bangalore / Hyderabad

Job Description:
· Deep understanding and experience of data and Data Vault modelling concepts and the Data Vault Speed Tool.
· Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
· Basic understanding of various systems including ERP (SAP) and CRM (Salesforce, Eloqua, etc.) from a data structure perspective.
· Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
· Should be well versed in SQL, from writing complex queries to analytical functions.
· Hands-on modelling, design, configuration, installation, and performance tuning.
· Lead in designing solutions to meet the stated requirements, and work closely with the architecture team and business team to present in solution review boards.
· Good to have: experience with AWS Cloud and related technologies, including APIs and REST.
· Demonstrate strong communication, analytical and organizational skills.
· Define and govern data modelling and design standards, tools, best practices, and related development for enterprise data models.
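A core Data Vault modelling convention worth knowing for a role like this: hubs are keyed by a hash of the normalized business key, so that loads from any source are deterministic and idempotent. A minimal sketch (MD5 and the ';' delimiter are common conventions but assumptions here, not requirements from the posting):

```python
import hashlib

# Sketch of a Data Vault hub hash key: normalize the business key parts,
# join with a delimiter, and hash. MD5 and ';' are common conventions in
# Data Vault implementations, assumed here for illustration.

def hub_hash_key(*business_key_parts: str) -> str:
    """Deterministic surrogate key from one or more business key parts."""
    normalized = [p.strip().upper() for p in business_key_parts]
    return hashlib.md5(";".join(normalized).encode("utf-8")).hexdigest()

# The same customer number always yields the same key, regardless of
# casing or stray whitespace, which keeps hub loads idempotent.
print(hub_hash_key(" c-1001 ") == hub_hash_key("C-1001"))  # True
```

The delimiter matters: hashing the concatenation without one would make ("AB", "C") and ("A", "BC") collide.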
Posted 4 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description Summary
You are a passionate, technical, client-obsessed Solution Specialist who understands emerging energy industry trends and their implications for utilities and their clients. You are a trusted advisor to the client, providing technical expertise in the planning, requirements management, design, build (including custom software development) and implementation of GE Vernova GridOS DERMS solutions to meet client needs.

Job Description
What will you as our new DevOps Security Specialist work on?
· Engage with our utility clients to understand the impacts of distributed energy resources (DERs) on the distribution system, providing our utility clients with solutions to manage and plan for the emergence of more DERs.
· Work collaboratively with a strong team of Technical Leads and Solution Specialists to provide input on solution architecture based on client needs and solution capabilities.
· Help define success criteria and contribute to solution diagrams for the project.
· Engage with our team to deploy GE GridOS DERMS solutions to client environments to support project use cases and DER planning scenarios.
· Document and design client-specific solution deployment requirements, capturing acceptance criteria and the features necessary to meet client business needs.
· Collaborate with other members of the Solutions team to expand our solution consulting and delivery practice, build standards of excellence, and continuously deliver innovative solution offerings for clients.
· Collaborate with the Project Manager to identify project delivery risks and recommend potential mitigation strategies.

Required Skills
· Education: Bachelor's degree in computer science or engineering; master's degree preferred.
· Experience: 5 to 8 years of enterprise application development, deployment, and integration experience. Java, Microsoft .NET, or other enterprise software development experience is an asset.
Strong hands-on experience with:
· AWS, Azure, and/or GCP cloud platforms and technologies
· RHEL 8+ Linux OS and shell scripting
· Puppet, Ansible, Terraform, Argo CD
· Docker and kernel-based virtual machines (KVM)
· PostgreSQL, MariaDB, MongoDB
· Redis, Memcached
· NGINX, HAProxy, Apache
· Python, JSON, YAML, Bash, Git
· Infrastructure as Code scripting with Terraform, Ansible or cloud-native tooling (ARM, CloudFormation)
· Monitoring tools for statistical analysis and proactive system tuning, such as Datadog
· GitHub, CI/CD pipeline design and scheduling, with code quality stages spanning the lifecycle of continuous build and quality, continuous deploy, and continuous testing
· Knowledge of cyber security best practices, including authentication protocols, ACL rules, and identity management mechanisms
· Knowledge of DevSecOps (static and dynamic scanning, artifact repositories, code scanning)
· Build and deploy experience with Docker containers using Helm charts and YAML configurations
· Deployment and management of securely hardened technology frameworks, including Kafka and Kubernetes
· Pipeline integration with DevOps tools such as Git, Docker, K8s containers, SonarQube, Nexus, Vault, test automation, Fortify, Twistlock, Gitea, and Jenkins
· Hands-on experience in solution deployment with PDI and Argo CD technologies
· Agile/Lean principles such as Scrum, Kanban, MVP
· Windows and Linux Server, SQL Server, Active Directory, Azure Active Directory, and infrastructure automation tools
· Self-starter with the ability to work in a cross-functional, team-based environment
· Excellent analytical skills; must be able to look at situations from different vantage points to make data-driven decisions and solve problems
· Experience working with Azure, Windows and Linux Server, SQL Server, Active Directory, Azure Active Directory, and systems and infrastructure automation tools
· Knowledge of Node.js and microservices
· Knowledge of relational and non-relational databases
· Ability to triage and debug issues that arise during testing and production
· Ability to automate testing across CI/CD and cloud infrastructure management
· Knowledge of secure code development
· Strong understanding of the utilities industry vertical and what Distributed Energy Resource Management Systems (DERMS) are all about
· Hands-on Python enterprise application development

Nice to Have
· Knowledge: You are highly familiar with emerging energy industry trends and their implications for utility clients in DER management, coupled with how to manage requirements effectively, how to message change effectively, and overall how to manage client expectations.
· Excellence: You get things done within project deadlines, with a strong focus on quality. Positive attitude and a strong commitment to delivering quality work.
· Teamwork: You are a natural collaborator and demonstrate a "we before me" attitude. Self-starter with the ability to work in a cross-functional, team-based environment.
· Problem Solving: You can quickly understand and analyze various approaches and processes, and can configure solutions to client needs given existing product functionality. You can drill down to the details, obtaining the right level of specificity for your team. You can creatively solve complex problems. You understand how to triage and debug issues that arise during testing and production, and possess excellent analytical and problem resolution skills.
· Communication: Strong written and verbal communication style. Can effectively share complex technical topics with various levels of audience.
· Growth Mindset: You are deeply curious and love to ask questions. You're a lifelong learner, with the ability and desire to learn different skills outside your domain of expertise.
· Client Focus: You enjoy being in front of clients, listening to their needs. You are deeply focused on ensuring their success. You can create powerful user stories detailing the needs of your clients.
· Innovation: A genuine interest in new tools and technology. You learn new software quickly without extensive documentation or hand-holding. High enthusiasm with a sense of urgency to get things done.

Additional Information
Relocation Assistance Provided: Yes
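The "code quality stages" and DevSecOps scanning mentioned in this role typically reduce to threshold checks on scan output between the build and deploy stages of a pipeline. A toy sketch (the metric names and thresholds are invented; real gates would read SonarQube or Fortify results):

```python
# Toy quality-gate evaluation of the kind a CI pipeline runs between its
# build and deploy stages. Metric names and thresholds are invented for
# illustration, not taken from any real scanner's output format.

GATES = {
    "coverage_pct": lambda v: v >= 80.0,   # minimum test coverage
    "critical_vulns": lambda v: v == 0,    # no critical security findings
    "code_smells": lambda v: v <= 50,      # static-analysis budget
}

def failed_gates(metrics: dict) -> list:
    """Return the names of gates the metrics fail (missing metrics fail too)."""
    return [name for name, ok in GATES.items()
            if name not in metrics or not ok(metrics[name])]

report = {"coverage_pct": 84.2, "critical_vulns": 1, "code_smells": 12}
print(failed_gates(report))  # ['critical_vulns']
```

A non-empty result would fail the pipeline stage, blocking promotion to the deploy stage.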
Posted 4 weeks ago
0 years
0 Lacs
India
Remote
Lead Data Engineer (Remote for India only)
Requirements:
Strong hands-on expertise in SQL, DBT, and Python for data processing and transformation.
Expertise in Azure data services (e.g., Azure Data Factory, Synapse, Event Hub) and orchestration tools.
Strong experience with Snowflake, including schema design, performance tuning, and the security model.
Good understanding of DBT for the transformation layer and modular pipeline design.
Hands-on with Git and version control practices: branching, pull requests, code reviews.
Understanding of DevOps/DataOps principles: CI/CD for data pipelines, testing, monitoring.
Knowledge of data modeling techniques: star schema, Data Vault, normalization/denormalization.
Experience with real-time data processing architectures is a strong plus.
Proven leadership experience; should be able to mentor team members, take ownership, and make design decisions independently.
Strong sense of ownership, accountability, and a solution-oriented mindset.
Ability to handle ambiguity and work independently with minimal supervision.
Clear and confident communication (written and verbal); must be able to represent design and architecture decisions.
Responsibilities:
Lead the design and development of data pipelines (batch and real-time) using modern cloud-native technologies (Azure, Snowflake, DBT, Python).
Translate business and data requirements into scalable data integration designs.
Guide and review development work across data engineering team members (onshore and offshore).
Define and enforce best practices for coding, testing, version control, CI/CD, data quality, and pipeline monitoring.
Collaborate with data analysts, architects, and business stakeholders to ensure data solutions are aligned with business goals.
Own and drive end-to-end data engineering workstreams, from design to production deployment and support.
Provide architectural and technical guidance on platform setup, performance tuning, cost optimization, and data security.
Drive data engineering standards and reusable patterns across projects to ensure scalability, maintainability, and reusability of code and data assets. Define and oversee data quality frameworks to proactively detect, report, and resolve data issues across ingestion, transformation, and consumption layers. Act as the technical go-to team member for complex design, performance, or integration issues across multiple teams and tools (e.g., DBT + Snowflake + Azure pipelines). Contribute to hands-on development as well, for the end-to-end integration pipelines and workflows. Document using Excel, Word, or tools like Confluence.
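A data quality framework of the kind described above typically starts with simple rule checks. The sketch below, in plain Python with hypothetical column names and thresholds, flags missing required values and duplicate keys in a batch of records; a real pipeline would run checks like this (for example, as DBT tests) at the ingestion and transformation layers.

```python
# Illustrative data-quality check; column names are hypothetical.
def check_quality(rows, key, required):
    """Return counts of issues found in `rows` (a list of dicts)."""
    issues = {"missing": 0, "duplicate_keys": 0}
    seen = set()
    for row in rows:
        # A row fails if any required column is null or empty.
        if any(row.get(col) in (None, "") for col in required):
            issues["missing"] += 1
        # A row fails if its business key was already seen.
        k = row.get(key)
        if k in seen:
            issues["duplicate_keys"] += 1
        seen.add(k)
    return issues

rows = [
    {"id": 1, "amount": 100},
    {"id": 1, "amount": 200},   # duplicate key
    {"id": 2, "amount": None},  # missing required value
]
print(check_quality(rows, key="id", required=["id", "amount"]))
# → {'missing': 1, 'duplicate_keys': 1}
```

In practice the output would feed a reporting or alerting layer rather than `print`, so issues are detected and resolved before they reach the consumption layer.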
Posted 4 weeks ago
8.0 years
0 Lacs
India
Remote
Job Title: Senior .NET Developer
Location: Remote
Duration: 6 Months Contract
Work Hours: 12:00 PM – 9:00 PM IST (8 Hours)
Budget: ₹1,10,000 – ₹1,20,000 / Month (Fixed)
Experience Required: 7–8+ Years
Notice Period: 15–30 Days
About the Role: We are seeking a highly skilled and hands-on Senior .NET Developer with extensive experience in .NET Core, Azure Cloud Services, and Azure DevOps. This is a client-facing role that requires strong communication skills and deep technical expertise. You will be instrumental in developing scalable applications, integrating third-party APIs, and working across the entire tech stack. This position is ideal for individuals who thrive in a fast-paced, agile environment and are passionate about delivering robust software solutions with a cloud-first approach. Key Responsibilities: Develop, enhance, and maintain application features using .NET Core 5/6+, C#, REST APIs, T-SQL, and AngularJS/ReactJS. Lead or participate in designing architecture and developing end-to-end solutions. Handle application support and integrate APIs with third-party services. Collaborate with cross-functional teams to define and deliver new features. Identify and resolve technical issues, implementation dependencies, and project risks. Adhere to Agile methodologies (Scrum, Jira) and ensure timely delivery of project milestones. Document technical solutions and create reusable, testable, and efficient code. Write unit tests using xUnit or MSTest frameworks. Must-Have Skills: Minimum 6+ years of hands-on experience in C#, .NET Core (3.0/6.0+), Entity Framework, and SQL. At least 2 years of experience working with Microsoft Azure Cloud Services. Strong knowledge of Azure services: Azure Messaging (Service Bus, Event Grid, Event Hub), Azure Storage (Blob, Table, Queue), Azure Functions & Durable Functions, Azure DevOps (Classic/YAML CI/CD pipelines). Proficiency in Microservices architecture and SOA.
Expertise in MS SQL Server (Stored Procedures, Views, Functions, Cursors, Packages). Strong debugging, analytical, and problem-solving abilities. Prior experience in application support and 3rd party API integration. Strong understanding of software development life cycle (SDLC). Nice-to-Have / Secondary Skills: JavaScript frameworks (ReactJS, Angular, jQuery). API Management using APIM. Familiarity with Azure Monitoring, Application Insights, Key Vault, and SQL Azure. Hands-on experience with Git, Docker, Kubernetes. Knowledge of Azure Container Apps and Azure Container Registry.
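Third-party API integration, mentioned under application support above, usually needs retry handling for transient failures. Although this role is .NET-centric, the pattern is language-agnostic; here is a minimal sketch in Python with exponential backoff, exercised against a fake flaky endpoint invented for the example.

```python
import time

def call_with_retry(fn, attempts=3, base_delay=0.01):
    """Call `fn`, retrying with exponential backoff on any exception."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the failure
            time.sleep(base_delay * (2 ** attempt))

# Usage with a fake "third-party API" that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return {"status": "ok"}

print(call_with_retry(flaky))  # → {'status': 'ok'}
```

The equivalent in the stack this role uses would be Polly retry policies around `HttpClient` calls; the backoff shape is the same.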
Posted 4 weeks ago
12.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Company Overview Bain & Company is a global consultancy that helps the world’s most ambitious change makers define the future. Across 59 offices in 37 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. We are passionate about achieving results for our clients (our public clients have historically outperformed the stock market 4:1). We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor’s Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion and collaboration is key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents. Department Overview The Reporting and Analytics CoE is an integral part of Bain’s global PPK group (Product, Practice and Knowledge). The team is responsible for bringing various functional reporting pieces under one umbrella to provide integrated and centralized reporting to the firm leadership on various aspects of the firm's performance. Position Summary The PPK Reporting & Analytics CoE has experienced strong growth as an internal business intelligence and analytics business over the past five years. Given the expansion of the team, we are looking for a Senior Specialist to join our Global Financial Planning and Analysis team to help drive its analytical capabilities to their full potential over the next phase.
Specific Responsibilities
* Support the development of reporting solutions using Alteryx, Power BI/Tableau, and similar tools. Focus on generating actionable insights under the guidance of a supervisor, ensuring that outputs meet the needs of stakeholders.
* Assist in automating reporting processes to improve operational efficiency and support routine business activities.
* Take responsibility for independently delivering complex tasks or smaller components within larger projects. Ensure tasks are completed on time and meet quality standards, while receiving direction and feedback from the supervisor.
* Participate in cross-functional teams, offering support in analytics and business intelligence. Help gather and prepare data, and deliver insights that assist Managers and Senior Managers in shaping solutions and providing recommendations.
* Work on implementing best practices in automation and business intelligence reporting within the team. Ensure consistency and quality in deliverables, adhering to established standards.
* Contribute to innovation efforts by bringing ideas and assisting in the execution of projects that align with the firm’s goals.
* Actively engage in promoting a positive and collaborative team environment. Support initiatives led by Managers that reinforce the values and culture of the team.
* Provide guidance and mentorship to junior team members, helping them develop their skills and align their work with project goals.
* Assist in the recruitment process and contribute to the smooth onboarding of new team members, ensuring they quickly become effective members of the team.
Qualifications
* 7-9 years (graduates) or 5-7 years (post-graduates) of relevant experience in data analytics, business intelligence, and reporting solutions development, with a strong focus on leveraging tools like Alteryx, Power BI, and Tableau.
* Proficiency in using data visualization tools (e.g., Tableau, Power BI) with a solid understanding of best practices in data-driven reporting.
* Competence in SQL, with experience in optimizing queries that meet project-specific needs.
* Prior experience working in the Finance function or a basic understanding of finance function fundamentals is preferred.
* Concentration in a quantitative discipline such as Statistics, Mathematics, Engineering, Computer Science, Econometrics, or Business Analytics.
* Strong problem-solving skills with the ability to develop innovative solutions for complex data analytics challenges and contribute to the continuous improvement of reporting processes.
* Understanding of data structures, algorithms, and basic programming principles, with the ability to apply these in the context of data analytics and reporting solutions.
* Demonstrated ability to contribute to the development and deployment of reporting solutions and data analytics products that enhance business intelligence and support decision-making.
* Strong interpersonal and communication skills, with the ability to collaborate effectively with team members and articulate technical aspects of data solutions and automation techniques to colleagues.
* Demonstrated curiosity, proactivity, and critical thinking, with a commitment to learning about emerging trends in data analytics and AI, and applying these insights to improve project outcomes.
* Experience working in an agile environment, with the ability to handle changing priorities and contribute effectively to project tasks in a fast-paced setting.
* Ability to collaborate with stakeholders within a department or practice area, across different offices and regions, ensuring that analytics solutions are aligned with the specific business needs of the teams involved.
* Good to have: a working knowledge of Python, particularly for automating reporting processes and data processing tasks, and experience implementing straightforward AI models for specific tasks using tools like ChatGPT.
Posted 4 weeks ago
0.0 - 5.0 years
18 - 24 Lacs
Indira Nagar, Lucknow, Uttar Pradesh
Remote
The State Head is responsible for driving business growth, operational excellence, and team leadership across the assigned state. This role requires a strategic thinker with strong leadership, sales, and operational experience to manage end-to-end business activities, ensure achievement of targets, and represent the company in key state-level engagements. Experience with key management systems (e.g., AWS KMS, Thales, Azure Key Vault), including configuring encryption policies, managing cryptographic keys, and securing sensitive financial data. Familiarity with financial system controls, including record locking, data encryption, and audit-compliant access management using key-based security systems. Hands-on experience in locking financial periods, securing journal entries, and applying user access controls in systems such as SAP, Oracle Financials, or QuickBooks.
Roles & Responsibilities:
● Establish and manage relationships & engagement with the Distributor & ASM
● Oversight of team: organises resources, sets goals, calls out strategy to ASM on a day-to-day basis
● Managing day-to-day sales targets
● Build a strong relationship with the sales team to ensure operational cohesion & an effective sales foundation for future growth
● Taking weekly and monthly calls with Distributor & ASM
● Market billings through FOS
● Identify and drive continuous improvements and initiatives
● Coach & mentor Team Leads so that they can manage their teams better
● Hiring of Manager, ASM and ZSM
Education & Experience
● Essential Qualifications: Graduate / Post-Graduate
● Desirable Qualifications: English and Hindi language proficiency
● Minimum of 6+ years of experience in Regional or Zonal sales
● Excellent written and verbal communication
● Strong comprehension and analytical abilities
● Software Knowledge: Excel, Google Sheets & PowerPoint
● Managing & measuring work
● Travelling across the state is mandatory
Preference: Telecom Sector (Experience Required - Locking Keys Software)
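As a rough illustration of the financial-period locking called out above: a minimal sketch, with hypothetical class and field names (not tied to SAP, Oracle Financials, or QuickBooks), of a ledger that rejects journal entries posted to a locked period.

```python
# Hedged sketch of "record locking" controls: once a financial period
# is locked, no further journal entries may be posted to it.
class Ledger:
    def __init__(self):
        self.locked_periods = set()   # e.g. {"2025-06"}
        self.entries = []

    def lock_period(self, period):
        self.locked_periods.add(period)

    def post(self, period, amount, memo):
        if period in self.locked_periods:
            raise PermissionError(f"period {period} is locked")
        self.entries.append({"period": period, "amount": amount, "memo": memo})

ledger = Ledger()
ledger.post("2025-07", 1500, "July sales")
ledger.lock_period("2025-06")
try:
    ledger.post("2025-06", -200, "late adjustment")  # rejected
except PermissionError as e:
    print(e)  # → period 2025-06 is locked
```

Real systems add audit trails and key-based access controls on top of this gate, but the period check is the core rule.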
Job Type: Full-time
Pay: ₹1,800,000.00 - ₹2,400,000.00 per year
Benefits: Flexible schedule, Health insurance, Internet reimbursement, Life insurance, Provident Fund, Work from home
Schedule: Day shift
Supplemental Pay: Performance bonus, Yearly bonus
Ability to commute/relocate: Indira Nagar, Lucknow, Uttar Pradesh: Reliably commute or planning to relocate before starting work (Preferred)
Education: Master's (Preferred)
Experience: 5 years (Preferred)
Language: English, Hindi (Preferred)
Location: Indira Nagar, Lucknow, Uttar Pradesh (Preferred)
Shift availability: Day Shift (Preferred)
Willingness to travel: 50% (Preferred)
Work Location: In person
Application Deadline: 12/07/2025
Expected Start Date: 20/07/2025
Posted 4 weeks ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Location: Pune, Mumbai, Bangalore & Chennai
Experience: 10+ years
Max CTC: 18 LPA
Notice period: Immediate to 30 days
Shift Timing: 5 PM to 2 AM
Primary Skills: Oracle DBA, RAC, Data Guard, Shell Scripting, Exadata
Job Description: Minimum of 10-14 years of database administration experience administering and supporting multiple Oracle databases for performance-critical, highly available systems on UNIX/Linux platforms. L3 support. Database installation, configuration, upgrade, migration, monitoring, and administration of production databases supporting mission-critical activities. A strong candidate can rapidly troubleshoot complex technical problems under pressure and implement scalable solutions, all while managing multiple requirements. Deep knowledge of Oracle Database concepts and strong performance tuning skills are a must. OEM expertise to monitor and manage databases. Ability to gather requirements, assess, and recommend solutions to ensure high availability and stability. Must have strong written and verbal skills. Experience in Oracle security products like Data Vault, Key Vault, and Database Firewall. Design and maintain database security to meet regulatory standards. Strong SQL query tuning skills; expert in RMAN backup and recovery. Managing RAC databases, patching of databases, standby databases (Oracle Data Guard). Troubleshooting RAC or Clusterware environments. Willing to work in a 24x7 environment and participate in on-call rotation. Exadata experience is desirable. Experience automating routine DBA processes and knowledge of scripting is required. Rotational shift.
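Automating routine DBA checks, as the posting requires, often reduces to small scripts over logs and views. Below is an illustrative Python sketch that scans an alert-log excerpt (the log text is fabricated for the example) for ORA- error codes; a real script would tail the actual alert log and surface findings via OEM or email.

```python
import re

# Fabricated Oracle alert-log excerpt for illustration only.
ALERT_LOG = """\
Completed: ALTER DATABASE OPEN
ORA-00600: internal error code, arguments: [kcratr_nab_less_than_odr]
Thread 1 advanced to log sequence 42
ORA-01555: snapshot too old: rollback segment number 5
"""

def find_ora_errors(log_text):
    """Return the distinct ORA-NNNNN codes found in the log text."""
    return sorted(set(re.findall(r"ORA-\d{5}", log_text)))

print(find_ora_errors(ALERT_LOG))  # → ['ORA-00600', 'ORA-01555']
```

The same pattern extends to RMAN output or Data Guard broker logs: match the error signature, deduplicate, alert.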
Posted 4 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Are you ready to unleash the potential of AI to redefine student achievement? At 2 Hour Learning, we're not content to merely adopt AI—we've encoded it into our DNA. We've re-engineered the learning experience from the ground up, harnessing AI and learning science breakthroughs to shatter the confines of the traditional school day. The result? Students who don't just meet the bar—they vault over it. The proof of our approach is undeniable: students using our platform consistently achieve 4s and 5s on five or more AP exams, demonstrate more than two years' growth on yearly MAP assessments, and shatter their own expectations of what they can achieve. If you're energized by the opportunity to deliver transformative results at an unprecedented scale, 2 Hour Learning is where you belong. You'll serve as the chief architect behind our revolutionary learning ecosystem for your dedicated school model, which spans multiple campuses. Fueled by an unwavering commitment to student success, you'll tap into the boundless potential of AI to craft hyper-personalized learning journeys that adapt to each student's unique needs, interests, and aspirations. Your role will be a perfect blend of data-driven strategy and hands-on engagement. You'll dive deep into learning analytics, uncovering key insights to drive exponential growth. But you won't just crunch numbers from afar—you'll be on the front lines, working directly with students to understand their experiences, challenges, and triumphs. This firsthand knowledge will be invaluable as you pinpoint the specific motivational levers and pedagogical strategies to shatter achievement ceilings across all campuses. You'll empower a dynamic team of learning engineers, data scientists, and instructional innovators to bring your vision to life. But more importantly, you'll be a champion for our students, ensuring that every decision, every innovation, and every strategy is laser-focused on improving their learning outcomes. 
This is a once-in-a-generation opportunity to apply AI to education and fundamentally redefine what's possible. Armed with the predictive power of advanced learning analytics, the ability to A/B test pedagogical hypotheses at scale, and an institutional mandate to push boundaries, you'll blaze new trails daily. Your canvas is vast, your toolkit unrivaled, and your mission critical. Because at 2 Hour Learning, we're not just using AI to boost grades—we're unlocking the full force of human potential, all without traditional classroom teachers. If you're ready to harness the most disruptive technology of our time to transform the most essential building block of our society, this is your moment. Audacious thinking, rigorous execution, and an unyielding commitment to student outcomes required. Defenders of the status quo need not apply. Join us on the frontlines of the AI revolution in education. Together, we won't just shape the future of learning—we'll create it. For more information on 2 Hour Learning, visit our website [https://2hourlearning.com/] and the Future of Education Instagram page [https://www.instagram.com/futureof_education/]. To see a school built around 2 Hour Learning, check out Alpha [https://alpha.school/]. 
What You Will Be Doing Architecting and continuously enhancing an AI-driven learning ecosystem that measurably outpaces traditional education, backed by tangible gains in student achievement data Engaging directly with students through virtual platforms to understand their learning experiences, challenges, and successes, using these insights to drive continuous improvement of the learning ecosystem Mining learning platform data to surface actionable insights and design high-impact academic interventions leveraging AI/ML, learning science, and motivational best practices Championing a culture of bold experimentation and evidence-based decision-making, harnessing data to unlock step-changes in students' growth trajectories Partnering with platform engineering, data science, and design teams to translate academic insights and student feedback into seamless product enhancements What You Won’t Be Doing Repackaging traditional education in an AI wrapper. This isn't about replicating classroom instruction via screens – we're fundamentally reimagining learning from the ground up. Analyzing data in isolation. You'll be expected to regularly engage with K-12 students, valuing their feedback as essential input from our paying customers. Waiting for consensus to push boundaries. You'll champion a bold vision and rally others around data-driven results. Sticking to conventional methods. You'll be free to experiment with innovative approaches to motivation, assessment, and instruction. Fearing AI's impact on education. Here, you'll harness AI as an exciting tool to revolutionize learning, not as a threat to be mitigated. Director Of Learning Key Responsibilities Drive innovation in AI-powered, teacher-less education to deliver exceptional student outcomes across multiple campuses. Blend data analytics with regular student engagement to continuously optimize our learning ecosystem, as measured by AP exam performance and MAP assessment growth. 
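The A/B testing of pedagogical hypotheses mentioned above can be made concrete with a standard two-proportion z-test. The sketch below, with made-up mastery counts, compares success rates under two interventions; an |z| above roughly 1.96 indicates significance at the 5% level.

```python
from math import sqrt

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z-statistic for the difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)          # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # std. error
    return (p_a - p_b) / se

# Hypothetical experiment: 60% vs 50% mastery across 300 students each.
z = two_proportion_z(180, 300, 150, 300)
print(round(z, 2))  # → 2.46, so the difference is significant at 5%
```

In practice an education platform would layer this on per-student growth metrics (e.g., MAP deltas) rather than raw pass counts, but the decision rule is the same.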
Basic Requirements Master's degree or higher in Educational Science, Learning Science, Psychology, Psychometrics, Instructional Design, or a related field Leadership experience in education or EdTech Experience applying AI technologies in an educational or professional context Experience designing and implementing AI systems for tasks such as content generation, data analysis, or adaptive learning Strong understanding of learning science principles and data-driven educational approaches Proven ability to communicate complex educational and technical concepts to diverse audiences Experience leading cross-functional teams and managing complex projects About 2 Hour Learning Education is broken, but 2 Hour Learning is proving it doesn’t have to be. They’re tearing down the outdated one-size-fits-all model and replacing it with AI-driven personalized learning that helps kids master academics in just two hours a day. With students consistently ranking in the top 1-2% nationally and the top 20% achieving an astonishing 6.5x growth, they’re proving that smarter learning is possible. At 2 Hour Learning, it’s talent and performance that matter. They offer a dynamic, on-campus and remote-friendly environment where innovators, educators, and AI specialists can be a part of fixing a broken school system. 2 Hour Learning is reprogramming learning for the AI era. Here’s How They’re Fixing It. There is so much to cover for this exciting role, and space here is limited. Hit the Apply button if you found this interesting and want to learn more. We look forward to meeting you! Working with us This is a full-time (40 hours per week), long-term position. The position is immediately available and requires entering into an independent contractor agreement with Crossover as a Contractor of Record. The compensation level for this role is $100 USD/hour, which equates to $200,000 USD/year assuming 40 hours per week and 50 weeks per year. The payment period is weekly. 
Consult www.crossover.com/help-and-faqs for more details on this topic. Crossover Job Code: LJ-4549-IN-Bengalur-DirectorofLear.016
Posted 4 weeks ago
2.0 - 6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
In-depth knowledge of the document review process in platforms like Veeva Vault PromoMats and/or MedComms and other platforms. Demonstrated ability to communicate and troubleshoot challenges by collaborating with cross-functional colleagues, external vendors, and customers. Ability to prioritize tasks and manage time effectively to ensure timely delivery of projects while handling multiple tasks without compromising quality. Familiarity with different deliverable types across the Medical Affairs and commercial space. Understanding of copyright management for references, images, etc., and ensuring that L-MACH tactics are PMC compliant. Ensuring the tactics are PMC approved before they are routed for medical approval for global use and/or are uploaded to any repository. Maintaining the tactics migration tracker from SharePoint to AEM. Managing the accuracy of metadata while uploading the PMC assets onto the content gallery and tactics onto Veeva Vault for approvals. Ensuring the HE fulfilment requests are processed within the defined timeframe.
Desired Skills:
2-6 years of relevant experience
MLR review process
Effective communication and collaboration across internal and external stakeholders
Time management and stakeholder management
Good understanding of MA tactic types
Copyright and license agreement management (PMC)
Process adherence
Expertise in routing platforms such as AEM, SharePoint, Veeva Vault, Capacity Planner Tool, Wrike, etc.
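Managing metadata accuracy before upload, as described above, can be backed by a simple completeness check. The sketch below uses hypothetical field names (not Veeva's actual data model) to list the required metadata missing from an asset record before it is routed for approval.

```python
# Hypothetical required-metadata policy for an asset upload.
REQUIRED_FIELDS = {"asset_id", "asset_type", "expiry_date", "copyright_status"}

def missing_metadata(asset):
    """Return the required fields that are absent or empty on a record."""
    return sorted(f for f in REQUIRED_FIELDS if not asset.get(f))

asset = {"asset_id": "A-102", "asset_type": "banner", "expiry_date": ""}
print(missing_metadata(asset))  # → ['copyright_status', 'expiry_date']
```

A tracker (e.g., the SharePoint-to-AEM migration tracker mentioned above) could run this over every row and hold back incomplete assets from routing.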
Posted 4 weeks ago
0.0 - 12.0 years
0 Lacs
Delhi, Delhi
On-site
About us Bain & Company is a global consultancy that helps the world’s most ambitious change makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry. In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network), with its nodes across various geographies. BCN is an integral part and the largest unit of Expert Client Delivery (ECD). ECD plays a critical role as it adds value to Bain's case teams globally by supporting them with analytics and research solutioning across all industries, specific domains for corporate cases, client development, private equity diligence or Bain intellectual property. The BCN comprises Consulting Services, Knowledge Services and Shared Services. Who you will work with The Consumer Products Center of Expertise collaborates with Bain’s global Consumer Products Practice leadership, client-facing Bain leadership and teams, and with end clients on development and delivery of Bain’s proprietary CP products and solutions. These solutions aim to answer strategic questions of Bain’s CP clients relating to brand strategy (consumer needs, assortment, pricing, distribution), revenue growth management (pricing strategy, promotions, profit pools, trade terms), negotiation strategy with key retailers, optimization of COGS etc. You will work as part of the team in the CP CoE, comprising a mix of a Director, Managers, Project Leads, Associates and Analysts working to implement cloud-based end-to-end advanced analytics solutions.
Delivery models on projects vary from working as part of the CP Center of Expertise, a broader global Bain case team within the CP ringfence, or other industry CoEs such as FS / Retail / TMT / Energy / CME etc. with the BCN on a need basis. The AS is expected to have a knack for seeking out challenging problems and coming up with their own ideas, which they will be encouraged to brainstorm with their peers and managers. They should be willing to learn new techniques and be open to solving problems with an interdisciplinary approach. They must have excellent coding skills and should demonstrate a willingness to write modular, reusable, and functional code. What you’ll do Collaborate with data scientists working with Python, LLMs, NLP, and Generative AI to design, fine-tune, and deploy intelligent agents and chains-based applications. Develop and maintain front-end interfaces for AI and data science applications using React.js / Angular / Next.js and/or Streamlit / Dash, enhancing user interaction with complex machine learning and NLP-driven systems. Build and integrate Python-based machine learning models with backend systems via RESTful APIs using frameworks like FastAPI, Flask or Django. Translate complex business problems into scalable technical solutions, integrating AI capabilities with robust backend and frontend systems. Assist in the design and implementation of scalable data pipelines and ETL workflows using DBT, PySpark, and SQL, supporting both analytics and generative AI solutions. Leverage containerization tools like Docker and utilize Git for version control, ensuring code modularity, maintainability, and collaborative development. Deploy ML-powered and data-driven applications on cloud platforms such as AWS or Azure, optimizing for performance, scalability, and cost-efficiency. Contribute to internal AI/ML Ops platforms and tools, streamlining model deployment, monitoring, and lifecycle management.
Create dashboards, visualizations, and presentations using tools like Tableau/Power BI, Plotly, and Seaborn to drive business insights. Proficient with Excel and PowerPoint, demonstrating business communication proficiency in stakeholder interactions. About you A Master’s degree or higher in Computer Science, Data Science, Engineering, or related fields; Bachelor's candidates with relevant industry experience will also be considered. Proven experience (2 years for Master’s; 3+ years for Bachelor’s) in AI/ML, software development, and data engineering. Solid understanding of LLMs, NLP, Generative AI, chains, agents, and model fine-tuning methodologies. Proficiency in Python, with experience using libraries such as Pandas, NumPy, Plotly, and Seaborn for data manipulation and visualization. Experience working with modern Python frameworks such as FastAPI for backend API development. Frontend development skills using HTML, CSS, JavaScript/TypeScript, and modern frameworks like React.js; Streamlit knowledge is a plus. Strong grasp of data engineering concepts, including ETL pipelines, batch processing using DBT and PySpark, and working with relational databases like PostgreSQL, Snowflake etc. Good working knowledge of cloud infrastructure (AWS and/or Azure) and deployment best practices. Familiarity with MLOps/AI Ops tools and workflows including CI/CD pipelines, monitoring, and container orchestration (with Docker and Kubernetes). Good-to-have: Experience in BI tools such as Tableau or Power BI. Good-to-have: Prior exposure to consulting projects or the CP (Consumer Products) business domain. What makes us a great place to work We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor’s Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years.
We believe that diversity, inclusion and collaboration is key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.
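The modular, reusable pipeline work this posting describes can be illustrated with a tiny composable-ETL sketch: each step is a plain function over a list of records, chained with `reduce`. Step names and data are invented for the example; production pipelines would use DBT/PySpark as noted above, but the composition idea carries over.

```python
from functools import reduce

# Each ETL step is a pure function: records in, records out.
def drop_nulls(rows):
    """Filter out records with any null value."""
    return [r for r in rows if all(v is not None for v in r.values())]

def add_margin(rows):
    """Derive a margin column from revenue and cost."""
    return [{**r, "margin": r["revenue"] - r["cost"]} for r in rows]

def run_pipeline(rows, steps):
    """Apply the steps left to right over the record set."""
    return reduce(lambda acc, step: step(acc), steps, rows)

data = [
    {"sku": "X1", "revenue": 120, "cost": 80},
    {"sku": "X2", "revenue": None, "cost": 40},  # dropped by drop_nulls
]
result = run_pipeline(data, [drop_nulls, add_margin])
print(result)  # → [{'sku': 'X1', 'revenue': 120, 'cost': 80, 'margin': 40}]
```

Keeping steps as independent functions is what makes them individually testable and reusable across pipelines.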
Posted 4 weeks ago
12.0 years
0 Lacs
Noida, Uttar Pradesh
Remote
Location: Noida, Uttar Pradesh, India Job ID: R0098886 Date Posted: 2025-07-07 Company Name: HITACHI INDIA PVT. LTD Profession (Job Category): Other Job Schedule: Full time Remote: No Job Description: Job Title: Solution Architect Designation: Senior Company: Hitachi Rail GTS India Location: Noida, UP, India Salary: As per Industry Company Overview: Hitachi Rail is right at the forefront of the global mobility sector following the acquisition. The closing strengthens the company's strategic focus on helping current and potential Hitachi Rail and GTS customers through the sustainable mobility transition – the shift of people from private to sustainable public transport, driven by digitalization. Position Overview: We are looking for a Solution Architect who will be responsible for translating business requirements into technical solutions, ensuring the architecture is scalable, secure, and aligned with enterprise standards. The Solution Architect will play a crucial role in defining the architecture and technical direction of the existing system. You will be responsible for the design, implementation, and deployment of solutions that integrate with transit infrastructure, ensuring seamless fare collection, real-time transaction processing, and enhanced user experiences. You will collaborate with development teams, stakeholders, and external partners to create scalable, secure, and highly available software solutions. Job Roles & Responsibilities: Architectural Design: Develop architectural documentation such as solution blueprints, high-level designs, and integration diagrams. Lead the design of the system's architecture, ensuring scalability, security, and high availability. Ensure the architecture aligns with the company's strategic goals and future vision for public transit technologies.
Technology Strategy : Select the appropriate technology stack and tools to meet both functional and non-functional requirements, considering performance, cost, and long-term sustainability. System Integration : Work closely with teams to design and implement the integration of the AFC system with various third-party systems (e.g., payment gateways, backend services, cloud infrastructure). API Design & Management : Define standards for APIs to ensure easy integration with external systems, such as mobile applications, ticketing systems, and payment providers. Security & Compliance : Ensure that the AFC system meets the highest standards of data security, particularly for payment information, and complies with industry regulations (e.g., PCI-DSS, GDPR). Stakeholder Collaboration : Act as the technical lead during project planning and discussions, ensuring the design meets customer and business needs. Technical Leadership : Mentor and guide development teams through best practices in software development and architectural principles. Performance Optimization : Monitor and optimize system performance to ensure the AFC system can handle high volumes of transactions without compromise. Documentation & Quality Assurance : Maintain detailed architecture documentation, including design patterns, data flow, and integration points. Ensure the implementation follows best practices and quality standards. Research & Innovation : Stay up to date with the latest advancements in technology and propose innovative solutions to enhance the AFC system. 
Skills (Mandatory): DotNet (C#), C/C++, Java, ASP.NET Core (C#), Angular, OAuth2 / OpenID Connect (Authentication & Authorization) JWT (JSON Web Tokens) Spring Cloud, Docker, Kubernetes, Relational Databases (MSSQL) Data Warehousing SOAP/RESTful API Design, Redis (Caching & Pub/Sub) Preferred Skills (Good to have): Python, Android SSL/TLS Encryption OWASP Top 10 (Security Best Practices) Vault (Secret Management) Keycloak (Identity & Access Management) Swagger (API Documentation) NoSQL Databases, GraphQL, gRPC, OpenAPI, Istio, Apache Kafka, RabbitMQ, Consul, DevOps & CI/CD Tools Tools & Technologies: UML (Unified Modeling Language) Lucidchart / Draw.io (Diagramming) PlantUML (Text-based UML generation) C4 Model (Software architecture model), Enterprise Architect (Modeling), Apache Hadoop / Spark (Big Data), Elasticsearch (Search Engine), Apache Kafka (Stream Processing), TensorFlow / PyTorch (Machine Learning/AI) Education: Bachelor's or Master’s degree in Computer Science, Information Technology, or a related field. Experience Required: 12+ years of experience in solution architecture or software design. Proven experience with enterprise architecture frameworks (e.g., TOGAF, Zachman). Strong understanding of cloud platforms (AWS, Azure, or Google Cloud). Experience in system integration, API design, microservices, and SOA. Familiarity with data modeling and database technologies (SQL, NoSQL). Strong communication and stakeholder management skills. Preferred: Certification in cloud architecture (e.g., AWS Certified Solutions Architect, Azure Solutions Architect Expert). Experience with DevOps tools and CI/CD pipelines. Knowledge of security frameworks and compliance standards (e.g., ISO 27001, GDPR). Experience in Agile/Scrum environments. Domain knowledge in [insert industry: e.g., finance, transportation, healthcare]. Soft Skills: Analytical and strategic thinking. Excellent problem-solving abilities. Ability to lead and mentor cross-functional teams. 
Strong verbal and written communication.
Posted 4 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
At Takeda, we are guided by our purpose of creating better health for people and a brighter future for the world. Every corporate function plays a role in making sure we — as a Takeda team — can discover and deliver life-transforming treatments, guided by our commitment to patients, our people and the planet. People join Takeda because they share in our purpose. And they stay because we’re committed to an inclusive, safe and empowering work environment that offers exceptional experiences and opportunities for everyone to pursue their own ambitions. Job ID R0143634 Date posted 07/07/2025 Location Bengaluru, Karnataka I understand that my employment application process with Takeda will commence and that the information I provide in my application will be processed in line with Takeda’s Privacy Notice and Terms of Use. I further attest that all information I submit in my employment application is true to the best of my knowledge. Job Description The Future Begins Here At Takeda, we are leading digital evolution and global transformation. By building innovative solutions and future-ready capabilities, we are meeting the needs of patients, our people, and the planet. Bengaluru, the city which is India’s epicenter of innovation, has been selected to be home to Takeda’s recently launched Innovation Capability Center. We invite you to join our digital transformation journey. In this role, you will have the opportunity to boost your skills and become the heart of an innovative engine that is contributing to global impact and improvement. At Takeda’s ICC we Unite in Diversity. Takeda is committed to creating an inclusive and collaborative workplace, where individuals are recognized for the backgrounds and abilities they bring to our company. We are continuously improving our collaborators’ journey in Takeda, and we welcome applications from all qualified candidates. Here, you will feel welcomed, respected, and valued as an important contributor to our diverse team. 
OBJECTIVES/PURPOSE As the Global Platform Manager for our Digital Assessment Management Platform working in the Content Innovation Platforms team, you will own and manage the design, development, enablement, and support of Takeda’s Digital Assessment Management Technologies. In this role, you will drive Takeda’s Data and Digital mission through the development and adoption of Digital Assessment Management capabilities that align to business needs, best practices as well as enterprise architecture principles. Drive innovation and continuous improvement of our Digital Assessment Management capabilities in the form of standing up new platforms, tools, or system integrations according to business demand, prioritized use cases as well as enhancing the existing technology stack. Enable stakeholders (both within and outside of Takeda) to consume Takeda’s Global Digital Assessment Management capabilities. Oversee day-to-day operation of our Digital Assessment Management, including platforms and services, to ensure all components work reliably and securely. Socialize changes to existing and new capabilities, tools, components, and consumption models to stakeholders (both within and outside of Takeda) to drive Takeda’s Digital Assessment Management capabilities adoption and stakeholder satisfaction. Prioritize a global backlog of Digital Assessment Management product enhancements and operational needs and manage a DevOps Platform POD within our delivery center to deliver on the prioritized backlog. ACCOUNTABILITIES Partner with global and regional stakeholders to understand business strategy and design solutions aligned to business needs. Develop and own the Digital Assessment Management capabilities roadmap driven by business demand and strategic imperatives. Own and deliver roadmap initiatives in partnership with internal and external stakeholders through agile delivery teams. 
Drive alignment of proposed solutions according to enterprise standards as well as alignment with Takeda architectural, security, privacy, and quality standards. Lead the business case development process by designing and documenting proposed solutions in alignment with business demand. Provide technical and functional support for Digital Assessment Management capabilities. Partner with DD&T and business stakeholders to enable the consumption of Takeda’s Global Digital Assessment Management capabilities. Develop scalable services and platform operational/governance processes to ensure smooth day-to-day operations in partnership with IT, business, and other key stakeholders. Act as the SME (architectural consulting, providing insight into industry trends) for all Digital Assessment Management projects and capabilities using technologies like Adobe/Aprimo/Bynder/Veeva PromoMats, in collaboration with various stakeholders. Establish and institutionalize regular end-to-end KPI reviews for solution adoption, and a continuous solution improvement process. OTHER DIMENSIONS AND ASPECTS Knowledge of Digital Asset Management platforms such as Adobe, Aprimo, Bynder, OpenText, Veeva PromoMats and Veeva Vault. Business acumen in content design, creation, automation, sharing, tagging, reuse and approvals with a focus on the Life Sciences industry. Able to engage with all levels of the organization and be proficient at building out clearly defined business requirements during discussions with stakeholders. Able to clearly communicate and foster alignment across all levels of the organization. Build strong cross-functional relationships with team members in other enterprise functions like System Integrators, Digital Leads, Enterprise Architects. Strong ability to build external partnerships with industry partners and suppliers. Able to generate breakthrough solutions and enable others to do so. 
Day-to-day decisions regarding design, development, and implementation of innovative content solutions across Takeda globally. Able to present issues and recommend solutions in a succinct manner to Global, Senior and Executive Management on a frequent basis. Recommend and monitor spend for project budgets. EDUCATION, BEHAVIOURAL COMPETENCIES AND SKILLS: Essential Bachelor's degree in Computer Science and/or Digital Marketing related fields or equivalent experience. 5+ years of Aprimo experience implementing Digital Asset Management solutions at an enterprise level in a regulated environment. Deep experience in Veeva Vault and PromoMats. Strong “soft skills” and English communication skills. Deep experience with enterprise end-to-end processes in the content management life cycle. Experience of working with global teams and experience of working in an international, multi-country, multi-cultural environment. Experience working in a life sciences environment strongly preferred. Experience working in an agile environment strongly preferred. Nice to have: GenAI knowledge/experience. Nice to have: Adobe Assets knowledge/experience. ADDITIONAL INFORMATION International travel of up to 10% may be required for this position. What Takeda Can Offer You Takeda is certified as a Top Employer, not only in India, but also globally. No investment we make pays greater dividends than taking good care of our people. At Takeda, you take the lead on building and shaping your own career. Joining the ICC in Bangalore will give you access to high-end technology, continuous training and a diverse and inclusive network of colleagues who will support your career growth. Benefits It is our priority to provide competitive compensation and a benefit package that bridges your personal life with your professional career. 
Amongst our benefits are: Competitive Salary + Performance Annual Bonus Flexible work environment, including hybrid working Comprehensive Healthcare Insurance Plans for self, spouse, and children Group Term Life Insurance and Group Accident Insurance programs Health & Wellness programs Employee Assistance Program 3 days of leave every year for Voluntary Service in addition to Humanitarian Leaves Broad Variety of learning platforms Diversity, Equity, and Inclusion Programs Reimbursements – Home Internet & Mobile Phone Employee Referral Program Leaves – Paternity Leave (4 Weeks), Maternity Leave (up to 26 weeks), Bereavement Leave (5 days) About ICC in Takeda Takeda is leading a digital revolution. We’re not just transforming our company; we’re improving the lives of millions of patients who rely on our medicines every day. As an organization, we are committed to our cloud-driven business transformation and believe the ICCs are the catalysts of change for our global organization. #Li-Hybrid Locations IND - Bengaluru Worker Type Employee Worker Sub-Type Regular Time Type Full time
Posted 4 weeks ago
0.0 - 31.0 years
1 - 1 Lacs
RV Nagar, Vellore
On-site
Prepare Daily Route Plan (DRP) for withdrawal and loading, in consultation with Route Leader Pick-up cash based on bank wise indent and denomination Prepare cash receipt and take approval from Route Leader Complete allocated site visits for loading or EOD on daily basis Resolve the FLM (First Level Maintenance) issues within TAT Punch all the required entries in Android device timely and accurately such as - withdrawal, loading, EOD, FLM Punch the counter entry in CBS and submit the same to Route Leader Inform Route Leader in case of any issue and enter the same in Android device Submit the return cash to vault, in cases when loading could not happen due to unavoidable circumstances Prepare CBR (Cash Balance Report) when mandatory as per SLA (e.g. for HDFC bank) and submit to Route Leader Coordinate with the Reports team, as required, for mandatory customer reports
Posted 4 weeks ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Link to Apply is at the bottom of the JD. Please read the requirements carefully before applying.

About LimeChat

What are we looking for? We're seeking a skilled Software Development Engineer who excels in creating seamless connections between various systems and applications. If you're passionate about coding, love solving integration challenges, and enjoy working with APIs, you might be the ideal candidate. In this role, you'll collaborate closely with our development and product teams to design and implement robust integration solutions. Your work will ensure that different systems communicate effectively, creating a cohesive and efficient technology ecosystem. Join us to build integration solutions that enhance our technical capabilities and drive business success.

Responsibilities: Own client go-lives: translate enterprise requirements into working GenAI deployments, scoping, coding, testing, and shipping on aggressive timelines. Build custom integrations: craft REST/webhook connectors, event pipelines, and data syncs that plug our agents into CRMs, OMS, ERPs, and bespoke systems. Engineer GenAI workflows: orchestrate RAG pipelines, prompt chains, and fine-tuning jobs using tools like LangChain and PGVector. Design deployment architectures: spin up scalable, secure clusters on AWS/GCP with Docker, Kubernetes, and Terraform/Helm. Instrument & optimise: track latency, token spend, and error budgets; debug across logs (Kibana), tracing, and databases to keep uptime above 99.9%. Create playbooks: document repeatable patterns, SDKs, and IaC modules so future deployments are faster and safer. Partner cross-functionally: work with Customer Success, Product, and Core Engineering to shape the roadmap and close feedback loops.

Must-haves: 1–3 years building backend/integration solutions for SaaS or platform products. Proficient in Python or Node.js (FastAPI/Django/Express) and strong in RESTful API design. Solid database chops: SQL (PostgreSQL/MySQL) and NoSQL (MongoDB/Redis); comfortable writing performant queries and schema migrations. Hands-on with Docker containers, Kubernetes/ECS, and at least one cloud provider (AWS or GCP). Experience with Git, code reviews, and CI/CD pipelines (GitLab CI, Jenkins, or GitHub Actions). Debugging ninja: can trace requests across services, logs, and message queues to isolate bottlenecks quickly. Excellent written & verbal communication: able to simplify complex tech for clients and teammates.

Nice-to-have: Familiarity with LLM/GenAI tooling (LangChain, LlamaIndex, Haystack) and vector stores (PGVector, Pinecone). Event-driven architecture experience (Kafka, RabbitMQ, SNS/SQS). Infra-as-Code with Terraform or Pulumi; observability stacks (Grafana, Prometheus). Security & compliance basics (SOC 2, ISO 27001) and secrets management (Vault, AWS KMS).

Tech Stack @ LimeChat: Python, Node.js, FastAPI/Django, PostgreSQL, MongoDB, Redis, RabbitMQ, Docker, Kubernetes, AWS/GCP, Terraform, LangChain, PGVector.

Growth Path: Year 1: launch 10+ enterprise deployments, build reusable integration SDKs, reduce average go-live from 4 weeks to 1 week. Year 2+: lead a Deployment squad, shape platform extensibility, or transition into Product/Platform Engineering.

Perks & Benefits: Unlimited PTO & sick leave. Subsidised fitness membership. Free lunch & snacks. Annual company retreat. Pet-friendly office 🐾.

Apply Here: Does this role sound like a good fit? Apply here and choose “Software Development Engineer (SDE)” in the dropdown.
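The webhook-connector and data-sync work described in this posting can be sketched in miniature. The field mapping, payload shape, and retry policy below are hypothetical, assumed purely for illustration; a real connector would be driven by per-client configuration and an actual HTTP delivery layer rather than an in-memory callback.

```python
import json
import time

# Hypothetical mapping from a CRM webhook payload to an internal order
# schema; real connectors would load this from per-client config.
FIELD_MAP = {"contact_email": "email", "order_ref": "order_id", "amount_paid": "amount"}

def transform_event(raw: str) -> dict:
    """Parse a CRM webhook body and map it onto the internal schema."""
    payload = json.loads(raw)
    return {internal: payload[external] for external, internal in FIELD_MAP.items()}

def sync_with_retry(raw: str, deliver, max_attempts: int = 3) -> dict:
    """Deliver a transformed event, retrying with exponential backoff."""
    event = transform_event(raw)
    for attempt in range(max_attempts):
        try:
            deliver(event)  # in production: an HTTP POST to the downstream system
            return event
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise
            time.sleep(2 ** attempt * 0.01)  # 10ms, 20ms, ... (kept short for the demo)
    return event

body = '{"contact_email": "a@b.com", "order_ref": "SO-1", "amount_paid": 499}'
received = []
print(sync_with_retry(body, received.append))
```

Keeping the transform pure (a plain function of the raw payload) is what makes connectors like this easy to unit-test and reuse across clients; only the delivery step touches the network.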
Posted 4 weeks ago
0 years
5 - 9 Lacs
Bengaluru
On-site
Req ID: 330864 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Senior DevOps Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Duties: DevOps experience establishing and managing CI/CD pipelines to automate build, test, and deployment processes. Experience provisioning and managing cloud infrastructure resources using tools like Terraform. Experience with Azure Databricks, Azure DevOps tools, Terraform / Azure Resource Manager, and containerization and orchestration with Docker and Kubernetes. Version control experience with Git or Azure Repos. Scripting automation with Azure CLI/PowerShell. Must have: Proficiency in cloud technologies: Azure, Azure Databricks, ADF, CI/CD pipelines, Terraform, HashiCorp Vault, GitHub, Git. Preferred: Containerization and orchestration with Docker and Kubernetes, IAM, RBAC, OAuth, change management, SSL certificates. Knowledge of security best practices and compliance frameworks like GDPR or HIPAA. Minimum Skills Required: same as the Job Duties above. 
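As a rough illustration of the pipeline-automation work this role covers, the sketch below renders a Databricks Jobs API 2.1 create-job request body per environment. The environment defaults and Spark version are assumptions made up for the example, not real project values; an actual pipeline would source them from Terraform variables or an Azure DevOps variable group and POST the result with an authenticated client.

```python
import json

# Hypothetical per-environment defaults; real values would come from a
# Terraform module or an environment-specific variables file.
ENV_DEFAULTS = {
    "dev": {"node_type_id": "Standard_DS3_v2", "num_workers": 1},
    "prod": {"node_type_id": "Standard_DS4_v2", "num_workers": 4},
}

def job_payload(name: str, notebook_path: str, env: str) -> str:
    """Render a Databricks Jobs API 2.1 create-job request body for one env."""
    cluster = {"spark_version": "13.3.x-scala2.12", **ENV_DEFAULTS[env]}  # assumed runtime
    payload = {
        "name": f"{name}-{env}",
        "tasks": [{
            "task_key": "main",
            "notebook_task": {"notebook_path": notebook_path},
            "new_cluster": cluster,
        }],
    }
    return json.dumps(payload, indent=2)

# A CI/CD step would send this body to POST /api/2.1/jobs/create.
print(job_payload("nightly-etl", "/Repos/data/etl", "prod"))
```

Generating the request body from code (or Terraform) rather than hand-editing JSON is what keeps dev and prod configurations from drifting apart.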
About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Posted 4 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Privileged Account Management (PAM) Implementation Specialist – CyberArk Location: Chennai, India Employment Type: Full-Time Experience Level: 5+ Years About the Role: We are looking for a Privileged Account Management (PAM) Implementation Specialist with expertise in CyberArk PAM solutions (Vault, PVWA, CPM, and PSM) to join our dynamic IT security team. In this role, you will be responsible for leading the implementation, configuration, and support of CyberArk PAM solutions to ensure optimal security for privileged accounts, credentials, and access management across enterprise environments. Key Responsibilities: CyberArk PAM Implementation: Lead and manage end-to-end implementation of CyberArk PAM solutions (Vault, PVWA, CPM, and PSM) for clients and internal systems. Solution Design & Architecture: Collaborate with cross-functional teams to design scalable and robust CyberArk PAM architectures tailored to client requirements, ensuring integration with existing security frameworks. Deployment & Configuration: Oversee the deployment, configuration, and fine-tuning of CyberArk components, ensuring seamless integration with infrastructure, identity management systems, and other security tools. Security Risk Mitigation: Ensure adherence to best practices in managing privileged access, mitigate risks related to privileged accounts, and implement security controls to protect sensitive data. Troubleshooting & Support: Provide troubleshooting support, resolution of implementation issues, and guidance on CyberArk PAM functionalities to internal teams and clients. Documentation & Reporting: Prepare detailed project documentation, including architecture diagrams, implementation plans, and post-implementation support plans. Provide regular reports on progress and outcomes. Continuous Improvement: Stay updated on CyberArk product updates and evolving security threats, recommending improvements to strengthen security posture and efficiency. 
Training & Knowledge Transfer: Conduct training sessions and knowledge transfer workshops for clients and internal teams, enabling effective utilization of CyberArk PAM tools. Required Skills & Qualifications: Minimum 5 years of experience in implementing and managing CyberArk PAM solutions (Vault, PVWA, CPM, PSM). In-depth knowledge of CyberArk architecture, installation, configuration, and deployment processes. Hands-on experience with Windows, Linux, and Unix environments, and integration of CyberArk with these systems. Familiarity with IAM (Identity and Access Management) and other security solutions. Strong understanding of privileged account security best practices and access management frameworks. Ability to troubleshoot and resolve complex CyberArk issues. CyberArk Certified Delivery Engineer (CDE) or CyberArk Certified Trustee certification is highly desirable. Excellent communication skills with the ability to explain technical concepts to both technical and non-technical stakeholders. Ability to work independently and collaborate effectively with a global team. Preferred Skills: Experience with IAM solutions such as Okta, Microsoft Identity, or SailPoint. Knowledge of scripting languages (e.g., PowerShell, Python) for automation tasks. Experience with SIEM solutions like Splunk or QRadar for security event monitoring and analysis. Why Join Us? Growth Opportunities: Be part of a rapidly growing security team and take your career to the next level with exposure to cutting-edge technologies. Collaborative Environment: Work in a diverse and inclusive environment with opportunities for continuous learning. Competitive Benefits: We offer a comprehensive benefits package, including health insurance, paid time off, and flexible working arrangements.
Posted 4 weeks ago
0.0 - 10.0 years
0 Lacs
Vadodara, Gujarat
On-site
Senior Design Engineer GEA is one of the largest suppliers for the food and beverage processing industry and a wide range of other process industries. Approximately 18,000 employees in more than 60 countries contribute significantly to GEA’s success – come and join them! We offer interesting and challenging tasks, a positive working environment in international teams and opportunities for personal development and growth in a global company. Why join GEA Job information Reference Number JR-0034026 Job function Engineering Position type Full time Site Block No 8, P.O. Dumad, Savli Road, Vadodara-391740, Gujarat Your responsibilities and tasks: General Work cooperatively with the team of engineers and technical staff in the GEA Global Entities in the preparation of quotations and execution of projects as and when required. Develop the mechanical design according to given sketches, P&IDs, layout and specifications under the direction of the engineering team. Create drawings from basic 2D up to complete 3D modelling required to complete a project, developing a layout of the system, components and parts. This includes design verification and validation. Participate in the review of plant, equipment and structures as required. Schedule and control own as well as assigned resources’ workflow to meet deadline commitments. Undertake control of drafting for small to medium sized projects as required. Review and submit the drawings prepared by resources against requirements. Train/mentor new team members. Be at the forefront of 3D plant design using the latest proven software and hardware technologies, bringing the most modern GEA image possible to our clients, reducing costs, delivery time and risk associated with lack of definition. Work cooperatively with the Company administrative staff. High proficiency in a 3D software program, preferably Autodesk Inventor. Proficiency in AutoCAD Mechanical is a must. 
Knowledge about working with Autodesk Vault or Vault PRO (local regions define). Basic understanding of piping and piping specifications. Administration Prepare weekly timesheets and project reports. Assist in other parts of the Company's business as required to ensure the overall effectiveness of the Company. Your profile and qualifications: Overview The Senior Design Engineer must be self-motivated, detail oriented, and possess good English language communication skills, both verbal and written. Must possess good interpersonal skills and work well in a team setting as well as independently as required. Educational Qualification: Diploma/Degree in Mechanical Engineering Years and Type of Experience: 7-10 years of experience in Autodesk Inventor Professional, AutoCAD (machine design, P&ID, skid engineering, redesign and optimization experience preferred) Specific Skills / Knowledge: Autodesk Inventor Professional, Autodesk AutoCAD, Autodesk Vault Pro International codes and standards (FDA, 3A, ANSI, DIN, ISO, ASTM, ASME) Understanding of process documents like Process Flow Diagram, P&ID and Equipment List, overall plot plan, civil building drawings. Working knowledge of frame/stress analysis in Autodesk Inventor.
Posted 4 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Required Skills Proficiency in multiple programming languages, ideally Python. Proficiency in at least one cluster computing framework (preferably Spark, alternatively Flink or Storm). Proficiency in at least one cloud data lakehouse platform (preferably AWS data lake services or Databricks, alternatively Hadoop), at least one relational data store (Postgres, Oracle or similar) and at least one NoSQL data store (Cassandra, Dynamo, MongoDB or similar). Proficiency in at least one scheduling/orchestration tool (preferably Airflow, alternatively AWS Step Functions or similar). Proficiency with data structures, data serialization formats (JSON, AVRO, Protobuf, or similar), big-data storage formats (Parquet, Iceberg, or similar), data processing methodologies (batch, micro-batching, and stream), one or more data modelling techniques (Dimensional, Data Vault, Kimball, Inmon, etc.), Agile methodology (develop PI plans and roadmaps), TDD (or BDD) and CI/CD tools (Jenkins, Git). Strong organizational, problem-solving and critical thinking skills; strong documentation skills. Preferred Skills Proficiency in IaC (preferably Terraform, alternatively AWS CloudFormation).
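Of the data processing methodologies this posting lists, micro-batching is the easiest to illustrate with a stdlib-only sketch: group an unbounded event stream into fixed-size batches. Engines like Spark Structured Streaming do this internally per trigger interval, so this shows only the core idea, not any specific framework's API.

```python
from itertools import islice
from typing import Iterable, Iterator, List

def micro_batches(stream: Iterable, size: int) -> Iterator[List]:
    """Group a (possibly unbounded) event stream into fixed-size micro-batches."""
    it = iter(stream)
    while True:
        batch = list(islice(it, size))  # pull up to `size` events
        if not batch:
            return  # stream exhausted
        yield batch

# Seven events in batches of three: two full batches plus a final partial one.
print(list(micro_batches(range(7), 3)))  # → [[0, 1, 2], [3, 4, 5], [6]]
```

Because the function consumes a plain iterator, it works equally well over a finite list (batch mode) or a generator fed by a live source (streaming mode), which is exactly the batch/micro-batch distinction the skill list draws.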
Posted 4 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are a technology-led healthcare solutions provider. We are driven by our purpose to enable healthcare organizations to be future-ready. We offer accelerated, global growth opportunities for talent that's bold, industrious, and nimble. With Indegene, you gain a unique career experience that celebrates entrepreneurship and is guided by passion, innovation, collaboration, and empathy. To explore exciting opportunities at the convergence of healthcare and technology, check out www.careers.indegene.com. Looking to jump-start your career? We understand how important the first few years of your career are, which create the foundation of your entire professional journey. At Indegene, we promise you a differentiated career experience. You will not only work at the exciting intersection of healthcare and technology but also be mentored by some of the most brilliant minds in the industry. We are offering a global fast-track career where you can grow along with Indegene's high-speed growth. We are purpose-driven. We enable healthcare organizations to be future-ready and our customer obsession is our driving force. We ensure that our customers achieve what they truly want. We are bold in our actions, nimble in our decision-making, and industrious in the way we work. Must Have Role: DAM Librarian Description: You will be responsible for: Coordinating with Brand teams/agencies/internal stakeholders on completeness of metadata requirements, source files etc. Conducting regular cleanup/hygiene activities. Enabling Brand Portals: creating portals as per requirements and guiding Portal administrators to manage the Portals effectively. Performing quality checks on all proposed content for the Content Hub. Uploading the Content to the Platform (including InDesign/source files). Managing the Component Library, Claims Library and Digital Rights Management/Taxonomy. 
Prepare Content performance dashboards and reports as requested by the Client, and provide Content Hub and/or Veeva Vault portal maintenance. Creating Reporting Dashboards and maintaining them. Desired Profile (Key Skills) Proficient in understanding metadata and taxonomy structures of digital assets. Good understanding of the end-to-end Digital Asset Management lifecycle. Good understanding of business workflows and asset management standard practices. Veeva, PromoMats, DAM & strong written and verbal communication skills. Good to have. EQUAL OPPORTUNITY Indegene is proud to be an Equal Employment Employer and is committed to a culture of Inclusion and Diversity. We do not discriminate on the basis of race, religion, sex, colour, age, national origin, pregnancy, sexual orientation, physical ability, or any other characteristics. All employment decisions, from hiring to separation, will be based on business requirements, the candidate's merit and qualifications. We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, gender identity, sexual orientation, disability status, protected veteran status, or any other characteristics. Locations Bangalore, KA, IN
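The metadata quality checks this role describes can be sketched as a simple required-field validation. The fields below are hypothetical, invented for the example: real DAM taxonomies (e.g. in Veeva Vault PromoMats) are configured per brand and far richer than this.

```python
# Hypothetical required-metadata rules for a DAM intake quality check;
# a real taxonomy would be loaded from the platform's configuration.
REQUIRED_FIELDS = {"asset_id", "brand", "asset_type", "expiry_date"}

def missing_metadata(asset: dict) -> set:
    """Return required fields that are absent or blank on an asset record."""
    return {f for f in REQUIRED_FIELDS if not str(asset.get(f, "")).strip()}

# An asset with a blank field and a missing field fails the check on both.
asset = {"asset_id": "A-102", "brand": "BrandX", "asset_type": ""}
print(sorted(missing_metadata(asset)))  # → ['asset_type', 'expiry_date']
```

Running a check like this before upload is what keeps blank or missing tags from reaching the Content Hub, where they would break search and reporting downstream.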
Posted 4 weeks ago
0 years
0 Lacs
Balangir, Odisha, India
On-site
Company Description Vault Agritech Private Limited, established in December 2017, focuses on the production of processed foods from agricultural produce using the latest technology. The company's first project is ‘Pasta’ with state-of-the-art Italian technology provided by Axor Ocrim Pasta machinery, a pioneer in pasta machinery manufacturing. Located in Balangir, Odisha, the plant benefits from a climate favorable for production and a strategic location for PAN India distribution and exports. Role Description This is a full-time, on-site role for a Production Specialist located in Balangir. The Production Specialist will be responsible for overseeing daily production operations, ensuring quality control, managing production planning, training staff, and maintaining effective communication within the team and with other departments. Qualifications Production Planning and Production Management skills Quality Control expertise Strong Communication skills Experience in Training staff Ability to work effectively on-site and as part of a team Relevant experience in the food processing industry is a plus Bachelor's degree in Engineering, Food Technology, or related field
Posted 4 weeks ago