4.0 - 8.0 years
0 Lacs
Karnataka
On-site
You have 4 to 8 years of experience in Classic AUTOSAR SW-C development with strong Embedded C knowledge. You should be proficient in the Vector stack, RTE concepts, CANoe configuration, scripting, and TRACE32. Expert-level knowledge of Python is also required, with hands-on experience using Pandas and Pickle. Familiarity with ElementTree (XML) parsing and Jinja is highly valued. This role is located in Bangalore.
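The listing above pairs ElementTree XML parsing with Pickle. As a rough illustrative sketch (the XML fragment and tag names below are invented for illustration, not taken from any real AUTOSAR schema), parsing a fragment and caching the parsed result with the standard library might look like:

```python
import pickle
import xml.etree.ElementTree as ET

# Hypothetical component description; element and attribute names are invented.
xml_text = """
<SWC>
  <PORT name="SpeedIn" direction="in"/>
  <PORT name="TorqueOut" direction="out"/>
</SWC>
"""

root = ET.fromstring(xml_text)
ports = [(p.get("name"), p.get("direction")) for p in root.iter("PORT")]

# Round-trip the parsed result through pickle, as one might cache it between runs.
blob = pickle.dumps(ports)
restored = pickle.loads(blob)
assert restored == [("SpeedIn", "in"), ("TorqueOut", "out")]
```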
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
Join Fortinet, a cybersecurity pioneer with over two decades of excellence, as you help shape the future of cybersecurity and redefine the intersection of networking and security. At Fortinet, the mission is to safeguard people, devices, and data everywhere. We are currently seeking a dynamic Professional Services Consultant to contribute to the success of our rapidly growing business. As a Professional Services Consultant, you will provide exceptional service to loyal customers. The ideal candidate is energetic, passionate about working for Fortinet and supporting customers, and builds strong, respected relationships with internal and external stakeholders. This is a great opportunity to excel in an innovative, fast-paced environment while expanding your knowledge and developing skills in network security.

Responsibilities:
- Develop a clear overall understanding of the customer engagement: objectives, project scope, business and technical requirements, and DevOps customization requirements.
- Participate in design and scoping meetings, providing input to statements of work based on accurate analysis of customers' requirements.
- Take an active role in all phases of a typical project delivery: design, development, testing and validation, implementation, and customization.
- Develop project-related documents such as architecture documents and operations guides.
- Assist customers through the rollout of the proposed solution and conduct knowledge-transfer sessions for customer staff.
- Complete and submit required internal administrative tasks such as timesheets and project reports.
- Continuously develop knowledge and skills to remain proficient in SOAR and DevOps activities, Fortinet technologies, products, services, and security.

Requirements:
- Development skills and experience following industry-standard development methodologies.
- Experience in customer-facing roles, with very good presentation and technical-documentation skills.
- Ability to adapt seamlessly to shifting priorities, demands, and timelines, with flexible working hours.
- Positive customer-service attitude and very good soft skills.
- Strong commitment; a self-driven individual able to work independently and collaborate with cross-functional teams.
- Ability and desire to learn new languages and technologies; proficient in exploring and integrating new technologies, programming languages, or frameworks.
- Working understanding of common network topologies and hardware; fundamental knowledge of common Internet protocols and security threats.
- Hands-on experience building, administering, and maintaining servers; strong experience in software programming and development, including scripting.
- Experience with API integration, the HTTP protocol, mail systems, and Linux systems.
- Experience with industry-standard continuous-integration/automation tools and frameworks.
- Experience developing network/infrastructure and automation projects, and with at least one major cloud infrastructure.

About Our Team: Our team culture emphasizes collaboration, continuous improvement, customer-centricity, innovation, and accountability. By embedding these values into our ethos, we create a dynamic and supportive environment that drives excellence and innovation while maintaining a strong focus on customers' needs and satisfaction.

Why Join Us: Fortinet encourages candidates from all backgrounds and identities to apply, and offers a supportive work environment and a competitive Total Rewards package to support overall health and financial well-being. Embark on a challenging, enjoyable, and rewarding career journey with Fortinet and join us in delivering solutions that make a meaningful and lasting impact for customers around the globe.
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
Thiruvananthapuram, Kerala
On-site
As a Product Manager at our company located in Trivandrum, you will play a pivotal role in shaping the future of our products. Your primary responsibility will be to define the product vision, strategy, and roadmap in alignment with customer needs and business objectives. Collaborating with cross-functional teams including engineering, design, marketing, and sales, you will oversee the entire product lifecycle from conceptualization to launch and post-launch activities. Your role will also involve conducting market research, gathering user feedback, and continuously monitoring product performance to identify areas for improvement. To excel in this role, you should possess a Bachelor's degree in Business, Engineering, Computer Science, or related field, with at least 5 years of experience in product management or similar roles. Proficiency in Agile development methodologies, a track record of successfully delivering digital products, and exceptional communication and leadership skills are essential. An analytical mindset, coupled with the ability to leverage data and insights for making informed product decisions, will be critical for your success. Preferred skills for this role include familiarity with product management tools such as Jira, Trello, and Confluence, an understanding of UI/UX design principles, and experience with programming languages like Python, C, Java, and C++. Exposure to concepts such as Embedded Systems, RTOS, IoT platforms, SQL, NoSQL databases, cloud computing architecture, Full Stack Web Development, Cybersecurity, and Machine Learning will be advantageous. If you are someone who thrives in a fast-paced, collaborative environment and is passionate about driving product innovation, we encourage you to apply for this position. This is a full-time, permanent role based in Trivandrum, Kerala, requiring in-person work. We are looking for individuals who can reliably commute or are willing to relocate to Trivandrum before starting work. 
We are excited about the possibility of you joining our team and contributing to our mission. Kindly specify your availability to start (in days).
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
As a Web App Developer at our company, you will be responsible for designing the overall architecture of web applications and implementing a robust set of services and APIs to power them. Your role will involve building reusable code and libraries for future use, optimizing applications for maximum speed and scalability, and implementing security and data-protection measures. You will also translate UI/UX wireframes into visual elements and integrate the front-end and back-end aspects of web applications.

To excel in this role, you should possess excellent verbal and written communication skills, be able to work independently while also being a team player, and have proficient knowledge of at least one back-end programming language such as PHP, Node.js, Python, Ruby, Java, .NET, or JavaScript. Understanding the differences between various delivery platforms and optimizing output accordingly will be crucial, as will familiarity with server-side templating languages and CSS preprocessors. Experience in building responsive web applications, handling data migration and scripting, managing hosting environments, and implementing automated testing platforms are all desirable skills for this position. Additionally, you should have a good understanding of advanced JavaScript libraries and frameworks, client-side scripting, and code-versioning tools. Basic knowledge of image-authoring tools and familiarity with development-aiding tools will also be beneficial.

If you believe you are a suitable candidate and meet the qualifications above, please send your updated resume to hr@mobiuso.com. We will review your application and contact you if we find you are a good potential match for the role.
Posted 1 week ago
5.0 - 10.0 years
20 - 35 Lacs
Pune
Remote
Work Hours: 4:30 PM to 1:30 AM IST
Experience Required: 8+ Years

Role Summary: We are seeking an experienced DBT Engineer with strong experience in Azure Cloud, DBT (Data Build Tool), and Snowflake. The ideal candidate will have a solid background in building scalable data pipelines, designing efficient data models, and enabling advanced analytics.

Key Responsibilities:
- Design and maintain scalable ETL pipelines with DBT and SQL, ensuring high performance and reliability.
- Develop advanced DBT workflows using artifact files, graph variables, and complex macros leveraging run_query.
- Implement multi-repo or mesh DBT setups to support scalable and collaborative workflows.
- Utilize DBT Cloud features such as documentation, Explorer, the CLI, and orchestration to optimize data processes.
- Build and manage CI/CD pipelines to automate and enhance data-deployment processes.
- Write and optimize complex SQL queries to transform large datasets and ensure data accuracy.
- Collaborate with cross-functional teams to integrate data solutions into existing workflows.
- Troubleshoot and resolve pipeline errors caused by DBT code or transformation issues.
- Adhere to best practices for version control, using git-flow workflows to manage and deploy code changes.
- Ensure code quality and maintainability by implementing code linting and conducting code reviews.

Required Skills and Qualifications:
- 8+ years of experience in data engineering with a strong focus on ETL processes and data-pipeline management.
- MUST have experience in Azure cloud, working on data warehousing involving ADF, Azure Data Lake, DBT, and Snowflake.
- At least 4+ years of hands-on experience with DBT.
- Advanced proficiency in SQL and data-modeling techniques.
- Deep understanding of DBT, including artifact files, graph usage, and MetricFlow.
- Proficiency in DBT Cloud features such as the CLI, orchestration, and documentation.
- Strong skills in Python for scripting and automation tasks.
- Familiarity with CI/CD pipeline tools and workflows.
- Hands-on experience with git-flow workflows for version control.
- Solid troubleshooting skills to resolve pipeline errors efficiently.
- Knowledge of pipeline orchestration and automation.

Soft Skills:
- A proactive problem-solver with excellent attention to detail.
- Strong communication and collaboration skills to work with cross-functional teams.
- A positive attitude and ownership mindset to drive projects to completion.
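The responsibilities above mention DBT artifact files. As a hedged sketch of what working with them can look like: dbt writes run metadata to JSON artifacts such as run_results.json, and a CI step might scan one for failed nodes. The fragment below uses an invented, heavily trimmed artifact; real files carry many more fields.

```python
import json

# Invented, minimal slice of a dbt run_results.json artifact. Real artifacts
# hold far more metadata, but each result carries a unique_id and a status.
artifact = json.loads("""
{
  "results": [
    {"unique_id": "model.proj.stg_orders", "status": "success", "execution_time": 1.2},
    {"unique_id": "model.proj.fct_revenue", "status": "error", "execution_time": 0.3}
  ]
}
""")

# Collect nodes that did not succeed, as a CI gate might before deployment.
failed = [r["unique_id"] for r in artifact["results"] if r["status"] == "error"]
print(failed)
```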
Posted 2 weeks ago
5.0 - 8.0 years
4 - 7 Lacs
Bengaluru
Work from Office
About The Role
Skill required: Delivery - Marketing Analytics and Reporting
Designation: I&F Decision Sci Practitioner Sr Analyst
Qualifications: Any Graduation
Years of Experience: 5 to 8 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do? Data & AI: Analytical processes and technologies applied to marketing-related data to help businesses understand and deliver relevant experiences for their audiences, understand their competition, measure and optimize marketing campaigns, and optimize their return on investment.

What are we looking for? Data analytics, with a specialization in the marketing domain.

Domain-specific skills:
- Familiarity with ad tech and B2B sales

Technical skills:
- Proficiency in SQL and Python
- Experience in efficiently building, publishing, and maintaining robust data models and warehouses for self-serve querying and advanced data science and ML analytic purposes
- Experience conducting ETL/ELT with very large and complicated datasets and handling DAG data dependencies
- Strong proficiency with SQL dialects on distributed or data-lake-style systems (Presto, BigQuery, Spark/Hive SQL, etc.), including SQL-based experience in nested-data-structure manipulation, windowing functions, query optimization, data-partitioning techniques, etc. Knowledge of Google BigQuery optimization is a plus.
- Experience in schema design and data-modeling strategies (e.g. dimensional modeling, data vault, etc.)
- Significant experience with dbt (or similar tools) and Spark-based (or similar) data pipelines
- General knowledge of Jinja templating in Python
- Hands-on experience with cloud-provider integration and automation via CLIs and APIs

Soft skills:
- Ability to work well in a team
- Agility for quick learning
- Written and verbal communication

Roles and Responsibilities: In this role you are required to analyze and solve increasingly complex problems. Your day-to-day interactions are with peers within Accenture. You are likely to have some interaction with clients and/or Accenture management. You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments. Decisions you make impact your own work and may impact the work of others. In this role you would be an individual contributor and/or oversee a small work effort and/or team. Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation
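The technical skills above include SQL windowing functions. A minimal, self-contained sketch of a windowed running total, using Python's bundled sqlite3 (window functions need SQLite 3.25 or newer) and an invented toy campaign-spend table:

```python
import sqlite3

# Toy campaign-spend table; campaign names and amounts are invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE spend (campaign TEXT, day INTEGER, amount REAL)")
con.executemany("INSERT INTO spend VALUES (?, ?, ?)", [
    ("a", 1, 10.0), ("a", 2, 20.0), ("b", 1, 5.0), ("b", 2, 15.0),
])

# Running total per campaign via a window function (requires SQLite >= 3.25).
rows = con.execute("""
    SELECT campaign, day,
           SUM(amount) OVER (PARTITION BY campaign ORDER BY day) AS running_total
    FROM spend
    ORDER BY campaign, day
""").fetchall()

for campaign, day, total in rows:
    print(campaign, day, total)
```

The PARTITION BY clause restarts the sum for each campaign, so each row carries its campaign's cumulative spend up to that day.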
Posted 3 weeks ago
1.0 - 4.0 years
2 - 5 Lacs
Gurugram
Work from Office
Location: Bangalore/Hyderabad/Pune
Experience level: 8+ Years

About the Role: We are looking for a technical, hands-on Lead Data Engineer to help drive the modernization of our data transformation workflows. We currently rely on legacy SQL scripts orchestrated via Airflow, and we are transitioning to a modular, scalable, CI/CD-driven DBT-based data platform. The ideal candidate has deep experience with DBT and modern data stack design, and has previously led similar migrations, improving code quality, lineage visibility, performance, and engineering best practices.

Key Responsibilities:
- Lead the migration of legacy SQL-based ETL logic to DBT-based transformations
- Design and implement a scalable, modular DBT architecture (models, macros, packages)
- Audit and refactor legacy SQL for clarity, efficiency, and modularity
- Improve CI/CD pipelines for DBT: automated testing, deployment, and code-quality enforcement
- Collaborate with data analysts, platform engineers, and business stakeholders to understand current gaps and define future data pipelines
- Own Airflow orchestration redesign where needed (e.g., DBT Cloud/API hooks or airflow-dbt integration)
- Define and enforce coding standards, review processes, and documentation practices
- Coach junior data engineers on DBT and SQL best practices
- Provide lineage and impact-analysis improvements using DBT's built-in tools and metadata

Must-Have Qualifications:
- 8+ years of experience in data engineering
- Proven success in migrating legacy SQL to DBT, with visible results
- Deep understanding of DBT best practices, including model layering, Jinja templating, testing, and packages
- Proficiency in SQL performance tuning, modular SQL design, and query optimization
- Experience with Airflow (Composer, MWAA), including DAG refactoring and task orchestration
- Hands-on experience with modern data stacks (e.g., Snowflake, BigQuery, etc.)
- Familiarity with data testing and CI/CD for analytics workflows
- Strong communication and leadership skills; comfortable working cross-functionally

Nice-to-Have:
- Experience with DBT Cloud or DBT Core integrations with Airflow
- Familiarity with data-governance and lineage tools (e.g., dbt docs, Alation)
- Exposure to Python (for custom Airflow operators/macros or utilities)
- Previous experience mentoring teams through modern data stack transitions
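The lineage and impact-analysis work described above boils down to reasoning over a dependency DAG of models. A toy sketch using Python's standard-library graphlib (model names are invented; in practice dbt derives the real graph from ref() calls in model files):

```python
from graphlib import TopologicalSorter

# Toy dbt-style dependency graph: model -> set of upstream models it refs.
deps = {
    "stg_orders": set(),
    "stg_customers": set(),
    "int_order_items": {"stg_orders"},
    "fct_revenue": {"int_order_items", "stg_customers"},
}

# Build order: every model runs only after all of its upstreams.
order = list(TopologicalSorter(deps).static_order())

def downstream(graph, node):
    """All models that directly or transitively depend on `node`."""
    hit = {node}
    # Walking in topological order guarantees upstreams are seen first.
    for model in TopologicalSorter(graph).static_order():
        if graph[model] & hit:
            hit.add(model)
    return hit - {node}

print(order)
print(downstream(deps, "stg_orders"))  # impact of changing stg_orders
```

Impact analysis then answers "if this staging model changes, which marts must be rebuilt and retested?", which is the question CI gates on.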
Posted 3 weeks ago
1.0 - 3.0 years
3 - 5 Lacs
Thiruvananthapuram
Work from Office
Citizen Digital Foundation (CDF) is inviting applications from young, motivated, tech-savvy individuals to join as a Tech Fellow providing overall tech support for the organisation. The fellow will play a key role in the implementation/migration of technologies and various software applications for the organisation. Individuals committed to promoting responsible and inclusive use of technology in an NGO setting are preferred.

Location: Trivandrum (with occasional travel to other parts of the state)
Duration: 6 months
Compensation: Up to 20K depending on the candidate's experience

Key Responsibilities:
- Assist in the implementation of a suitable ERP system to streamline various internal processes
- Create a platform for CDF knowledge management and maintain the resource library
- Manage/customise backend systems and ensure the smooth functioning of various digital platforms used by the organisation
- Attend to systems- and software-related troubleshooting across teams
- Support the integration and maintenance of various digital tools used by CDF
- Document key processes and provide basic tech training to team members
- Assist in streamlining data integrity, system security, and best digital practices across all applications in use
- Revamp the existing CDF website or develop a new one for the organisation

Eligibility:
- B.Tech in Computer Science/Information Technology or an MCA is preferred; however, individuals with a strong passion for coding and a portfolio of relevant work are also encouraged to apply
- Experience in website development (both front-end and back-end), including familiarity with platforms such as WordPress and Wix
- Proficiency in Python, JavaScript, and Jinja templating
- Strong understanding of database management
- An interest in exploring and customising open-source software to suit the organisation's needs
- Familiarity with ERP implementation and configuration is desirable
- Strong problem-solving skills and a user-first mindset
- Ability to work independently and communicate effectively with non-technical teams
- Prior experience working with non-profits or purpose-driven organisations is a plus
Posted 3 weeks ago
8.0 - 10.0 years
10 - 12 Lacs
Bengaluru
Work from Office
Location: Bangalore/Hyderabad/Pune
Experience level: 8+ Years

About the Role: We are looking for a technical, hands-on Lead Data Engineer to help drive the modernization of our data transformation workflows. We currently rely on legacy SQL scripts orchestrated via Airflow, and we are transitioning to a modular, scalable, CI/CD-driven DBT-based data platform. The ideal candidate has deep experience with DBT and modern data stack design, and has previously led similar migrations, improving code quality, lineage visibility, performance, and engineering best practices.

Key Responsibilities:
- Lead the migration of legacy SQL-based ETL logic to DBT-based transformations
- Design and implement a scalable, modular DBT architecture (models, macros, packages)
- Audit and refactor legacy SQL for clarity, efficiency, and modularity
- Improve CI/CD pipelines for DBT: automated testing, deployment, and code-quality enforcement
- Collaborate with data analysts, platform engineers, and business stakeholders to understand current gaps and define future data pipelines
- Own Airflow orchestration redesign where needed (e.g., DBT Cloud/API hooks or airflow-dbt integration)
- Define and enforce coding standards, review processes, and documentation practices
- Coach junior data engineers on DBT and SQL best practices
- Provide lineage and impact-analysis improvements using DBT's built-in tools and metadata

Must-Have Qualifications:
- 8+ years of experience in data engineering
- Proven success in migrating legacy SQL to DBT, with visible results
- Deep understanding of DBT best practices, including model layering, Jinja templating, testing, and packages
- Proficiency in SQL performance tuning, modular SQL design, and query optimization
- Experience with Airflow (Composer, MWAA), including DAG refactoring and task orchestration
- Hands-on experience with modern data stacks (e.g., Snowflake, BigQuery, etc.)
- Familiarity with data testing and CI/CD for analytics workflows
- Strong communication and leadership skills; comfortable working cross-functionally

Nice-to-Have:
- Experience with DBT Cloud or DBT Core integrations with Airflow
- Familiarity with data-governance and lineage tools (e.g., dbt docs, Alation)
- Exposure to Python (for custom Airflow operators/macros or utilities)
- Previous experience mentoring teams through modern data stack transitions
Posted 3 weeks ago
2.0 - 5.0 years
3 - 7 Lacs
Kolkata
Work from Office
2-5 years of hands-on experience in Python development with a focus on AI/ML. Solid understanding of machine learning algorithms. Experience with ML libraries such as Scikit-learn, TensorFlow, PyTorch, Keras, or XGBoost.
Benefits: annual bonus, provident fund, health insurance
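"Solid understanding of machine learning algorithms" above typically means knowing what libraries like Scikit-learn do under the hood. A minimal stdlib-only sketch (the data is invented) of one such algorithm, logistic regression trained by gradient descent:

```python
import math

# Tiny 1-D dataset, invented for illustration: labels flip near x = 2.5.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0, 0, 0, 1, 1, 1]

# Gradient descent on the logistic loss; w, b start at zero.
w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    gw = gb = 0.0
    for x, y in zip(xs, ys):
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # sigmoid prediction
        gw += (p - y) * x                          # gradient w.r.t. w
        gb += (p - y)                              # gradient w.r.t. b
    w -= lr * gw / len(xs)
    b -= lr * gb / len(xs)

def predict(x):
    return 1 if 1.0 / (1.0 + math.exp(-(w * x + b))) >= 0.5 else 0

print([predict(x) for x in xs])  # expected to recover the training labels
```

Library implementations add regularization, multiple features, and better optimizers, but the fit/predict shape is the same idea.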
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Snowflake Data Warehouse, Functional Testing
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Key Responsibilities:
a. Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud)
b. Played a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements
c. As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and realization of business outcomes
d. Spearhead teams to translate business goals/challenges into practical data transformation and technology roadmaps and data architecture designs
e. Strong experience in designing, architecting, and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake
f. Strong inclination for practice building, including spearheading thought-leadership discussions and managing team activities

Technical Experience:
a. Strong experience working as a Snowflake-on-Cloud DBT Data Architect with thorough knowledge of different services
b. Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines using DBT
c. Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
d. Experience working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features
e. DBT (Core and Cloud) end-to-end migration experience that includes refactoring SQL for modularity, DBT modeling (.sql or .py file creation), and DBT job scheduling on at least 2 projects
f. Knowledge of the Jinja template language (macros) would be an added advantage
g. Knowledge of special features like DBT documentation, semantic-layer creation, webhooks, etc.
h. DBT and cloud certification is important
i. Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI
j. Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support
k. Guide the creation and management of GenAI assets like prompts, embeddings, semantic indexes, agents, and custom bots
l. Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions
m. Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives
n. Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities
o. Should have a good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear

Professional Attributes:
a. Client management, stakeholder management, collaboration, interpersonal, and relationship-building skills
b. Ability to create innovative solutions for key business challenges
c. Eagerness to learn and develop oneself on an ongoing basis
d. Structured communication: written, verbal, and presentational

Educational Qualification:
a. MBA (Technology/Data-related specializations)/MCA/advanced degrees in STEM

Qualification: 15 years full-time education
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Snowflake Data Warehouse, Manual Testing
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Key Responsibilities:
a. Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud)
b. Played a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements
c. As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and realization of business outcomes
d. Spearhead teams to translate business goals/challenges into practical data transformation and technology roadmaps and data architecture designs
e. Strong experience in designing, architecting, and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake
f. Strong inclination for practice building, including spearheading thought-leadership discussions and managing team activities

Technical Experience:
a. Strong experience working as a Snowflake-on-Cloud DBT Data Architect with thorough knowledge of different services
b. Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines using DBT
c. Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
d. Experience working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features
e. DBT (Core and Cloud) end-to-end migration experience that includes refactoring SQL for modularity, DBT modeling (.sql or .py file creation), and DBT job scheduling on at least 2 projects
f. Knowledge of the Jinja template language (macros) would be an added advantage
g. Knowledge of special features like DBT documentation, semantic-layer creation, webhooks, etc.
h. DBT and cloud certification is important
i. Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI
j. Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support
k. Guide the creation and management of GenAI assets like prompts, embeddings, semantic indexes, agents, and custom bots
l. Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions
m. Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives
n. Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities
o. Should have a good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear

Professional Attributes:
a. Client management, stakeholder management, collaboration, interpersonal, and relationship-building skills
b. Ability to create innovative solutions for key business challenges
c. Eagerness to learn and develop oneself on an ongoing basis
d. Structured communication: written, verbal, and presentational

Educational Qualification:
a. MBA (Technology/Data-related specializations)/MCA/advanced degrees in STEM

Qualification: 15 years full-time education
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Snowflake Data Warehouse Good to have skills : Data EngineeringMinimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.Key Responsibilities:a Overall 12+ of data experience including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud)b Played a key role in DBT related discussions with teams and clients to understand business problems and solutioning requirementsc As a DBT SME liaise with clients on business/ technology/ data transformation programs; orchestrate the implementation of planned initiatives and realization of business outcomes d Spearhead team to translate business goals/ challenges into practical data transformation and technology roadmaps and data architecture designs e Strong experience in designing, architecting and managing(admin) Snowflake solutions and deploying data analytics solutions in Snowflake.f Strong inclination for practice building that includes spearheading thought leadership discussions, managing team activities. 
Technical Experience:a Strong Experience working as a Snowflake on Cloud DBT Data Architect with thorough knowledge of different servicesb Ability to architect solutions from OnPrem to cloud and create end to end data pipelines using DBT c Excellent process knowledge in one or more of the following areas:Finance, Healthcare, Customer Experienced Experience in working on Client Proposals (RFP's), Estimation, POCs, POVs on new Snowflake featurese DBT (Core and Cloud) end to end migration experience that includes DBT migration - Refactoring SQL for modularity, DBT modeling experience (.sql or .py files credbt job scheduling on at least 2 projectsf Knowledge of Jinja template language (Macros) would be added advantageg Knowledge of Special features like DBT documentation, semantic layers creation, webhooks etc.h DBT and cloud certification is important.i Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI.j Deploy AI Agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support.k Guide the creation and management of GenAI assets like prompts, embeddings, semantic indexes, agents, and custom bots.l Collaborate with data engineers, ML engineers, and leadership team to translate business use cases into GenAI-driven solutions.m Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives.n Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities.o Should have good understanding of SQL, Python. Also, the architectural concepts of Snowflake should be clear. 
Professional Attributes:
a) Client management, stakeholder management, collaboration, interpersonal and relationship-building skills
b) Ability to create innovative solutions for key business challenges
c) Eagerness to learn and develop oneself on an ongoing basis

Educational Qualification:
a) MBA (Technology/Data-related specializations), MCA, or advanced degrees in STEM
Qualification: 15 years full time education
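The listing above calls for DBT modeling skills and knowledge of Jinja macros. As a minimal illustration of the idea, the sketch below renders a dbt-style Jinja macro into SQL using plain `jinja2`; in a real dbt project the macro would live in a `macros/*.sql` file and dbt would handle the compilation. The macro name, column, and table here are illustrative assumptions, not taken from any project.

```python
# Illustrative sketch: a dbt-style Jinja macro that parameterizes SQL,
# rendered here directly with jinja2 rather than by dbt's compiler.
from jinja2 import Environment

TEMPLATE = """
{%- macro cents_to_dollars(column_name, precision=2) -%}
round({{ column_name }} / 100.0, {{ precision }})
{%- endmacro -%}
select
    order_id,
    {{ cents_to_dollars('amount_cents') }} as amount_usd
from {{ source_table }}
"""

env = Environment(trim_blocks=True, lstrip_blocks=True)
sql = env.from_string(TEMPLATE).render(source_table="raw.orders")
print(sql)
```

The macro keeps the unit-conversion logic in one place, so every model that needs it emits identical SQL instead of hand-copied `round(...)` expressions.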
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Hyderabad
Work from Office
We are seeking an experienced and dynamic Senior Database Administrator and Test Data Management Specialist to join our IT team. The ideal candidate will possess robust database administration skills across multiple database types and demonstrate proficiency in utilizing the Delphix platform. In addition, the candidate should have a solid background in Test Data Management (TDM) projects. This role requires a minimum of 6-12 years of relevant IT experience.

Key Responsibilities:
Database Administration:
- Manage and administer databases such as Oracle, MS SQL, Mainframes, or other database types.
- Ensure database performance, availability, and security in multi-database environments.
- Conduct database tuning and optimization, backup and recovery, and issue troubleshooting.
Delphix Platform Management:
- Utilize the Delphix virtualization and masking platforms to manage and secure data efficiently.
- Implement data masking solutions to protect sensitive data in non-production environments.
- Leverage Delphix for data provisioning, cloning, and refresh operations to support agile development cycles.
Test Data Management:
- Lead and manage TDM projects to provide consistent, reliable, and secure test data for various environments.
- Develop and implement test data strategies, plans, and scripts to meet testing requirements.
- Collaborate with QA and development teams to ensure seamless integration of test data processes.
Collaboration and Documentation:
- Work closely with cross-functional teams including DevOps, QA, and developers to support database and data management needs.
- Document processes, configurations, and best practices for database administration and TDM.
Required Skills and Qualifications:
- Strong database administration experience with two or more types of databases (e.g., Oracle, MS SQL, Mainframes).
- Hands-on experience with the Delphix virtualization and masking platforms.
- Proven experience in managing Test Data Management projects.
- 6-12 years of relevant experience in the IT industry.
- Excellent problem-solving skills and the ability to troubleshoot database issues.
- Strong communication and collaboration skills.
Good to Have Skills:
- Experience working with Jenkins pipelines for CI/CD processes.
- Coding experience in languages such as Python and Jinja is a plus.
- Familiarity with Agile methodologies and DevOps practices.
- Experience writing shell scripts or SQL scripts.
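The data-masking responsibility above can be made concrete with a small sketch. This is not Delphix's actual masking engine, only a hypothetical Python illustration of the core property a TDM pipeline needs: deterministic masking, so the same sensitive value always maps to the same masked value across refreshes and referential integrity survives. The column names and salt are invented for the example.

```python
# Hypothetical sketch of deterministic masking for non-production data.
# A fixed salt makes the mapping stable across environment refreshes.
import hashlib

SALT = "nonprod-refresh"  # illustrative; a real pipeline would manage this as a secret

def mask_email(email: str) -> str:
    """Replace the local part with a stable hash; keep a valid email shape."""
    local, _, _domain = email.partition("@")
    digest = hashlib.sha256((SALT + local).encode()).hexdigest()[:10]
    return f"user_{digest}@example.com"

def mask_row(row: dict, sensitive: set) -> dict:
    """Mask only the columns flagged as sensitive, pass the rest through."""
    return {k: mask_email(v) if k in sensitive else v for k, v in row.items()}

row = {"id": 42, "email": "jane.doe@corp.com", "plan": "gold"}
print(mask_row(row, {"email"}))
```

Because the hash is salted and truncated, the masked value reveals nothing useful about the original, yet joins on the email column still line up between masked tables.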
Posted 1 month ago
5.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Snowflake Data Warehouse, Manual Testing
Good to have skills: Data Engineering
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives. Your role will require you to navigate complex data environments, providing insights and recommendations that drive effective data management and governance practices.
Key Responsibilities:
a) Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud)
b) Played a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements
c) As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes
d) Spearhead the team in translating business goals and challenges into practical data transformation and technology roadmaps and data architecture designs
e) Strong experience in designing, architecting and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake
f) Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities

Technical Experience:
a) Strong experience working as a Snowflake on Cloud DBT Data Architect with thorough knowledge of the different services
b) Ability to architect solutions from on-premises to cloud and create end-to-end data pipelines using DBT
c) Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
d) Experience working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features
e) DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py file creation) and DBT job scheduling, on at least 2 projects
f) Knowledge of the Jinja template language (macros) would be an added advantage
g) Knowledge of special features like DBT documentation, semantic layer creation, webhooks, etc.
h) DBT and cloud certification is important
i) Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI
j) Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support
k) Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots
l) Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions
m) Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives
n) Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities
o) Good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear

Professional Attributes:
a) Client management, stakeholder management, collaboration, interpersonal and relationship-building skills
b) Ability to create innovative solutions for key business challenges
c) Eagerness to learn and develop oneself on an ongoing basis
d) Structured communication: written, verbal and presentational

Educational Qualification:
a) MBA (Technology/Data-related specializations), MCA, or advanced degrees in STEM
Qualification: 15 years full time education
Posted 1 month ago
0.0 - 5.0 years
5 - 9 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
- Write effective, scalable code
- Develop back-end components to improve responsiveness and overall performance
- Integrate user-facing elements into applications
- Test and debug programs
- Improve functionality of existing systems
Required Candidate Profile:
- Expertise in at least one popular Python framework (such as Django, Flask or Pyramid)
- Familiarity with front-end technologies (such as JavaScript and HTML5)
- Team spirit
- Good problem-solving skills
Perks and benefits: Free meals and snacks. Bonus. Vision insurance.
Posted 1 month ago
0.0 - 5.0 years
5 - 9 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
- Write effective, scalable code
- Develop back-end components to improve responsiveness and overall performance
- Integrate user-facing elements into applications
- Test and debug programs
- Improve functionality of existing systems
Required Candidate Profile:
- Expertise in at least one popular Python framework (such as Django, Flask or Pyramid)
- Familiarity with front-end technologies (such as JavaScript and HTML5)
- Team spirit
- Good problem-solving skills
Perks and benefits: Free meals and snacks. Bonus. Vision insurance.
Posted 1 month ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Skill required: Delivery - Marketing Analytics and Reporting
Designation: I&F Decision Sci Practitioner Sr Analyst
Qualifications: Any Graduation
Years of Experience: 5 to 8 years

What would you do?
Data & AI: Analytical processes and technologies applied to marketing-related data to help businesses understand and deliver relevant experiences for their audiences, understand their competition, measure and optimize marketing campaigns, and optimize their return on investment.

What are we looking for?
Data analytics, with a specialization in the marketing domain.
Domain-specific skills:
- Familiarity with ad tech and B2B sales
Technical skills:
- Proficiency in SQL and Python
- Experience in efficiently building, publishing and maintaining robust data models and warehouses for self-serve querying and advanced data science/ML analytics
- Experience in conducting ETL/ELT with very large and complicated datasets and handling DAG data dependencies
- Strong proficiency with SQL dialects on distributed or data-lake-style systems (Presto, BigQuery, Spark/Hive SQL, etc.), including SQL-based experience with nested data structure manipulation, windowing functions, query optimization, data partitioning techniques, etc.; knowledge of Google BigQuery optimization is a plus
- Experience in schema design and data modeling strategies (e.g. dimensional modeling, data vault, etc.)
- Significant experience with dbt (or similar tools) and Spark-based (or similar) data pipelines
- General knowledge of Jinja templating in Python
- Hands-on experience with cloud provider integration and automation via CLIs and APIs
Soft skills:
- Ability to work well in a team
- Agility for quick learning
- Written and verbal communication

Roles and Responsibilities:
In this role you are required to analyze and solve increasingly complex problems. Your day-to-day interactions are with peers within Accenture, and you are likely to have some interaction with clients and/or Accenture management. You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments. Decisions you make impact your own work and may impact the work of others. You will be an individual contributor and/or oversee a small work effort and/or team. Please note that this role may require you to work in rotational shifts.
Qualifications: Any Graduation
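The windowing-function requirement in the listing above has a simple mental model. The sketch below emulates `ROW_NUMBER() OVER (PARTITION BY channel ORDER BY spend DESC)` in plain Python: sort by partition key, group, then number each row within its group. The field names are invented for illustration.

```python
# Plain-Python mental model of the SQL window function
# ROW_NUMBER() OVER (PARTITION BY channel ORDER BY spend DESC).
from itertools import groupby
from operator import itemgetter

rows = [
    {"channel": "search", "campaign": "c1", "spend": 900},
    {"channel": "search", "campaign": "c2", "spend": 400},
    {"channel": "social", "campaign": "c3", "spend": 700},
]

def row_number_within_partition(rows, partition_key, order_key):
    # Sort so partitions are contiguous and each partition is ordered descending.
    ordered = sorted(rows, key=lambda r: (r[partition_key], -r[order_key]))
    out = []
    for _, group in groupby(ordered, key=itemgetter(partition_key)):
        for rank, row in enumerate(group, start=1):
            out.append({**row, "row_number": rank})
    return out

for r in row_number_within_partition(rows, "channel", "spend"):
    print(r)
```

Unlike this sketch, a warehouse evaluates the window without materializing a full sort per query plan step, but the per-partition numbering semantics are the same.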
Posted 1 month ago
5.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Building Tool
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Key Responsibilities:
a) Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud)
b) Played a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements
c) As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes
d) Spearhead the team in translating business goals and challenges into practical data transformation and technology roadmaps and data architecture designs
e) Strong experience in designing, architecting and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake
f) Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities
Technical Experience:
a) Strong experience working as a Snowflake on Cloud DBT Data Architect with thorough knowledge of the different services
b) Ability to architect solutions from on-premises to cloud and create end-to-end data pipelines using DBT
c) Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
d) Experience working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features
e) DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py file creation) and DBT job scheduling, on at least 2 projects
f) Knowledge of the Jinja template language (macros) would be an added advantage
g) Knowledge of special features like DBT documentation, semantic layer creation, webhooks, etc.
h) DBT and cloud certification is important
i) Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI
j) Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support
k) Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots
l) Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions
m) Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives
n) Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities
o) Good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear
Professional Attributes:
a) Client management, stakeholder management, collaboration, interpersonal and relationship-building skills
b) Ability to create innovative solutions for key business challenges
c) Eagerness to learn and develop oneself on an ongoing basis
d) Structured communication: written, verbal and presentational

Educational Qualification:
a) MBA (Technology/Data-related specializations), MCA, or advanced degrees in STEM
Qualification: 15 years full time education
Posted 2 months ago
5.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Building Tool
Good to have skills: Data Engineering
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with various teams to ensure that the integration between systems and data models is seamless and efficient. You will engage in discussions to refine the architecture and design, ensuring that the data platform meets the needs of the organization while adhering to best practices. Additionally, you will be involved in problem-solving sessions, where you will provide insights and solutions to enhance the overall data strategy.

Key Responsibilities:
a) Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud)
b) Played a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements
c) As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes
d) Spearhead the team in translating business goals and challenges into practical data transformation and technology roadmaps and data architecture designs
e) Strong experience in designing, architecting and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake
f) Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities
Technical Experience:
a) Strong experience working as a Snowflake on Cloud DBT Data Architect with thorough knowledge of the different services
b) Ability to architect solutions from on-premises to cloud and create end-to-end data pipelines using DBT
c) Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
d) Experience working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features
e) DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py file creation) and DBT job scheduling, on at least 2 projects
f) Knowledge of the Jinja template language (macros) would be an added advantage
g) Knowledge of special features like DBT documentation, semantic layer creation, webhooks, etc.
h) DBT and cloud certification is important
i) Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI
j) Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support
k) Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots
l) Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions
m) Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives
n) Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities
o) Good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear
Professional Attributes:
a) Client management, stakeholder management, collaboration, interpersonal and relationship-building skills
b) Ability to create innovative solutions for key business challenges
c) Eagerness to learn and develop oneself on an ongoing basis

Educational Qualification:
a) MBA (Technology/Data-related specializations), MCA, or advanced degrees in STEM
Qualification: 15 years full time education
Posted 2 months ago
5.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Building Tool
Good to have skills: Data Engineering
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives. Your role will require you to navigate complex data environments, providing insights and recommendations that drive effective data management and governance practices.
Key Responsibilities:
a) Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud)
b) Played a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements
c) As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes
d) Spearhead the team in translating business goals and challenges into practical data transformation and technology roadmaps and data architecture designs
e) Strong experience in designing, architecting and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake
f) Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities

Technical Experience:
a) Strong experience working as a Snowflake on Cloud DBT Data Architect with thorough knowledge of the different services
b) Ability to architect solutions from on-premises to cloud and create end-to-end data pipelines using DBT
c) Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
d) Experience working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features
e) DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py file creation) and DBT job scheduling, on at least 2 projects
f) Knowledge of the Jinja template language (macros) would be an added advantage
g) Knowledge of special features like DBT documentation, semantic layer creation, webhooks, etc.
h) DBT and cloud certification is important
i) Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI
j) Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support
k) Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots
l) Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions
m) Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives
n) Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities
o) Good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear

Professional Attributes:
a) Client management, stakeholder management, collaboration, interpersonal and relationship-building skills
b) Ability to create innovative solutions for key business challenges
c) Eagerness to learn and develop oneself on an ongoing basis
d) Structured communication: written, verbal and presentational

Educational Qualification:
a) MBA (Technology/Data-related specializations), MCA, or advanced degrees in STEM
Qualification: 15 years full time education
Posted 2 months ago
5.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Building Tool
Good to have skills: Data Engineering
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Key Responsibilities:
a) Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud)
b) Played a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements
c) As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes
d) Spearhead the team in translating business goals and challenges into practical data transformation and technology roadmaps and data architecture designs
e) Strong experience in designing, architecting and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake
f) Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities
Technical Experience:
a) Strong experience working as a Snowflake on Cloud DBT Data Architect with thorough knowledge of the different services
b) Ability to architect solutions from on-premises to cloud and create end-to-end data pipelines using DBT
c) Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
d) Experience working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features
e) DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py file creation) and DBT job scheduling, on at least 2 projects
f) Knowledge of the Jinja template language (macros) would be an added advantage
g) Knowledge of special features like DBT documentation, semantic layer creation, webhooks, etc.
h) DBT and cloud certification is important
i) Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI
j) Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support
k) Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots
l) Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions
m) Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives
n) Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities
o) Good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear
Professional Attributes:
a) Client management, stakeholder management, collaboration, interpersonal and relationship-building skills
b) Ability to create innovative solutions for key business challenges
c) Eagerness to learn and develop oneself on an ongoing basis

Educational Qualification:
a) MBA (Technology/Data-related specializations), MCA, or advanced degrees in STEM
Qualification: 15 years full time education
Posted 2 months ago
3.0 - 8.0 years
5 - 10 Lacs
Mumbai, Bengaluru
Work from Office
We are looking for a Python/Django developer who is well versed in the Python language as well as the Django framework. Knowledge of other Python web frameworks is an advantage.
Skills Needed:
- Expert in Python (3+ years experience)
- Proficient in the Django development framework
- Good understanding of REST architecture
- Proficiency in writing regular expressions
- Familiarity with ORM libraries
- Hands-on experience with application deployment
- Knowledge of user authentication and authorization across multiple systems and environments
- Understanding of the fundamental design principles behind scalable and distributed applications
- Ability to design a modular, maintainable system for moderately complex problems (multiple interactions/cases)
- Able to integrate multiple data sources and databases into one system
- Understanding of multi-threaded/multi-process architecture (e.g., Celery)
- Good understanding of server-side templating languages (Jinja)
- Basic front-end skills: JavaScript, HTML5, and CSS3
- Able to create database schemas that represent and support business processes
- Strong unit testing and debugging skills
- Solid understanding of Git for version control
Other Technologies Experience:
- RabbitMQ (message broker)
- Elasticsearch
- Databases: MySQL, PostgreSQL, MongoDB
- Server tools: Nginx, Supervisor
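The regular-expression requirement in the listing above shows up constantly in Django work, for example in regex-based URL routing (`django.urls.re_path`). The sketch below uses only the standard library to show the matching step such a route performs; the URL pattern and the `resolve` helper are illustrative assumptions, not Django internals.

```python
# Sketch of the regex matching behind a Django-style re_path() route,
# using only the standard library. Pattern and helper are illustrative.
import re

URL_PATTERN = re.compile(r"^articles/(?P<year>\d{4})/(?P<slug>[\w-]+)/$")

def resolve(path):
    """Return the captured URL parameters, or None if the path doesn't match."""
    m = URL_PATTERN.match(path)
    return m.groupdict() if m else None

print(resolve("articles/2024/django-tips/"))  # matches: year and slug captured
print(resolve("articles/24/bad/"))            # year is not 4 digits: no match
```

Named groups (`?P<year>`) are what let a router pass captured segments to a view as keyword arguments instead of positional ones.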
Posted 2 months ago
3.0 - 6.0 years
10 - 15 Lacs
Bengaluru
Work from Office
We are looking for a Python/Django developer who is well versed in the Python language as well as the Django framework. Knowledge of other Python web frameworks is an advantage.
Skills Needed:
- Expert in Python (3+ years experience)
- Proficient in the Django development framework
- Good understanding of REST architecture
- Proficiency in writing regular expressions
- Familiarity with ORM libraries
- Hands-on experience with application deployment
- Knowledge of user authentication and authorization across multiple systems and environments
- Understanding of the fundamental design principles behind scalable and distributed applications
- Ability to design a modular, maintainable system for moderately complex problems (multiple interactions/cases)
- Able to integrate multiple data sources and databases into one system
- Understanding of multi-threaded/multi-process architecture (e.g., Celery)
- Good understanding of server-side templating languages (Jinja)
- Basic front-end skills: JavaScript, HTML5, and CSS3
- Able to create database schemas that represent and support business processes
- Strong unit testing and debugging skills
- Solid understanding of Git for version control
Other Technologies Experience:
- RabbitMQ (message broker)
- Elasticsearch
- Databases: MySQL, PostgreSQL, MongoDB
- Server tools: Nginx, Supervisor
Posted 2 months ago
5.0 - 8.0 years
7 - 11 Lacs
Gurugram
Work from Office
Role Description: As an Informatica PL/SQL Developer, you will be a key contributor to our client's data integration initiatives. You will be responsible for developing ETL processes, performing database performance tuning, and ensuring the quality and reliability of data solutions. Your experience with PostgreSQL, DBT, and cloud technologies will be highly valuable.
Responsibilities:
- Design, develop, and maintain ETL processes using Informatica and PL/SQL.
- Implement ETL processes using DBT with Jinja and automated unit tests.
- Develop and maintain data models and schemas.
- Ensure adherence to best development practices.
- Perform database performance tuning in PostgreSQL.
- Optimize SQL queries and stored procedures.
- Identify and resolve performance bottlenecks.
- Integrate data from various sources, including Kafka/MQ and cloud platforms (Azure).
- Ensure data consistency and accuracy across integrated systems.
- Work within an agile environment, participating in all agile ceremonies.
- Contribute to sprint planning, daily stand-ups, and retrospectives.
- Collaborate with cross-functional teams to deliver high-quality solutions.
- Troubleshoot and resolve data integration and database issues.
- Provide technical support to stakeholders.
- Create and maintain technical documentation for ETL processes and database designs.
- Clearly articulate complex technical issues to stakeholders.
Qualifications and Experience:
- 5 to 8 years of experience as an Informatica PL/SQL Developer or in a similar role.
- Hands-on experience with data models and database performance tuning in PostgreSQL.
- Experience implementing ETL processes using DBT with Jinja and automated unit tests.
- Strong proficiency in PL/SQL and Informatica.
- Experience with Kafka/MQ and cloud platforms (Azure).
- Familiarity with ETL processes using DataStage is a plus.
- Strong SQL skills.
Posted 2 months ago