Bridgenext Technologies

11 Job openings at Bridgenext Technologies
Lead Salesforce Developer Kolkata, Mumbai, New Delhi, Hyderabad, Pune, Chennai, Bengaluru 5 - 10 years INR 9.0 - 13.0 Lacs P.A. Work from Office Full Time

The Bridgenext Lead Salesforce Developer plays a critical role in designing, building, and maintaining robust, scalable Salesforce solutions across multiple Managed Services clients. This role combines hands-on development, DevOps, technical architecture, and team leadership. You'll guide both technical implementation and team collaboration to deliver high-quality outcomes for our clients. As a Lead Developer, you'll manage a portfolio of accounts, support client communication, set development standards, and assist with long-term product strategy and evolution.

Duties & Responsibilities:
- Work both independently and collaboratively with your team to deliver solutions that meet client expectations and acceptance criteria while serving as the technical lead across the Managed Services portfolio.
- Design scalable Salesforce solutions using declarative and programmatic tools.
- Support and coach developers by reviewing PRs, offering mentorship, and providing technical guidance.
- Manage multiple technical requests and timelines across clients; help prioritize and communicate expectations clearly.
- Configure and maintain DevOps pipelines (e.g., CircleCI, GitHub Actions) to support multi-org workflows and ISV pipelines.
- Create and manage Salesforce packages (1GP and 2GP); support versioning, deployment, and packaging strategy.
- Lead Salesforce Security Review preparations, including static code analysis, documentation, and false-positive management.
- Work collaboratively with Lead Consultants to guide product evolution and recommend best practices.
- Create lightweight wireframes or technical diagrams to support development planning.
- Own code quality across projects; ensure compliance with internal coding standards and Salesforce best practices.
- Create technical documentation for knowledge transfer and onboarding; conduct KT sessions as needed.
- Stay up to date on Salesforce platform changes, tools, and architectural patterns.
Experience Required:
- 5+ years of Salesforce development experience, including Apex, LWC, and API integrations.
- Experience designing and implementing scalable architectures across Sales Cloud, Service Cloud, and Experience Cloud.
- Experience with Salesforce ISV Partner support and packaging (1GP and 2GP).
- Proficiency in DevOps tools and CI/CD pipeline configuration (e.g., GitHub, Bitbucket, CircleCI).
- Experience leading or mentoring other developers within a team environment.
- Comfortable managing timelines, stakeholder communication, and supporting multiple projects simultaneously.
- Understanding of the Salesforce Security Review process, remediation strategies, and submission requirements.
- Strong verbal and written communication skills; able to explain technical concepts to business users.
- Salesforce Platform Developer I certification required; Platform Developer II or JavaScript Developer I preferred.
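To illustrate the kind of API integration this role involves, here is a minimal sketch of composing a SOQL query against the Salesforce REST query endpoint. The instance URL and query below are hypothetical examples, not part of any real org:

```python
from urllib.parse import quote

def soql_query_url(instance_url: str, soql: str, api_version: str = "v59.0") -> str:
    """Build the Salesforce REST query endpoint URL for a SOQL statement."""
    return f"{instance_url}/services/data/{api_version}/query?q={quote(soql)}"

# Hypothetical org and query; a real call would add an OAuth bearer token header.
soql = "SELECT Id, Name FROM Account WHERE AnnualRevenue > 1000000 LIMIT 10"
url = soql_query_url("https://example.my.salesforce.com", soql)
```

A GET request to this URL (with a valid access token) would return matching Account records as JSON.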

Senior Snowflake Data Warehouse Developer Pune, Bengaluru 6 - 11 years INR 8.0 - 13.0 Lacs P.A. Work from Office Full Time

Position Description: We are looking to add a talented Cloud Architect to our team. This is a hands-on role, responsible for driving the architecture, design, and implementation of Snowflake for our clients. We are looking for someone who cares about quality and who is passionate about providing the best solution to meet the client's needs and anticipate their future needs based on an understanding of the market.

Must Have Skills:
- 6+ years of experience in designing and implementing a fully operational, production-grade, large-scale data solution on Snowflake Data Warehouse.
- Expertise in Snowflake data modeling, ELT using Snowpipe, implementing stored procedures, and standard DWH and ETL concepts.
- Experience with data security, data access controls, and design in Snowflake.
- Expertise in Snowflake concepts such as setting up resource monitors, RBAC controls, scalable virtual warehouses, and SQL performance tuning.
- Experience in managing Snowflake environments, including user roles, security, and access control.
- Experience in handling backup, failover, and disaster recovery strategies.
- Experience in maintaining the Snowflake repository and version control for production and test environments.
- Good to have: Snowflake cloud data warehouse Architect certification.
- Good to have: experience in building solutions on Snowflake using a combination of Python and PySpark with SnowSQL.
- Good to have: AWS Cloud experience.

Professional Skills:
- Solid written, verbal, and presentation communication skills.
- Strong team and individual player.
- Maintains composure during all types of situations and is collaborative by nature.
- High standards of professionalism, consistently producing high-quality results.
- Self-sufficient and independent, requiring very little supervision or intervention.
- Demonstrates flexibility and openness to bring creative solutions to address issues.
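The resource-monitor setup mentioned above can be sketched by composing the DDL Snowflake expects; the monitor name and credit quota here are made-up illustrations:

```python
def resource_monitor_ddl(name: str, credit_quota: int, suspend_at_pct: int = 100) -> str:
    """Compose a Snowflake CREATE RESOURCE MONITOR statement with a suspend trigger."""
    return (
        f"CREATE RESOURCE MONITOR {name} WITH CREDIT_QUOTA = {credit_quota} "
        f"TRIGGERS ON {suspend_at_pct} PERCENT DO SUSPEND"
    )

# Hypothetical monitor guarding an ETL warehouse against runaway credit spend.
ddl = resource_monitor_ddl("etl_monitor", 100)
```

An account admin would run the emitted DDL and then attach the monitor to a warehouse with `ALTER WAREHOUSE ... SET RESOURCE_MONITOR = etl_monitor`.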

Tableau - Senior Developer Pune 4 - 9 years INR 6.0 - 11.0 Lacs P.A. Work from Office Full Time

Tableau Senior Developer
Job ID: Tab-ETP-Rem-1121
Location: Remote, Pune

We are seeking a highly skilled and experienced Senior Tableau Developer to join our dynamic team. As a Senior Tableau Developer, you will play a pivotal role in designing, developing, and implementing data visualizations and reports using Tableau. You will work closely with stakeholders to understand business requirements, translate them into actionable data insights, and deliver interactive, high-performance dashboards and reports.

Key Responsibilities:
- Dashboard & Report Development: Design, develop, and maintain interactive Tableau dashboards and reports that provide actionable insights for business stakeholders.
- Data Integration: Integrate data from multiple sources, including relational databases, cloud-based platforms, and other third-party systems, into Tableau for accurate and efficient visualizations.
- Optimization: Optimize Tableau workbooks, dashboards, and queries for performance, ensuring fast loading times and an intuitive user experience.
- Collaboration with Stakeholders: Engage with business users, data analysts, and other stakeholders to gather requirements, understand business needs, and translate them into effective Tableau visualizations.
- Data Modeling: Work with the data engineering team to develop data models, perform data validation, and ensure the data pipeline is running smoothly for Tableau integration.
- Automation & Scripting: Automate Tableau report generation, data extraction, and scheduling through Tableau Server/Online and/or custom scripts.
- Best Practices & Documentation: Establish and enforce best practices for dashboard design, data visualization, and performance optimization. Maintain thorough documentation of Tableau reports, workbooks, and data sources.
- Troubleshooting & Support: Provide ongoing support and troubleshooting of Tableau dashboards and reports, resolving issues in a timely and efficient manner.
- Mentorship: Mentor junior developers and Tableau users, sharing knowledge on best practices, data visualization techniques, and troubleshooting.
- Training & Development: Conduct training sessions for internal teams on Tableau usage, techniques, and best practices.

Required Skills and Qualifications:
- Experience: 4+ years of hands-on experience in Tableau development and data visualization. Strong background in working with large datasets and integrating data from multiple sources (SQL, APIs, etc.).
- Technical Skills: Expertise in Tableau Desktop, Tableau Server, Tableau Prep, and Tableau Online. Proficient in SQL for data querying and manipulation. Strong understanding of data warehousing concepts, ETL processes, and data modeling. Experience in writing complex calculations, table calculations, and Level of Detail (LOD) expressions in Tableau.
- Business Intelligence Expertise: Proven ability to understand and interpret business requirements to deliver insightful data visualizations. Knowledge of BI concepts like KPIs, metrics, and reporting best practices.
- Performance Tuning: Strong skills in optimizing Tableau performance (query optimization, efficient use of extracts, minimizing dashboard load times).
- Soft Skills: Excellent communication and interpersonal skills. Ability to collaborate effectively with business users and technical teams. Strong analytical and problem-solving abilities.

Good to have:
- Experience with cloud-based data platforms like Azure or AWS.
- Familiarity with other BI tools like Power BI or Qlik.
- Experience with Python or R for advanced analytics integration with Tableau.
- Certification in Tableau (e.g., Tableau Desktop Certified Professional, Tableau Server Certified Associate).
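The Level of Detail (LOD) expressions named above fix a calculation's aggregation level independently of the view. A pure-Python stand-in for what `{FIXED [Customer]: SUM([Sales])}` computes, using made-up sample rows:

```python
from collections import defaultdict

# Rows as (customer, order_id, sales). A FIXED LOD expression attaches the
# customer-level sales total to every row, regardless of the dashboard's
# current level of detail (here, the order level).
rows = [("Acme", 1, 100.0), ("Acme", 2, 50.0), ("Globex", 3, 200.0)]

totals = defaultdict(float)
for customer, _, sales in rows:
    totals[customer] += sales

# Each order-level row now carries its customer's fixed total.
annotated = [(c, o, s, totals[c]) for c, o, s in rows]
```

Both Acme rows carry the same customer total (150.0) even though each row sits at order granularity, which is exactly the behavior a FIXED LOD gives in a Tableau view.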

Senior Platform Engineer Pune 8 - 13 years INR 30.0 - 35.0 Lacs P.A. Work from Office Full Time

About the Role: We are seeking a seasoned Senior Platform Engineer / Architect with deep expertise in Databricks, Unity Catalog, and Azure Data Factory (ADF), who can balance hands-on delivery with strategic leadership. This role requires strong technical capability combined with the ability to engage business stakeholders, drive platform modernization conversations, and provide thought leadership on data governance and engineering best practices.

Key Responsibilities:
- Architect and lead platform modernization initiatives, with a focus on Databricks Unity Catalog, data governance, and enterprise-grade scalability.
- Drive the design and implementation of data pipelines using Azure Data Factory, integrating diverse data sources into the Lakehouse ecosystem.
- Partner with business and technology stakeholders to define a strategic roadmap for platform evolution, governance, and self-service enablement.
- Provide thought leadership on data access, lineage, discoverability, and platform observability, ensuring compliance with enterprise security standards.
- Standardize and scale data ingestion, orchestration, and transformation frameworks using ADF, Spark, and Databricks.
- Lead reference architecture development, technical evaluations, and POCs to validate future-state designs.
- Collaborate with engineering teams on CI/CD, infrastructure-as-code, monitoring, and automation to ensure operational excellence.
- Mentor teams on best practices across data engineering, platform configuration, and governance workflows.

Required Qualifications:
- 8+ years of experience in data platform engineering or architecture, with demonstrable hands-on work in enterprise environments.
- Expertise in Databricks and Unity Catalog, including data security, access control, governance, and catalog management.
- Proven experience designing and orchestrating pipelines using Azure Data Factory, including parameterized and dynamic pipelines.
- Deep knowledge of Azure services, including ADLS Gen2, Azure DevOps, Key Vault, and networking considerations.
- Hands-on skills in Python/Scala, Spark, SQL, and data orchestration tools.
- Experience with infrastructure-as-code (Terraform, Bicep, ARM templates) and CI/CD automation.
- Strong communication skills, with a proven track record of driving business-facing conversations and aligning stakeholders on platform decisions.
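As a sketch of the parameterized ADF pipelines this role calls for, here is the general shape of a pipeline definition with a runtime parameter wired into a Copy activity. Activity and dataset names are hypothetical, and this is a simplified fragment rather than a full ARM export:

```python
# Minimal shape of a parameterized Azure Data Factory pipeline definition.
# A `run_date` parameter is passed at trigger time and referenced via the
# @pipeline().parameters expression inside the sink dataset.
pipeline = {
    "name": "pl_copy_daily",
    "properties": {
        "parameters": {"run_date": {"type": "String"}},
        "activities": [
            {
                "name": "CopyToLake",
                "type": "Copy",
                "inputs": [{"referenceName": "ds_source", "type": "DatasetReference"}],
                "outputs": [{
                    "referenceName": "ds_adls",
                    "type": "DatasetReference",
                    "parameters": {"folder": "@pipeline().parameters.run_date"},
                }],
            }
        ],
    },
}
```

Parameterizing the sink folder this way lets one pipeline definition serve every daily partition instead of cloning a pipeline per date.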

Associate Director of Finance Controllership Bengaluru 10 - 20 years INR 20.0 - 25.0 Lacs P.A. Work from Office Full Time

We are looking for an Associate Director of Finance Controllership to lead our global accounting operations and financial compliance across the United States, India, Canada, and Argentina. This role is critical in ensuring accurate financial reporting, robust internal controls, and adherence to regulatory standards across multiple geographies. The ideal candidate brings deep controllership experience, preferably within a professional services organization, and thrives in a fast-paced, people-driven, multi-entity environment.

Key Responsibilities:

Financial Reporting & Close Management:
- Lead the global month-end, quarter-end, and year-end close processes across multiple legal entities, ensuring accuracy and timeliness.
- Consolidate financial statements across jurisdictions in compliance with US GAAP (or IFRS, as applicable).
- Oversee preparation of key financial reports for leadership, board, and audit purposes.

Global Accounting Operations:
- Manage accounting processes including revenue recognition (project-based and time & materials), expense accruals, intercompany eliminations, and foreign currency transactions.
- Ensure compliance with local statutory and tax regulations in India, Canada, and Argentina in coordination with external advisors and internal tax teams.
- Maintain and improve the global chart of accounts and financial system configuration (ERP).

Internal Controls & Compliance:
- Develop and monitor internal controls over financial reporting (ICFR), including processes for SOX compliance if applicable.
- Partner with internal and external auditors on annual audit processes and internal control reviews.
- Support local finance teams with regulatory filings, statutory audits, and transfer pricing compliance.

Team Leadership & Collaboration:
- Manage and mentor a globally distributed accounting team, promoting knowledge sharing and standardization of processes across regions.
- Collaborate with FP&A, Legal, HR, and Operations to ensure alignment of financial practices and operational needs.
- Act as the finance lead on cross-border projects, including entity setup, systems integration, and process automation.

Technical Accounting & Policy:
- Provide guidance on technical accounting issues such as lease accounting, equity compensation, and revenue recognition under ASC 606.
- Maintain and update accounting policies and ensure consistent application across regions and business units.

Process Improvement & Systems:
- Drive process improvements and automation in financial workflows, reconciliations, and reporting.
- Play a key role in ERP enhancements or implementations (NetSuite).

Qualifications:
- Education & Certification: CA or CPA certification required.
- Experience: 10+ years of progressive accounting and controllership experience, with at least 5 years in a leadership role. Experience in a global professional services or consulting firm is strongly preferred. Proven success managing multi-entity, multi-currency operations across North America, Latin America, and Asia. Solid knowledge of US GAAP.
- Technical & Interpersonal Skills: Deep understanding of revenue recognition for service-based businesses (especially ASC 606). Strong ERP experience (NetSuite) and Excel proficiency. Excellent communication and collaboration skills across cultures and time zones.
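The multi-currency consolidation and intercompany-elimination work described above boils down to translating each entity's results to the reporting currency and removing intercompany amounts so they are not double-counted. A toy arithmetic sketch, with entirely illustrative rates and figures:

```python
# Entity revenue in local currency, with illustrative FX rates to USD.
revenue = {"US": ("USD", 1000.0), "India": ("INR", 8300.0), "Canada": ("CAD", 500.0)}
usd_rates = {"USD": 1.0, "INR": 1 / 83.0, "CAD": 0.74}

# Hypothetical intercompany billing: US billed India $150, which sits in US
# revenue and India cost, so it must be eliminated on consolidation.
intercompany_usd = 150.0

translated = {e: round(amt * usd_rates[ccy], 2) for e, (ccy, amt) in revenue.items()}
consolidated = round(sum(translated.values()) - intercompany_usd, 2)
```

In practice the translation rates, elimination entries, and cumulative translation adjustments follow ASC 830 and the consolidation rules of the applicable GAAP; this only illustrates the mechanics.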

Salesforce Admin Pune 4 - 6 years INR 6.0 - 8.0 Lacs P.A. Work from Office Full Time

Must have 4-6 years of Salesforce development experience (configuration experience):
- Knowledge of Salesforce configurations.
- Ability to design objects and relationships.
- Knows the Salesforce security model.
- Understands QA processes.
- SOQL queries.
- Exposure to Flows.
- Knows best practices for Process Builder.
- Record sharing and groups.

Azure Data Lead Pune 5 - 9 years INR 9.0 - 13.0 Lacs P.A. Work from Office Full Time

Job ID: Azu-ETP-Pun-1090
Location: Pune, Other

The Role: As a Lead Data Engineer specializing in Databricks, you will be a key player in designing, developing, and optimizing our next-generation data platform. You will lead a team of data engineers, providing technical guidance and mentorship and ensuring scalable, high-performance data solutions.

Key Responsibilities:

Technical Leadership:
- Lead the design, development, and implementation of scalable and reliable data pipelines using Databricks, Spark, and other relevant technologies.
- Define and enforce data engineering best practices, coding standards, and architectural patterns.
- Provide technical guidance and mentorship to junior and mid-level data engineers.
- Conduct code reviews and ensure the quality, performance, and maintainability of data solutions.

Databricks Expertise:
- Architect and implement data solutions on the Databricks platform, including Databricks Lakehouse, Delta Lake, and Unity Catalog.
- Optimize Spark workloads for performance and cost efficiency on Databricks.
- Develop and manage Databricks notebooks, jobs, and workflows.
- Proficiently use Databricks features such as Delta Live Tables (DLT), Photon, and SQL Analytics.

Pipeline Development & Operations:
- Develop, test, and deploy robust ETL/ELT pipelines for data ingestion, transformation, and loading from various sources (e.g., relational databases, APIs, streaming data).
- Implement monitoring, alerting, and logging for data pipelines to ensure operational excellence.
- Troubleshoot and resolve complex data-related issues.

Collaboration & Communication:
- Work closely with cross-functional teams including product managers, data scientists, and software engineers.
- Communicate complex technical concepts clearly to both technical and non-technical stakeholders.
- Stay updated with industry trends and emerging technologies in data engineering and Databricks.

Primary Skills:
- Extensive hands-on experience with the Databricks platform, including Databricks Workspace, Spark on Databricks, Delta Lake, and Unity Catalog.
- Strong proficiency in optimizing Spark jobs and understanding Spark architecture.
- Experience with Databricks features like Delta Live Tables (DLT), Photon, and Databricks SQL Analytics.
- Deep understanding of data warehousing concepts, dimensional modeling, and data lake architectures.
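A core Delta Lake operation behind many of the pipelines above is the upsert, expressed on Databricks as `MERGE INTO target USING updates ON ...`. A pure-Python stand-in for that matched-update / unmatched-insert semantics, using made-up rows:

```python
# Target table keyed by id; updates contain one matched row (id=2) and one
# new row (id=3), mirroring WHEN MATCHED THEN UPDATE / WHEN NOT MATCHED THEN INSERT.
target = {1: {"id": 1, "status": "open"}, 2: {"id": 2, "status": "open"}}
updates = [{"id": 2, "status": "closed"}, {"id": 3, "status": "open"}]

for row in updates:
    target[row["id"]] = row  # matched rows are replaced, unmatched rows inserted

merged = sorted(target.values(), key=lambda r: r["id"])
```

On Delta tables the same operation is transactional and scales across a cluster; the point here is only the merge semantics.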

Senior Platform Engineer Pune, Bengaluru 8 - 13 years INR 25.0 - 35.0 Lacs P.A. Work from Office Full Time

About the Role: We are seeking a seasoned Senior Platform Engineer / Architect with deep expertise in Databricks, Unity Catalog, and Azure Data Factory (ADF), who can balance hands-on delivery with strategic leadership. This role requires strong technical capability combined with the ability to engage business stakeholders, drive platform modernization conversations, and provide thought leadership on data governance and engineering best practices.

Key Responsibilities:
- Architect and lead platform modernization initiatives, with a focus on Databricks Unity Catalog, data governance, and enterprise-grade scalability.
- Drive the design and implementation of data pipelines using Azure Data Factory, integrating diverse data sources into the Lakehouse ecosystem.
- Partner with business and technology stakeholders to define a strategic roadmap for platform evolution, governance, and self-service enablement.
- Provide thought leadership on data access, lineage, discoverability, and platform observability, ensuring compliance with enterprise security standards.
- Standardize and scale data ingestion, orchestration, and transformation frameworks using ADF, Spark, and Databricks.
- Lead reference architecture development, technical evaluations, and POCs to validate future-state designs.
- Collaborate with engineering teams on CI/CD, infrastructure-as-code, monitoring, and automation to ensure operational excellence.
- Mentor teams on best practices across data engineering, platform configuration, and governance workflows.

Required Qualifications:
- 8+ years of experience in data platform engineering or architecture, with demonstrable hands-on work in enterprise environments.
- Expertise in Databricks and Unity Catalog, including data security, access control, governance, and catalog management.
- Proven experience designing and orchestrating pipelines using Azure Data Factory, including parameterized and dynamic pipelines.
- Deep knowledge of Azure services, including ADLS Gen2, Azure DevOps, Key Vault, and networking considerations.
- Hands-on skills in Python/Scala, Spark, SQL, and data orchestration tools.
- Experience with infrastructure-as-code (Terraform, Bicep, ARM templates) and CI/CD automation.
- Strong communication skills, with a proven track record of driving business-facing conversations and aligning stakeholders on platform decisions.

Python Lead Pune 8 - 12 years INR 10.0 - 14.0 Lacs P.A. Work from Office Full Time

We are looking for members with hands-on Python experience who will work on internal and customer-based projects for Bridgenext. We are looking for someone who cares about the quality of code and who is passionate about providing the best solution to meet client needs and anticipate their future needs based on an understanding of the market. Someone who has worked on platform development projects, including processing using various AWS services.

Must Have Skills:
- 8-12 years of overall experience.
- Strong programming experience with Python and the ability to write modular code following Python best practices, backed by unit tests with a high degree of coverage.
- Knowledge of source control (Git/GitLab).
- Understanding of deployment patterns, along with knowledge of CI/CD and build tools.
- Knowledge of Kubernetes concepts and commands is a must.
- Knowledge of monitoring and alerting tools like Grafana and OpenTelemetry is a must.
- Knowledge of Astro/Airflow is a plus.
- Knowledge of data governance is a plus.
- Experience with cloud providers, preferably AWS.
- Experience with PySpark, Snowflake, and DBT is good to have.

Professional Skills:
- Solid written, verbal, and presentation communication skills.
- Strong team and individual player.
- Maintains composure during all types of situations and is collaborative by nature.
- High standards of professionalism, consistently producing high-quality results.
- Self-sufficient and independent, requiring very little supervision or intervention.
- Demonstrates flexibility and openness to bring creative solutions to address issues.
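The "modular code backed by unit tests" expectation above can be sketched with a small utility and a matching test; the function and its behavior are illustrative, not part of any Bridgenext codebase:

```python
def chunk(items, size):
    """Split a sequence into fixed-size chunks; the last chunk may be shorter."""
    if size < 1:
        raise ValueError("size must be >= 1")
    return [items[i:i + size] for i in range(0, len(items), size)]

# A unit test of the kind the posting expects (runnable under pytest as-is).
def test_chunk():
    assert chunk([1, 2, 3, 4, 5], 2) == [[1, 2], [3, 4], [5]]
    assert chunk([], 3) == []

test_chunk()
```

Keeping functions this small and pure is what makes the high coverage the posting asks for cheap to achieve.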

Data Engineer - Python, Airflow, Spark Pune, Bengaluru 3 - 5 years INR 15.0 - 20.0 Lacs P.A. Work from Office Full Time

Bridgenext is a global consulting company that provides technology-empowered business solutions for world-class organizations. Our global workforce of over 800 consultants provides best-in-class services to our clients to realize their digital transformation journey. Our clients span the emerging, mid-market, and enterprise space. With multiple offices worldwide, we are uniquely positioned to deliver digital solutions to our clients leveraging Microsoft, Java, and open source, with a focus on Mobility, Cloud, Data Engineering, and Intelligent Automation. Emtec's singular mission is to create "Clients for Life": long-term relationships that deliver rapid, meaningful, and lasting business value.

At Bridgenext, we have a unique blend of corporate and entrepreneurial cultures. This is where you would have an opportunity to drive business value for clients while you innovate, continue to grow, and have fun while doing it. You would work with team members who are vibrant, smart, and passionate, and they bring their passion to all that they do, whether it's learning, giving back to our communities, or always going the extra mile for our clients.

Position Description: We are looking for members with hands-on data engineering experience who will work on internal and customer-based projects for Bridgenext. We are looking for someone who cares about the quality of code and who is passionate about providing the best solution to meet client needs and anticipate their future needs based on an understanding of the market. Someone who has worked on Hadoop projects, including processing and data representation using various AWS services.

Must Have Skills:
- 3-5 years of overall experience.
- Strong programming experience with Python and Spark.
- Experience with unit testing, debugging, and performance tuning.
- Experience with cloud platforms (AWS preferred).
- Experience with CI/CD GitLab pipelines.
- Familiarity with workflow management tools like Airflow.
- Experience with Docker, Kubernetes, and DBT is a plus.
- Good to have: experience with infrastructure-as-code technologies such as Terraform and Ansible.
- Good to have: experience in Snowflake modelling (roles, schemas, databases).

Professional Skills:
- Solid written, verbal, and presentation communication skills.
- Strong team and individual player.
- Maintains composure during all types of situations and is collaborative by nature.
- High standards of professionalism, consistently producing high-quality results.
- Self-sufficient and independent, requiring very little supervision or intervention.
- Demonstrates flexibility and openness to bring creative solutions to address issues.
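Airflow, named above, expresses pipelines as DAGs: `extract >> transform >> load` declares that each task runs only after its upstream finishes. A minimal pure-Python stand-in for that dependency-respecting ordering (task names are hypothetical; a real DAG would use airflow operators):

```python
# Upstream dependencies per task, as an Airflow DAG would record them.
deps = {"extract": [], "transform": ["extract"], "load": ["transform"]}

order, done = [], set()

def visit(task):
    """Depth-first walk that schedules every upstream task before `task`."""
    for upstream in deps[task]:
        if upstream not in done:
            visit(upstream)
    if task not in done:
        done.add(task)
        order.append(task)

for t in deps:
    visit(t)
```

The resulting `order` is a topological ordering of the tasks, which is the guarantee the Airflow scheduler provides (plus retries, scheduling intervals, and parallelism for independent branches).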

Lead Software Engineer Pune 8 - 13 years INR 25.0 - 30.0 Lacs P.A. Work from Office Full Time

We're building the next generation of agentic AI platforms: voice-first, adaptive, and outcome-driven. Our platform-as-a-service (PaaS) solution enables businesses to deploy intelligent, autonomous agents that drive real value through voice-enabled interfaces. We're seeking a visionary Lead Software Engineer to guide our technical strategy and help build systems that scale with performance and precision.

Role Overview: As a Lead Software Engineer, you will drive the development and delivery of core components in our agentic AI platform. You will play a hands-on leadership role in designing, building, and optimizing voice agent technologies, while fostering a high-performance engineering culture focused on delivering measurable business outcomes. This is a high-impact role that requires technical depth, leadership, and a bias for action.

Key Responsibilities:
- Lead architecture and full-stack development of PaaS solutions in the agentic AI and voice agent space.
- Collaborate cross-functionally with product, design, and business teams to define and deliver strategic initiatives.
- Drive hands-on development, including writing and reviewing code across the stack, from backend services to voice interface integrations.
- Champion CI/CD best practices and modern DevOps workflows to ensure rapid, reliable, and secure deployment cycles.
- Identify and integrate emerging technologies, especially generative AI, as accelerators for product development and quality assurance.
- Mentor and inspire engineers to reach high standards of performance, ownership, and technical excellence.
- Ensure that all solutions are scalable, maintainable, and aligned with business goals and user needs.

Required Qualifications:
- 8+ years of experience in software engineering, with at least 3 years in a technical leadership or team lead role.
- Proven track record of delivering PaaS solutions in the agentic AI or voice AI domain.
- Deep expertise in voice agent technologies (e.g., ASR, TTS, NLU/NLP systems, conversational AI frameworks).
- Demonstrated success in driving projects that resulted in clear and measurable business outcomes.
- Strong proficiency in full-stack development (e.g., Node.js, Python, React, TypeScript).
- Experience building and maintaining robust CI/CD pipelines (e.g., GitHub Actions, Jenkins, CircleCI).
- Hands-on experience leveraging generative AI tools (e.g., LLMs, code generation, prompt engineering) to enhance engineering velocity.
- Excellent communication, collaboration, and team-building skills.

Preferred Qualifications:
- Experience with cloud-native architectures (AWS, GCP, or Azure).
- Familiarity with real-time systems, WebRTC, or telephony integrations (e.g., Twilio, SIP).
- Background in human-computer interaction, applied ML/AI, or systems for autonomous agents.
- Startup or high-growth environment experience.
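A voice agent pipeline of the kind described above typically runs ASR to get a transcript, an NLU step to map it to an intent, and a handler to produce the response. A toy sketch of just the routing layer; intent names, keyword rules, and replies are invented for illustration (production systems use trained NLU models, not keyword matching):

```python
def classify(transcript: str) -> str:
    """Map an ASR transcript to an intent label (toy keyword-based NLU)."""
    text = transcript.lower()
    if "balance" in text:
        return "check_balance"
    if "agent" in text or "human" in text:
        return "escalate"
    return "fallback"

# Each intent dispatches to a handler; the reply would go to a TTS engine.
handlers = {
    "check_balance": lambda: "Your balance is ...",
    "escalate": lambda: "Connecting you to a human agent.",
    "fallback": lambda: "Sorry, could you rephrase that?",
}

reply = handlers[classify("What's my account balance?")]()
```

Real platforms add dialogue state, slot filling, and barge-in handling on top, but the transcript-to-intent-to-handler flow is the shared skeleton.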