TransOrg Analytics is a data analytics and consulting firm specializing in leveraging data science to drive strategic business decisions. It offers a range of services including predictive analytics, business intelligence, and operational analytics.
Bengaluru, Mumbai (All Areas)
INR 10.0 - 20.0 Lacs P.A.
Work from Office
Full Time
Domain: Retail Banking / Credit Cards
Location: Mumbai / Bengaluru
Experience: 3-5 years
Industry: Banking / Financial Services (Mandatory)

Why would you like to join us?
TransOrg Analytics specializes in Data Science, Data Engineering and Generative AI, providing advanced analytics solutions to industry leaders and Fortune 500 companies across India, the US, APAC and the Middle East. We leverage data science to streamline, optimize, and accelerate our clients' businesses. Visit www.transorg.com to know more about us.

What do we expect from you?
- Build and validate credit risk models, including application scorecards and behavior scorecards (B-score), aligned with business and regulatory requirements (see the sketch after this posting).
- Use machine learning algorithms such as Logistic Regression, XGBoost, and Clustering to develop interpretable, high-performance models.
- Translate business problems into data-driven solutions using robust statistical and analytical methods.
- Collaborate with cross-functional teams including credit policy, risk strategy, and data engineering to ensure effective model implementation and monitoring.
- Maintain clear, audit-ready documentation for all models and comply with internal model governance standards.
- Track and monitor model performance, proactively suggesting recalibrations or enhancements as needed.

What do you need to excel at?
- Writing efficient and scalable code in Python, SQL, and PySpark for data processing, feature engineering, and model training.
- Working with large-scale structured and unstructured data in a fast-paced banking or fintech environment.
- Deploying and managing models using MLflow, with a strong understanding of version control and model lifecycle management.
- Understanding retail banking products, especially credit card portfolios, customer behavior, and risk segmentation.
- Communicating complex technical outcomes clearly to non-technical stakeholders and senior management.
- Applying a structured problem-solving approach and delivering insights that drive business value.

What are we looking for?
- Bachelor's or Master's degree in Statistics, Mathematics, Computer Science, or a related quantitative field.
- 3-5 years of experience in credit risk modelling, preferably in retail banking or credit cards.
- Hands-on expertise in Python, SQL and PySpark, and experience with MLflow or equivalent MLOps tools.
- Deep understanding of machine learning techniques including Logistic Regression, XGBoost, and Clustering.
- Proven experience in developing application scorecards and behavior scorecards using real-world banking data.
- Strong documentation and compliance orientation, with an ability to work within regulatory frameworks.
- Curiosity, accountability, and a passion for solving real-world problems using data.
- Cloud knowledge, JIRA, GitHub (good to have).
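For illustration, a minimal Python sketch of the scorecard workflow described above: fit a logistic regression, check discrimination via the Gini coefficient, and scale predicted log-odds to scorecard points. The synthetic data, ~10% bad rate, and PDO scaling parameters (600 points at 50:1 odds, 20 points to double the odds) are assumptions for the example, not TransOrg's actual models.

```python
# Illustrative application-scorecard sketch; synthetic data stands in for real applications.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=10_000, n_features=8, weights=[0.9], random_state=42)  # ~10% "bad"
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, stratify=y, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
proba_bad = model.predict_proba(X_test)[:, 1]
print(f"Gini: {2 * roc_auc_score(y_test, proba_bad) - 1:.3f}")  # common scorecard discrimination metric

# Scale log-odds to points: 600 points at 50:1 good:bad odds, PDO of 20 (assumed calibration).
pdo, base_score, base_odds = 20.0, 600.0, 50.0
factor = pdo / np.log(2)
offset = base_score - factor * np.log(base_odds)
scores = offset + factor * np.log((1 - proba_bad) / proba_bad)
print(f"Score range: {scores.min():.0f} - {scores.max():.0f}")
```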
Gurugram, Delhi / NCR
INR 8.0 - 12.0 Lacs P.A.
Work from Office
Full Time
Job Title: Power Apps Developer (3-5 Years Experience)
Location: Gurgaon
Job Type: Full-Time

Job Summary:
We are seeking a skilled and proactive Power Apps Developer with 3 to 5 years of experience building robust, scalable applications on the Microsoft Power Platform. The ideal candidate will bring strong expertise in Power Apps (Canvas and Model-Driven), with a functional orientation towards web and app development. Familiarity with Azure services and Snowflake is essential, as you will be working on data-rich applications integrated into enterprise cloud environments.

You will be responsible for designing, developing, testing, deploying, and maintaining Power Apps solutions to meet business requirements. You will collaborate with stakeholders, develop complex data models, integrate them with Power Apps, and ensure applications are performant, scalable, and secure. This includes utilizing Power Platform tools such as Power Apps, Power Automate, and Power BI.

Key Responsibilities:
- Design, develop, and maintain scalable apps using Microsoft Power Apps (Canvas & Model-Driven).
- Collaborate with stakeholders to gather and analyze functional requirements and translate them into user-friendly business applications.
- Integrate Power Apps solutions with Azure services, Snowflake, SharePoint, Dataverse, and other cloud data sources.
- Build and manage Power Automate flows to automate business processes.
- Implement responsive and intuitive UI/UX for mobile and web platforms.
- Participate in system architecture planning and contribute to best practices in application design and lifecycle management.
- Troubleshoot, debug, and resolve technical issues across environments.
- Maintain documentation, deployment scripts, and operational support materials.
- Work closely with cross-functional teams including analysts, cloud engineers, and data teams.

Required Qualifications:
- 2-3 years of hands-on experience in Power Apps development.
- Strong knowledge of Power Platform components including Power Apps, Power Automate, and Dataverse.
- Proficiency in JavaScript (including frameworks such as React.js for front-end development), HTML, and CSS for custom front-end functionality.
- Experience integrating with Azure services such as Azure Functions, Logic Apps, and Key Vault.
- Expertise in ASP.NET front ends, SAP, and SQL databases.
- Hands-on experience with Snowflake and its integration into apps or workflows.
- Strong understanding of REST APIs, connectors, and integration patterns (see the sketch after this posting).
- Excellent problem-solving skills and a user-centric mindset.
- Ability to communicate technical concepts effectively with non-technical stakeholders.
- RPA: understanding of Robotic Process Automation.
- PCF: experience with the Power Apps Component Framework.
- OOB connectors: experience with out-of-the-box connectors in Power Apps.
- Premium connectors: experience with premium connectors in Power Apps.

Preferred Qualifications:
- Microsoft Power Platform certifications (e.g., PL-100, PL-400).
- Familiarity with Agile methodologies and tools like Azure DevOps or Jira.
- Knowledge of data governance, security roles, and access control within Power Platform.
- Experience with CI/CD pipelines for Power Platform.
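As a flavor of the REST-API integration work mentioned above, here is a hedged Python sketch that reads rows from Dataverse via its Web API. The environment URL, token value, and column choices are placeholders, not a specific TransOrg setup; a production app would acquire the token via Azure AD (e.g., MSAL) and often use Power Platform connectors instead of raw HTTP.

```python
# Hypothetical Dataverse Web API read; all environment-specific values are placeholders.
import requests

ORG_URL = "https://yourorg.api.crm.dynamics.com"  # placeholder environment URL
TOKEN = "..."  # acquire via Azure AD (e.g., MSAL) in a real application

resp = requests.get(
    f"{ORG_URL}/api/data/v9.2/accounts",  # 'accounts' is a standard Dataverse table
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/json",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
    },
    params={"$select": "name,telephone1", "$top": "5"},  # OData query options
    timeout=30,
)
resp.raise_for_status()
for row in resp.json()["value"]:  # Dataverse returns results in the OData "value" array
    print(row.get("name"), row.get("telephone1"))
```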
Gurugram
INR 11.0 - 15.0 Lacs P.A.
Work from Office
Full Time
Pickl.AI (TransOrg's education brand) is looking for an instructor who is technically immersed in data science and data engineering. We are looking for a creative instructor who wants to accelerate their exposure to many areas of Machine Learning, loves wearing multiple hats and can take full ownership of their work.

Responsibilities:
• Design and deliver data science and data engineering training programs to students at Pickl.AI partner institutions
• Teach and mentor students on topics such as data analysis, machine learning, statistics, data visualisation, and other relevant topics
• Create and develop instructional materials, including lesson plans, presentations, assignments, and assessments
• Keep up to date with the latest developments in data science and incorporate new and emerging trends into the curriculum
• Include hands-on and relevant case studies in the topics that you are teaching
• Provide guidance and support to students throughout their learning journey, including answering questions and providing feedback on assignments and projects
• Collaborate with other instructors and team members to continuously improve the curriculum and training programs
• Participate in meetings and training sessions to enhance instructional skills and techniques
• Maintain accurate and up-to-date records of student attendance, progress, and grades

Requirements:
• Master's degree or Ph.D. in Computer Science, Data Science, Statistics, or a related field preferred
• Excellent knowledge and understanding of data science concepts, techniques, and tools
• Strong presentation and communication skills
• Ability to work independently and in a team environment
• Experience in teaching or mentoring students in a classroom or online setting is a plus
• Passion for teaching and helping others learn

About the company:
TransOrg Analytics has over a decade of specialization in machine learning and data science consulting. Pickl.AI is the education brand of TransOrg Analytics. Visit us at https://pickl.ai and www.transorg.com for details.
Gurugram
INR 15.0 - 30.0 Lacs P.A.
Work from Office
Full Time
Why would you like to join us?
TransOrg Analytics specializes in Data Science, Data Engineering and Generative AI, providing advanced analytics solutions to industry leaders and Fortune 500 companies across India, the US, APAC and the Middle East. We leverage data science to streamline, optimize, and accelerate our clients' businesses. Visit www.transorg.com to know more about us.

What do we expect from you?
- Analyze & interpret data and communicate results to clients, often with the aid of mathematical/statistical techniques and software.
- Data exploration, statistical/predictive modelling, data analysis and hypothesis testing; design, development and deployment of predictive models and frameworks.
- Explain complex statistical concepts in a way that clients can understand, and advise on strategy.
- Own the overall data analytics strategy and contribute to business development.
- Provide thought leadership and support to the team to develop best-in-class analytics solutions for clients.
- Manage existing relationships with multiple clients in various industries.
- Work closely with senior management in developing and implementing strategic plans.

Team Development:
- Manage a team of data scientists: lead team members, support their personal development, and set work direction.
- Grow the group by building expertise in new domains and processes.
- Coach and mentor team members.
- Take responsibility for the productivity, utilization and quality of output of the group.
- Develop training modules and impart training to the group.

Project Delivery:
- Lead analytics engagements and deliver the analytics solutions required across multiple clients & projects.
- Provide thought leadership in areas including predictive modeling, customer segmentation, reporting & analysis, data processing and cleansing.
- Demonstrate strong functional expertise by providing content leadership on projects.
- Bring hands-on experience in implementing machine-learning algorithms.
- Improve existing delivery processes and lead automation where applicable.

What are we looking for?
- Bachelor's in Computer Science/Engineering, Statistics, Math or any other relevant degree.
- 5-9 years of total experience, with at least 3 years of relevant data science/technical experience.
- Experience with multiple modeling techniques: Logistic Regression, Linear Regression, Random Forest, Boosting, Neural Networks, etc. (compared in the sketch below).
- Hands-on experience with Python and SQL is a must.
- Experience with data visualization tools like Tableau or Power BI.
- Team and client management experience, including stakeholder management.
- Experience working with unstructured data and text analytics.
- A passion for data, structured and unstructured, with sound experience in data mining and data analysis.
- Good presentation skills.
- Track record of managing data project delivery, including the ability to meet deadlines, overcome challenges, manage stakeholder expectations, and produce clear deliverables.
- Strong problem-solving skills with keen attention to detail; ability to think critically and provide data-backed insights.
- Excellent communication skills, both verbal and written.
- Understanding of cloud platforms (e.g. Azure/AWS/GCP) and ability to use them for developing, training and testing deep learning models (good to have).
- Familiarity with cloud-based data warehousing platforms (Snowflake) (good to have).
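As a minimal illustration of the modeling techniques listed in the requirements above, the following Python sketch scores three of them with cross-validated AUC on synthetic data; real engagements would start from client data and careful feature engineering.

```python
# Compare several listed modeling techniques via 5-fold cross-validated AUC.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=5_000, n_features=12, random_state=0)  # synthetic stand-in data

models = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "Gradient Boosting": GradientBoostingClassifier(random_state=0),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean CV AUC = {auc:.3f}")
```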
Gurugram, Mumbai (All Areas)
INR 10.0 - 20.0 Lacs P.A.
Work from Office
Full Time
Why would you like to join us?
TransOrg Analytics specializes in Data Science, Data Engineering and Generative AI, providing advanced analytics solutions to industry leaders and Fortune 500 companies across India, the US, APAC and the Middle East. We leverage data science to streamline, optimize, and accelerate our clients' businesses. Visit www.transorg.com to know more about us.

Job Location: Gurgaon/Mumbai

What do we expect from you?
Analyze & interpret data and communicate results to clients, often with the aid of mathematical/statistical techniques and software. This role requires:
- Data exploration, statistical/predictive modelling, data analysis and hypothesis testing.
- Design, development and deployment of predictive models and frameworks.
- Explaining complex statistical concepts in a way that clients can understand, and advising on strategy.

What do you need to excel at?
- Quantitative analysis, data mining, trend analysis, customer profiling, clustering and predictive modelling (see the sketch after this posting).
- Executing the data-driven planning process by building models and frameworks that connect business-unit drivers to company financials, and producing forecasts that support sound business decisions.
- Handling assigned tasks as an individual contributor while demonstrating excellent teamwork.
- Collaborating with other data analysts to provide development coverage and support, and sharing knowledge with and mentoring junior team members.

What are we looking for?
- Bachelor's in Computer Science, Engineering, Statistics, Maths or a related quantitative degree.
- 3-5 years of relevant experience in a data analysis or related role.
- Experience in the BFSI and CPG domains is preferred.
- Experience in data pre-processing and data manipulation using Python or SQL.
- Some experience with statistical analysis and modeling techniques (e.g., regression, classification, clustering).
- Experience with machine learning and deep learning frameworks and libraries (good to have).
- Familiarity with data processing tools and frameworks.
- Experience with data visualization tools like Tableau or Power BI.
- Track record of managing data project delivery, including the ability to meet deadlines, overcome challenges, manage stakeholder expectations, and produce clear deliverables.
- Strong problem-solving skills with keen attention to detail; ability to think critically and provide data-backed insights.
- Excellent communication skills, both verbal and written.
- Understanding of cloud platforms (e.g. Azure/AWS/GCP) and ability to use them for developing, training and testing deep learning models (good to have).
- Familiarity with cloud-based data warehousing platforms (Snowflake) (good to have).
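A minimal sketch of the customer profiling and clustering work listed above: k-means segmentation over stand-in customer features. The features, their distributions, and the choice of four segments are assumptions for illustration only.

```python
# Illustrative customer segmentation with k-means; all data is synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# Stand-in customer features: annual spend, visits per year, average basket size.
X = np.column_stack([
    rng.gamma(2.0, 500.0, size=1_000),
    rng.poisson(12, size=1_000),
    rng.normal(40, 10, size=1_000),
])

X_scaled = StandardScaler().fit_transform(X)  # scale before distance-based clustering
kmeans = KMeans(n_clusters=4, n_init=10, random_state=7).fit(X_scaled)

for c in range(4):
    seg = X[kmeans.labels_ == c]
    print(f"Segment {c}: n={len(seg)}, mean annual spend={seg[:, 0].mean():.0f}")
```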
Gurugram, Bengaluru, Mumbai (All Areas)
INR 12.0 - 20.0 Lacs P.A.
Work from Office
Full Time
Job Description: Data Engineer - TransOrg Analytics

Why would you like to join us?
TransOrg Analytics specializes in Data Science, Data Engineering and Generative AI, providing advanced analytics solutions to industry leaders and Fortune 500 companies across India, the US, APAC and the Middle East. We leverage data science to streamline, optimize, and accelerate our clients' businesses. Visit www.transorg.com to know more about us.

Responsibilities:
- Design, develop, and maintain robust data pipelines using Azure Data Factory and Databricks workflows.
- Develop an integrated data solution in Snowflake to unify data.
- Implement and manage big data solutions using Azure Databricks.
- Design and maintain relational databases using Azure Delta Lake.
- Ensure data quality and integrity through rigorous testing and validation.
- Monitor and troubleshoot data pipelines and workflows to ensure seamless operation.
- Implement data security and compliance measures in line with industry standards.
- Continuously improve data infrastructure (including CI/CD) for scalability and performance.
- Design, develop, and maintain ETL processes to extract, transform, and load data from various sources into Snowflake.
- Utilize ETL tools (e.g., ADF, Talend) to automate and manage data workflows.
- Develop and maintain CI/CD pipelines using GitHub and Jenkins for automated deployment of data models and ETL processes.
- Monitor and troubleshoot pipeline issues to ensure smooth deployment and integration.
- Design and implement scalable and efficient data models in Snowflake; optimize data structures for performance and storage efficiency.
- Collaborate with stakeholders to understand data requirements and ensure data integrity.
- Integrate multiple data sources to create a data lake/data mart.
- Perform data ingestion and ETL processes using SQL, Sqoop, Spark or Hive.
- Monitor job performance; manage file system/disk space, cluster & database connectivity, log files, and backup/security; troubleshoot various user issues.
- Design, implement, test and document a performance benchmarking strategy for platforms as well as for different use cases.
- Set up, administer, monitor, tune, optimize and govern large-scale implementations.
- Drive customer communication during critical events and participate in or lead various operational improvement initiatives.

Qualifications, skill set and competencies:
- Bachelor's in Computer Science, Engineering, Statistics, Maths or a related quantitative degree.
- 2-5 years of relevant experience in data engineering.
- Must have worked on one of the cloud engineering platforms: AWS, Azure, GCP, Cloudera.
- Proven experience as a Data Engineer with a focus on Azure cloud technologies/Snowflake.
- Strong proficiency in Azure Data Factory, Azure Databricks, ADLS, and Azure SQL Database.
- Experience with big data processing frameworks like Apache Spark.
- Expert-level proficiency in SQL and experience with data modeling and database design.
- Knowledge of data warehousing concepts and ETL processes.
- Strong focus on PySpark, Scala and Pandas.
- Proficiency in Python programming and experience with other data processing frameworks.
- Solid understanding of networking concepts and Azure networking solutions.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
- Azure certifications AZ-900 and DP-203 (good to have).
- Familiarity with DevOps practices and tools for CI/CD in data engineering.
- Certification: MS Azure / Databricks Data Engineer (good to have).
- Data ingestion: coding & automating ETL pipelines, both batch & streaming; experience with ETL or ELT methodologies using any of the traditional and new-age tech stack (SSIS, Informatica, Databricks, Talend, Glue, DMS, ADF, Spark, Kafka, Storm, Flink, etc.) - see the PySpark sketch below.
- Data transformation: experience with MPPs and big data & distributed computing frameworks on a cloud or cloud-agnostic tech stack (Databricks, EMR, Hadoop, dbt, Spark, etc.).
- Data storage: experience with data lakes and lakehouse architecture (S3, ADLS, Blob, HDFS).
- Data warehousing: strong experience modelling and implementing data warehouses on technologies like Redshift, Snowflake, Azure Synapse, BigQuery, Hive.
- Orchestration & lineage: Airflow, Oozie, etc.
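A minimal PySpark sketch of the batch ETL work this role describes: ingest raw CSV, apply basic data-quality transformations, aggregate, and write a curated Delta table. The paths, schema, and Delta output are illustrative assumptions, not a real client pipeline (writing Delta requires a runtime such as Databricks).

```python
# Illustrative batch ETL: raw CSV -> cleaned -> daily aggregate -> Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("/mnt/raw/orders/")  # placeholder landing path
)

cleaned = (
    raw.dropDuplicates(["order_id"])  # basic data-quality step
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") > 0)
)

daily = cleaned.groupBy(F.to_date("order_ts").alias("order_date")).agg(
    F.count("*").alias("orders"),
    F.sum("amount").alias("revenue"),
)

# Write the curated output as Delta (placeholder path; Delta Lake runtime assumed).
daily.write.format("delta").mode("overwrite").save("/mnt/curated/daily_orders/")
```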