Nirmalya Ghosh Problem Solver | Technologist

Using Mixtral 8x7B For NLP Tasks On Small GPUs

Large language models (LLMs) are made up of billions of parameters, which poses challenges when loading them into GPU memory for inference or fine-tuning. This post briefly explains the challenges and describes a solution for loading Mixtral 8x7B, a state-of-the-art (SOTA) LLM, onto consumer-grade GPUs, followed by using the model for NLP tasks such as Named Entity Recognition (NER), Sentiment Analysis, and Text Classification.
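
The post details the full approach; as a minimal sketch, one common way to fit Mixtral 8x7B onto a consumer-grade GPU is 4-bit quantization with Hugging Face transformers and bitsandbytes (the quantization settings and the zero-shot sentiment prompt below are illustrative assumptions, not necessarily the post's exact configuration):

```python
# Minimal sketch: load Mixtral 8x7B in 4-bit precision so it fits on a
# consumer-grade GPU (assumes transformers, accelerate and bitsandbytes are installed).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

# 4-bit NF4 quantization keeps memory usage low at a small accuracy cost.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # spread layers across available GPU(s) and CPU
)

# Example NLP task: zero-shot sentiment analysis via prompting.
prompt = "[INST] Classify the sentiment of this review as positive or negative: 'The battery dies within an hour.' [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```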

Continue reading ...

13 Ways To Speedup Python Loops

Thirteen simple ways to achieve 1.3x to 970x speedups of Python for loops with minimal effort.
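
The thirteen techniques are covered in the post; as one illustrative example of the kind of rewrite involved (not necessarily one of the post's thirteen), replacing an explicit Python for loop with NumPy vectorisation:

```python
# Illustrative speedup: an explicit for loop vs. a NumPy-vectorised equivalent.
import timeit
import numpy as np

values = list(range(1_000_000))

def loop_sum_of_squares(xs):
    # Pure-Python loop: one bytecode-interpreted iteration per element.
    total = 0
    for x in xs:
        total += x * x
    return total

def numpy_sum_of_squares(xs):
    # Vectorised version: the work happens in compiled NumPy code.
    arr = np.asarray(xs)
    return int(np.dot(arr, arr))

print("for loop:", timeit.timeit(lambda: loop_sum_of_squares(values), number=10))
print("numpy   :", timeit.timeit(lambda: numpy_sum_of_squares(values), number=10))
```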

Continue reading ...

High-Quality Annotations For Custom NER, With Reduced Human Effort: Using ChatGPT

Developing custom Named Entity Recognition (NER) models for specific use cases depends on the availability of high-quality annotated datasets, which can be expensive. As someone who has worked on several real-world use cases, I know the challenges all too well. This post describes a few real-world challenges, a solution which reduces human effort whilst maintaining high quality, and code snippets for the solution.
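
The post walks through the full workflow; as a hedged sketch of the core idea, ChatGPT can pre-annotate text with candidate entities that a human then reviews and corrects (the model name, prompt wording, and entity labels below are illustrative assumptions, not the post's exact setup):

```python
# Sketch: ask ChatGPT to pre-annotate a sentence with entities, then have a
# human annotator review the suggestions. Prompt and labels are illustrative.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

text = "Acme Corp opened a new office in Singapore in March 2023."

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any chat-capable model works
    messages=[
        {
            "role": "system",
            "content": (
                "You are an annotation assistant. Extract named entities from the "
                "user's text and return JSON: a list of objects with 'text' and "
                "'label' (ORG, LOC, or DATE)."
            ),
        },
        {"role": "user", "content": text},
    ],
    temperature=0,
)

# Candidate annotations, to be verified by a human before training a custom NER model.
print(response.choices[0].message.content)
```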

Continue reading ...

Coreference Resolution

Coreference resolution refers to the task of identifying all the expressions in a text that refer to the same entity, such as pronouns, nouns, or noun phrases, and linking them to the entity they refer to. This post, inspired by a real-world problem, describes a few challenges and explores a few approaches, along with code snippets.
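
The post compares several approaches; as a minimal sketch of what the task looks like in code, assuming the fastcoref library (an assumption for illustration, not necessarily the approach taken in the post):

```python
# Sketch: resolve coreferences with the fastcoref library (pip install fastcoref).
from fastcoref import FCoref

model = FCoref()  # downloads a pretrained coreference model on first use

text = "Alice said she would send the report after she reviewed it."
preds = model.predict(texts=[text])

# Each cluster groups mentions that refer to the same entity,
# e.g. one cluster for Alice/she/she and one for the report/it.
print(preds[0].get_clusters())
```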

Continue reading ...