---
pipeline_tag: multilabel-classification
---

# Multi Label Classification

Description: Labels the product comments in the e-commerce dataset we created according to the labels we specified. For each comment, the model returns a positive or negative value for the product and for the shipping. If an aspect (product or shipping) is not mentioned in the comment, `None` is returned for that aspect.

In this project I used the model "meta-llama/Meta-Llama-3-8B-Instruct".

The quantization method is configured like this:

```python
import torch
from transformers import BitsAndBytesConfig

# Quantization is done with 4-bit precision.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # shrink the model as much as possible to reduce cost, at the price of some accuracy
    bnb_4bit_quant_type="nf4",             # alternative is "fp4"; nf4 uses less memory and is faster, but slightly less accurate
    bnb_4bit_compute_dtype=torch.float16,  # reduces cost and increases speed
    bnb_4bit_use_double_quant=True,        # True increases accuracy but makes the model more costly to run
)
# nf4 saves more memory
```

This code contains the key steps for optimizing the LLM's memory usage and processing time, which is important for improving the model's performance and resource utilization.

```python
from transformers import AutoModelForCausalLM

base_model = "meta-llama/Meta-Llama-3-8B-Instruct"

# ~20 GB model, loaded in 4-bit
model = AutoModelForCausalLM.from_pretrained(
    pretrained_model_name_or_path=base_model,
    quantization_config=quant_config,
    device_map={"": 0},
    # use_auth_token=True
)
model.config.use_cache = False
model.config.pretraining_tp = 1
```
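
For context, the sketch below shows one way the quantized model could be prompted to label a single comment. It reuses the `model` loaded above; the prompt wording, the `classify_comment` helper, and the `product` / `shipping` answer format are illustrative assumptions, not part of the original project.

```python
from transformers import AutoTokenizer

# Minimal inference sketch, assuming a chat-style prompt and a short textual answer.
tokenizer = AutoTokenizer.from_pretrained(base_model)

def classify_comment(comment: str) -> str:
    # Hypothetical instruction: ask for one label per aspect (product, shipping),
    # each of which should be positive, negative, or None.
    messages = [
        {"role": "system",
         "content": "Label the e-commerce comment. For 'product' and 'shipping' answer "
                    "positive, negative, or None if the aspect is not mentioned."},
        {"role": "user", "content": comment},
    ]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output_ids = model.generate(input_ids, max_new_tokens=32, do_sample=False)
    # Decode only the newly generated tokens (the model's answer).
    return tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)

print(classify_comment("The package arrived late, but the phone itself works great."))
# Expected style of answer: product: positive, shipping: negative
```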