Large Language Models are remarkably powerful, and they are trained on massive datasets. But how many of us, while on our founder journey, have the budget to train these models in-house? Probably very few.
But wait, can these pre-trained LLMs still help us? Of course, as long as they are open source. Fortunately, quite a few are available today.
So how exactly do they help? Most of us in the AI space are aware of RAG, or have at least heard of it. Let me give a simple one-line definition. Retrieval-Augmented Generation is exactly what it sounds like - it retrieves data from the external sources we provide and augments the LLM's output with it.
It is very useful because it leverages the generative power of LLMs while adding external knowledge we supply, so the output is grounded in the corpus we care about. And if that external corpus is limited, we can let the model fall back on the LLM's general knowledge.
I am particularly passionate about how we eat, and I deeply believe in the idea of "garbage in, garbage out" when it comes to food and the body. If we feed ourselves healthy, natural foods, we reflect nature - strong, vibrant, and unstoppable. But if we consume artificial, lifeless food, we start to look and feel the same - sluggish and unnatural. One of the worst consequences of today's overconsumption of artificial and processed food is diabetes.
And who truly understands the real pain points of diabetes? Easy - the people who experience it firsthand. Given my interest in retrieving experiential data about diabetes using LLMs, I ran this experiment with Ollama - one of the many open-source LLM tools that can be used for tasks like this.
I am sharing my notebook step by step, explaining each stage. In addition, to make it easier to follow, I have included a high-level architecture diagram.
Step 1: Fetch and parse content from multiple URLs. We scrape the text from the given list of Reddit URLs and store it in all_texts.
# Import necessary libraries
import requests
from bs4 import BeautifulSoup
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import OllamaEmbeddings
from langchain.vectorstores import FAISS
from langchain.chains import RetrievalQA
from langchain.prompts import PromptTemplate
from langchain.llms import Ollama

# List of URLs to scrape
urls = [
    'https://www.reddit.com/r/diabetes/comments/1broigp/what_are_your_biggest_frustrations_with_diabetes/',
    'https://www.reddit.com/r/diabetes_t2/comments/156znkx/whats_the_most_challenging_part_about_dealing/',
    'https://www.reddit.com/r/diabetes/comments/qcsgji/what_is_the_hardest_part_about_managing_diabetes/',
    'https://www.reddit.com/r/diabetes_t1/comments/1hdlipr/diabetes_and_pain/',
    'https://www.reddit.com/r/diabetes/comments/ww6mrj/what_does_diabetic_nerve_pain_feel_like/',
    'https://www.reddit.com/r/AskReddit/comments/avl1x0/diabetics_of_reddit_what_is_your_experience_of/',
    'https://www.reddit.com/r/diabetes_t2/comments/1jggxi9/my_fathers_sugar_levels_are_not_dropping/',
    'https://www.reddit.com/r/diabetes_t2/comments/1jglmie/shaky_feeling/',
    'https://www.reddit.com/r/diabetes_t2/comments/1jgccvo/rant_from_a_depressedeating_disordered_diabetic/'
]

# Initialize text storage
all_texts = []

# Step 1: Fetch and process content from multiple URLs
for url in urls:
    response = requests.get(url)
    soup = BeautifulSoup(response.text, 'html.parser')

    # Extract text from <p> tags
    text = ' '.join([para.get_text() for para in soup.find_all('p')])
    if text:  # Store only if text is found
        all_texts.append(text)
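One caveat worth noting: Reddit often throttles or blocks requests that use the default requests User-Agent, so the loop above can come back with empty pages. A small, hedged adjustment (not part of the original notebook, and the header string is only illustrative) is to send a descriptive User-Agent and a timeout:

# Hypothetical tweak: a descriptive User-Agent and a timeout tend to make
# Reddit requests more reliable than the library default.
headers = {'User-Agent': 'diabetes-rag-demo/0.1 (educational example)'}
response = requests.get(urls[0], headers=headers, timeout=10)
print(response.status_code)  # 200 means the page was fetched successfully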
Step 2: We split the scraped text into many manageable chunks for cleaner processing. This also helps reduce memory usage and improve retrieval performance.
# Step 2: Split all content into chunks
text_splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)
all_chunks = []
for text in all_texts:
    all_chunks.extend(text_splitter.split_text(text))
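As a quick sanity check (my addition, not part of the original notebook), you can count the chunks and preview one of them to confirm the splitter produced sensible pieces:

# Inspect the chunking result: how many chunks, and what one looks like
print(f"Number of chunks: {len(all_chunks)}")
print(all_chunks[0][:200])  # first 200 characters of the first chunk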
Step 3: We initialize the Ollama embeddings. An embedding is a numeric (vector) representation of raw text that captures its semantic meaning. Embeddings allow various ML models and pipelines to understand and compare text effectively. The OllamaEmbeddings class uses the llama2 model to generate these embeddings.
# Step 3: Initialize Ollama embeddings
embeddings = OllamaEmbeddings(model="llama2")  # Adjust model name if needed
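To get a feel for what an embedding actually is, you can embed a sample sentence with LangChain's standard embed_query method and look at the resulting vector; the exact dimensionality depends on the model Ollama is serving (this check is my addition, not part of the original notebook):

# Embed a sample question and inspect the resulting vector
sample_vector = embeddings.embed_query("What is the hardest part about managing diabetes?")
print(f"Embedding dimension: {len(sample_vector)}")
print(sample_vector[:5])  # first few values of the vector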
Step 4: Facebook AI Similarity Search (FAISS) is a library for efficient similarity search and clustering of high-dimensional vectors. The from_texts function converts the text chunks into vectors (using the embeddings from Step 3) and stores them in a FAISS vector store. FAISS helps you find the chunks most similar to your query by comparing vector distances (cosine, Euclidean) in a highly optimized way.
# Step 4: Create a FAISS vector store using all chunks
vector_store = FAISS.from_texts(all_chunks, embeddings)
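To see the distance-based ranking described above, LangChain's FAISS wrapper also exposes similarity_search_with_score, which returns each matching chunk together with its distance (lower generally means more similar). A small sketch of how that could be used (my addition):

# Retrieve the top 3 chunks together with their distance scores
results = vector_store.similarity_search_with_score(
    "What is the hardest part about managing diabetes?", k=3
)
for doc, score in results:
    print(f"score={score:.4f} | {doc.page_content[:100]}")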
Step 5: We initialize the Ollama LLM, which will generate answers or responses to questions based on the knowledge stored in the FAISS vector store from Step 4.
# Step 5: Initialize the Ollama LLM
llm = Ollama(model="llama2", temperature=0.3)
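A quick way to confirm the model is reachable (assuming the Ollama server is running locally and the llama2 model has been pulled) is to call the LLM directly, the same way the later steps do:

# Sanity check: the local Ollama server should return a short answer
print(llm("In one sentence, what is diabetes?"))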
Together, Steps 3-5 enable RAG: the LLM can draw on the knowledge in the chunks stored in the vector store and use the retrieved context to answer user queries appropriately.
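The next steps wire this retrieval flow together by hand, which keeps the fallback logic explicit. As a side note, the RetrievalQA chain imported in Step 1 (but not otherwise used) offers a more compact way to get a similar question-answering loop; here is a minimal sketch, assuming the vector_store and llm defined above, without the custom fallback behavior:

# Alternative sketch: LangChain's built-in RetrievalQA chain instead of the
# manual prompt construction in Step 6 (no custom fallback logic here).
qa_chain = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="stuff",  # stuff all retrieved chunks into a single prompt
    retriever=vector_store.as_retriever(search_kwargs={"k": 3}),
    return_source_documents=True,
)
result = qa_chain({"query": "What is the hardest part about managing diabetes?"})
print(result["result"])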
Step 6: Here we define the ask_question_with_fallback function to query the stored knowledge and answer the user's question. If no relevant documents can be retrieved, or the similarity scores are too low, it falls back to the underlying general knowledge of the LLM (Ollama here).
# Step 6: Create the question-answering function
def ask_question_with_fallback(query):
    # Retrieve relevant documents
    docs = vector_store.similarity_search(query, k=3)

    for doc in docs:
        print(f"Retrieved doc: {doc.page_content[:200]}")

    # If no relevant documents or low similarity, use general knowledge
    #if not docs or all(doc.metadata.get('score', 1.0) < 0.3 for doc in docs):
    #    return use_general_knowledge(query)
    if not docs:
        return use_general_knowledge(query)

    # Format retrieved documents as context
    context = "\n\n".join([doc.page_content for doc in docs])

    # Construct RAG prompt
    rag_prompt = f"""
    Use the following pieces of context to answer the question at the end.
    If you don't know the answer based on this context, respond with "NO_ANSWER_FOUND".

    Context:
    {context}

    Question: {query}

    Provide a direct and concise answer to the question based only on the context above:
    """

    rag_answer = llm(rag_prompt)

    # Check for fallback trigger
    if "NO_ANSWER_FOUND" in rag_answer or "don't know" in rag_answer.lower() or "cannot find" in rag_answer.lower():
        return use_general_knowledge(query)

    return {
        "answer": rag_answer,
        "source": "URL content",
        "source_documents": docs
    }
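One note on the commented-out score check above: similarity_search does not populate doc.metadata['score'], so that condition would never fire as written. If a relevance threshold is wanted, one possible approach (my assumption, not the original notebook) is to use similarity_search_with_score and treat chunks beyond a chosen distance as not relevant enough; ask_question_with_fallback could then call use_general_knowledge whenever this helper returns an empty list:

# Hypothetical relevance filter: FAISS returns an L2 distance per result
# (lower = more similar), so chunks farther than max_distance are dropped.
# The threshold value is illustrative and would need tuning for your embeddings.
def retrieve_relevant_chunks(query, k=3, max_distance=1.0):
    scored_docs = vector_store.similarity_search_with_score(query, k=k)
    return [doc for doc, distance in scored_docs if distance <= max_distance]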
Step 7: This is the fallback function. If no relevant documents could be retrieved in Step 6, the LLM uses its general knowledge to answer the user's query.
# Step 7: Define fallback general knowledge function
def use_general_knowledge(query):
    general_prompt = f"""
    Answer this question using your general knowledge: {query}

    Provide a direct and helpful response. If you don't know, simply say so.
    """
    general_answer = llm(general_prompt)
    return {
        "answer": general_answer,
        "source": "General knowledge",
        "source_documents": []
    }
Step 8: This step shows an example of how to use this RAG setup. You give the model a question, and the LLM uses either the external knowledge or its internal knowledge to answer it.
# Step 8: Example usage
query = "What is the hardest part about managing diabetes?"  # Replace with your actual question
result = ask_question_with_fallback(query)

# Display results
print("Answer:")
print(result["answer"])
print(f"\nSource: {result['source']}")

if result["source_documents"]:
    print("\nSource Documents:")
    for i, doc in enumerate(result["source_documents"]):
        print(f"Source {i+1}:")
        print(doc.page_content[:200] + "...")  # Print first 200 chars of each source
        print()
My output from Step 8 is shown below. The model uses RAG, retrieving similar documents from the FAISS store and answering my question.
In conclusion, applying RAG to URLs is a powerful way to do knowledge-augmented retrieval for diabetes-related questions. By layering real-world insights from community platforms like Reddit on top of open LLMs, we can provide specific and accurate information - after all, who understands it better than those living with it every day?
This approach is not only effective but also fosters collaboration, ultimately improving support for people managing diabetes. As AI continues to evolve, its potential to improve healthcare and well-being remains vast.
Photo by Suzy Hazelwood: https://www.pexels.com/photo/close-up-photo-of-sugar-cubes-in-glass-jar-2523650/