
Basic Knowledge of Python Can Help You Build Your Own Machine Learning Model

By Chenuli J. · 13m · 2025/01/25

Too Long; Didn't Read

Let's build our own Machine Learning model with TensorFlow, a Python library.


Hello, good people! I hope 2025 is treating you well, even though it has been a bit buggy for me so far.


Welcome to the Doodles and Programming blog, and today we will be building: a Sentiment Analysis Model using TensorFlow + Python.



In this tutorial, we will also learn about the basics of Machine Learning with Python, and as mentioned before, we'll build our own Machine Learning model with TensorFlow, a Python library. This model will be able to detect the tone/sentiment of input text, by studying and learning from a given sample dataset.

Prerequisites

All we need is some knowledge of Python (the very basics, that is), and also, make sure you have Python installed on your system (Python 3.9+ is recommended).


However, if you find this tutorial hard to follow, don't worry! Shoot me an email or a message; I'll get back to you ASAP.


In Case You Don't Know: What Is Machine Learning?

In simple terms, Machine Learning (ML) makes a computer learn and make predictions by studying data and statistics. By studying the given data, the computer can identify and extract patterns and then make predictions based on them. Email spam detection, speech recognition, and traffic prediction are some real-life use cases of Machine Learning.


For a better example, imagine we want to teach a computer to recognize cats in photos. You would show it lots of pictures of cats and say, "Hey, computer, these are cats!" The computer looks at the pictures and starts to notice patterns, such as pointy ears, whiskers, and fur. After seeing enough examples, it can recognize a cat in a new photo it has never seen before.


Another such system we use every day is the email spam filter. The following image shows how it is done.


How ML models detect spam and ham emails by learning from sample datasets.


Why Use Python?

Although the Python programming language was not built specifically for ML or Data Science, it is considered the leading programming language for ML because of its flexibility. With hundreds of libraries available as free downloads, anyone can easily build ML models using a pre-built library, without needing to program the full process from scratch.


TensorFlow is one of these libraries, built by Google for Machine Learning and Artificial Intelligence. TensorFlow is often used by data scientists, data engineers, and other developers to build machine learning models easily, as it bundles machine learning and AI algorithms.


Visit the official TensorFlow website


Installation


To install TensorFlow, use the following command in your terminal:

 pip install tensorflow

And to install Pandas and NumPy,

 pip install pandas numpy

Please download the sample CSV file from this repository: GitHub Repository - TensorFlow ML Model

Understanding the Data

The #1 rule of data analysis, and of anything built on data: understand the data you have, first.


In this case, the dataset has two columns: Text and Sentiment. While the "text" column contains various statements made about movies, books, etc., the "sentiment" column shows whether the text is positive, neutral, or negative, using the numbers 1, 2, and 0 respectively.

Data Preparation


The next rule of thumb is to clean duplicates and remove null values from your sample data. But in this example, since the given dataset is small and contains no duplicates or invalid values, we can skip the data cleaning process.



To start building the model, we need to collect and prepare the dataset for training the sentiment analysis model. Pandas, a popular library for data analysis and manipulation, can be used for this task.

 import pandas as pd

 # Load data from CSV
 data = pd.read_csv('sentiment.csv')  # Change the path of the downloaded CSV file accordingly

 # Text data and labels
 texts = data['text'].tolist()
 labels = data['sentiment'].values

The code above converts the CSV file into a data frame, using the pandas.read_csv() function. Next, it assigns the values of the "text" column to a Python list using the tolist() function and creates a NumPy array with the "sentiment" values.


Why Use a NumPy Array?

As you might already know, NumPy is built for data manipulation. NumPy arrays handle numerical labels efficiently for machine learning tasks, offering flexibility in how the data is organized. That's why we use NumPy here.
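As a minimal illustration (with made-up labels, not the ones from the tutorial's CSV), converting a Python list of sentiment labels into a NumPy array looks like this:

```python
import numpy as np

# Hypothetical sentiment labels: 1 = positive, 2 = neutral, 0 = negative
labels = np.array([1, 0, 2, 1])

print(labels.dtype)  # a compact integer dtype (platform-dependent, e.g. int64)
print(labels.shape)  # (4,)
```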


Text Processing

After preparing the sample data, we need to process the text, which includes Tokenization.


Tokenization is the process of splitting each text sample into individual words or tokens, so that we can convert the raw text data into a format that can be processed by the model, allowing it to understand and learn from the individual words in the text samples.
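As a quick illustration, word-level tokenization of a single (made-up) sentence is just lowercasing it and splitting on whitespace:

```python
# A made-up sentence, not taken from the dataset
sentence = "The movie was GREAT"

# Lowercase, then split on whitespace: one token per word
tokens = sentence.lower().split()
print(tokens)  # ['the', 'movie', 'was', 'great']
```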


Take a look at the image below to see how tokenization works.


How tokenization works, and possible errors.


For this project, it is better to use manual tokenization instead of other pre-built tokenizers, as it gives us more control over the tokenization process, ensures compatibility with our specific data format, and allows for customized preprocessing steps.


Note: In manual tokenization, we write the code to split text into words ourselves, which is highly customizable to the project's requirements. However, other methods, such as the TensorFlow Keras Tokenizer, come with ready-made tools and functions for splitting text automatically, which are easier to implement but less customizable.


The following is the code snippet we can use to tokenize the sample data.

 word_index = {}
 sequences = []

 for text in texts:
     words = text.lower().split()
     sequence = []
     for word in words:
         if word not in word_index:
             word_index[word] = len(word_index) + 1
         sequence.append(word_index[word])
     sequences.append(sequence)

In the code above,

  • word_index : An empty dictionary created to store each unique word in the dataset, along with its index value.
  • sequences : An empty list that stores the sequence of numerical word representations for each text sample.
  • for text in texts : Loops over each text sample in the "texts" list (created earlier).
  • words = text.lower().split() : Converts each text sample to lowercase and splits it into individual words, based on whitespace.
  • for word in words : A nested loop that iterates over each word in the "words" list, which contains the tokenized words of the current text sample.
  • if word not in word_index : If the word is not yet in the index dictionary, it is added to it with a unique index, obtained by adding 1 to the current length of the dictionary.
  • sequence.append(word_index[word]) : After determining the current word's index, it is appended to the "sequence" list. This converts each word in the text sample into its corresponding index based on the "word_index" dictionary.
  • sequences.append(sequence) : After all the words in the text sample have been converted into numerical indices and collected in the "sequence" list, that list is appended to the "sequences" list.


In summary, the code above tokenizes the text data by converting each word into its numerical representation based on the word_index dictionary, which maps words to unique tokens. It creates a sequence of numerical representations for each text sample, which can be used as input data for the model.
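One step the tutorial relies on later but does not show explicitly is padding the training sequences: the model definition below expects a max_length value, and model.fit() is called with a padded_sequences array. A minimal sketch of how those could be prepared (using hypothetical tokenized sequences in place of the real ones) might look like this:

```python
import numpy as np

# Hypothetical tokenized sequences, standing in for the output of the step above
sequences = [[1, 2, 3], [4, 2], [5, 6, 7, 8]]

# Length of the longest sequence in the training data
max_length = max(len(seq) for seq in sequences)

# Pad every sequence with zeros up to max_length
padded_sequences = np.array(
    [seq + [0] * (max_length - len(seq)) for seq in sequences]
)

print(padded_sequences)
# [[1 2 3 0]
#  [4 2 0 0]
#  [5 6 7 8]]
```

This mirrors the padding technique the tutorial applies to the test sequences later on.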

Model Architecture

A model's architecture is the arrangement of layers, components, and connections that determines how data flows through it. The model architecture has a significant effect on the model's training speed, performance, and ability to generalize.


After processing the input data, we can define the model architecture as in the example below:

 import tensorflow as tf  # needed here; not imported earlier in the article

 model = tf.keras.Sequential([
     tf.keras.layers.Embedding(len(word_index) + 1, 16, input_length=max_length),
     tf.keras.layers.LSTM(64),
     tf.keras.layers.Dense(3, activation='softmax')
 ])


In the code above, we use TensorFlow Keras, a high-level neural network API built for fast experimentation and prototyping of Deep Learning models, which simplifies the process of building and compiling machine learning models.


  • tf.keras.Sequential() : Defines a sequential model, which is a linear stack of layers. Data flows from the first layer to the last, in order.
  • tf.keras.layers.Embedding(len(word_index) + 1, 16, input_length=max_length) : This layer is used for word embedding, which converts words into dense vectors of a fixed size. len(word_index) + 1 specifies the vocabulary size, 16 is the dimensionality of the embeddings, and input_length=max_length sets the length of each input sequence.
  • tf.keras.layers.LSTM(64) : This is a Long Short-Term Memory (LSTM) layer, a type of recurrent neural network (RNN). It processes the sequence of word embeddings and can "remember" important patterns or dependencies in the data. It has 64 units, which determine the dimensionality of the output space.
  • tf.keras.layers.Dense(3, activation='softmax') : This is a densely connected layer with 3 units and a softmax activation function. It is the output layer of the model, producing a probability distribution over the three possible classes (assuming a multi-class classification problem).


The final line of the code is responsible for the process shown above.

Compiling

In Machine Learning with TensorFlow, compiling refers to the process of configuring the model for training by specifying three key components: the Loss Function, the Optimizer, and the Metrics.


  1. Loss Function : Measures the error between the model's predictions and the actual targets, guiding how its parameters are adjusted.

  2. Optimizer : Adjusts the model's parameters to minimize the loss function, enabling effective learning.

  3. Metrics : Provide performance evaluation beyond the loss, such as accuracy or precision, assisting in model assessment.


The code below can be used to compile the Sentiment Analysis Model:

 model.compile(loss='sparse_categorical_crossentropy',
               optimizer='adam',
               metrics=['accuracy'])

Here,

  • loss='sparse_categorical_crossentropy' : This loss function is commonly used for classification tasks where the target labels are integers and the model's output is a probability distribution over multiple classes. It measures the difference between the true labels and the predictions, with the goal of minimizing it during training.

  • optimizer='adam' : Adam is an optimization algorithm that adapts the learning rate during training. It is widely used in practice because of its efficiency, robustness, and effectiveness across a wide range of tasks compared to other optimizers.

  • metrics=['accuracy'] : Accuracy is a common metric often used to evaluate classification models. It provides a straightforward measure of the model's overall performance on the task, as the percentage of samples where the model's prediction matches the true label.
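The sparse categorical cross-entropy described above can be made concrete with a small NumPy sketch. For a single sample, it is the negative log of the probability the model assigned to the true class (the numbers here are made up for illustration):

```python
import numpy as np

# Hypothetical model output: probabilities for classes 0, 1, 2
predicted_probs = np.array([0.1, 0.7, 0.2])
true_label = 1  # the correct class for this sample

# Sparse categorical cross-entropy for this one sample:
# the negative log-probability assigned to the true class
loss = -np.log(predicted_probs[true_label])
print(round(loss, 4))  # 0.3567
```

The more confidently the model predicts the true class (probability closer to 1), the closer this loss gets to 0.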



Training the Model

Now that the input data is processed and ready, and the model architecture is also defined, we can train the model using the model.fit() method.


 model.fit(padded_sequences, labels, epochs=15, verbose=1)
  • padded_sequences : The input data for training the model, consisting of sequences of equal dimensions (padding will be discussed later in the tutorial).

  • labels : The target labels corresponding to the input data (i.e., the sentiment categories assigned to each text sample).

  • epochs=15 : An epoch is one complete pass over the entire training dataset during the training process. Accordingly, in this process, the model iterates over the full dataset 15 times during training.


When the number of epochs is increased, the model may improve its performance, as it learns more complex patterns from the data samples. However, if too many epochs are used, the model can memorize the training data (known as "overfitting"), leading to poor generalization to new data. The time taken for training also increases as the number of epochs grows, and vice versa.


(Image: "What is epoch and how to choose the correct number of epochs", by Upendra Vijay, on Medium)


  • verbose=1 : This parameter controls how much output the fit method produces during training. A value of 1 means progress bars will be shown in the console as the model trains, 0 means no output, and 2 means one line per epoch. Since it will be useful to see the accuracy and loss values along with the time taken for each epoch, we'll set it to 1.


Making Predictions

After compiling and training the model, it can make predictions based on our sample data, simply by using the predict() function. However, we need to provide input data to test the model and get results. To do so, we need to input some text statements and then ask the model to predict the sentiment of the input data.


 test_texts = ["The price was too high for the quality",
               "The interface is user-friendly",
               "I'm satisfied"]
 test_sequences = []

 for text in test_texts:
     words = text.lower().split()
     sequence = []
     for word in words:
         if word in word_index:
             sequence.append(word_index[word])
     test_sequences.append(sequence)

Here, test_texts stores the input data, while the test_sequences list is used to store the tokenized test data, which are the words split on whitespace after being converted to lowercase. Even so, test_sequences still can't serve as the model's input data.


The reason is that most deep learning frameworks, including TensorFlow, typically require the input data to have uniform dimensions (meaning the length of every sequence must be equal), in order to process batches of data efficiently. To achieve this, you can use techniques like padding, where sequences are extended to match the length of the longest sequence in the dataset, using a special token such as # or 0 (0, in this case).

 import numpy as np

 padded_test_sequences = []
 for sequence in test_sequences:
     padded_sequence = sequence[:max_length] + [0] * (max_length - len(sequence))
     padded_test_sequences.append(padded_sequence)

 # Convert to numpy array
 padded_test_sequences = np.array(padded_test_sequences)

In the given code,

  • padded_test_sequences : An empty list to store the padded sequences that will be used to test the model.
  • for sequence in test_sequences : Loops over each sequence in the "test_sequences" list.
  • padded_sequence : Creates a new padded sequence from each sequence, truncating the original sequence to the first max_length elements to ensure consistency. Then we pad the sequence with zeros to match max_length if it is shorter, effectively making every sequence the same length.
  • padded_test_sequences.append() : Appends the padded sequence to the list that will be used for testing.
  • padded_test_sequences = np.array() : Converts the list of padded sequences into a NumPy array.


Now, since the input data is ready to be used, the model can proceed to predict the sentiment of the input texts.

 predictions = model.predict(padded_test_sequences)

 # Print predicted sentiments
 for i, text in enumerate(test_texts):
     print(f"Text: {text}, Predicted Sentiment: {np.argmax(predictions[i])}")

In the code above, the model.predict() method makes a prediction for each test sequence, producing an array of predicted probabilities for each sentiment class. It then iterates over each item in test_texts, and np.argmax(predictions[i]) returns the index of the highest value in the predicted probability array for the i-th test sample. This index corresponds to the sentiment class with the highest predicted probability for each test sample, meaning the best prediction made is extracted and displayed as the main result.


Side note: np.argmax() is a NumPy function that finds the index of the maximum value in an array. In this context, np.argmax(predictions[i]) helps determine the sentiment class with the highest predicted probability for each test sample.
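A small, self-contained illustration of this step (with made-up probabilities, and the label mapping from the dataset section: 0 = negative, 1 = positive, 2 = neutral):

```python
import numpy as np

# Label mapping from the dataset section: 0 = negative, 1 = positive, 2 = neutral
sentiment_names = {0: "negative", 1: "positive", 2: "neutral"}

# Hypothetical predicted probabilities for one test sample
probs = np.array([0.15, 0.75, 0.10])

# Index of the highest probability = the predicted sentiment class
predicted_class = int(np.argmax(probs))
print(predicted_class, sentiment_names[predicted_class])  # 1 positive
```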


The program is now ready to run. After compiling and training the model, our Machine Learning Model will print its predictions for the input data.



In the model's output, we can see values such as "Accuracy" and "Loss" for each epoch. As mentioned before, accuracy is the percentage of correct predictions out of total predictions. Higher accuracy is better. If the accuracy is 1.0, i.e., 100%, the model made correct predictions in every case. Similarly, 0.5 means the model made correct predictions half the time, 0.25 means correct predictions a quarter of the time, and so on.
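As a worked example with made-up predicted classes and true labels, accuracy is simply the fraction of matches:

```python
# Hypothetical predicted classes vs. true labels for 4 samples
predicted = [1, 0, 2, 1]
actual    = [1, 0, 1, 1]

# Count the positions where prediction and label agree
correct = sum(p == a for p, a in zip(predicted, actual))
accuracy = correct / len(actual)
print(accuracy)  # 0.75, i.e. 3 correct out of 4
```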


Loss, on the other hand, shows how badly the model's predictions deviate from the true values. A lower loss value means a better model with fewer errors, with a value of 0 being a perfect loss value, as that means no mistakes were made.


However, we can't determine the model's overall accuracy and loss from the per-epoch figures shown above. To do so, we can evaluate the model using the evaluate() method and print its accuracy and loss.

 evaluation = model.evaluate(padded_sequences, labels, verbose=0)

 # Extract loss and accuracy
 loss = evaluation[0]
 accuracy = evaluation[1]

 # Print loss and accuracy
 print("Loss:", loss)
 print("Accuracy:", accuracy)

Output:

 Loss: 0.6483516097068787
 Accuracy: 0.7058823704719543

Accordingly, in this example, the Loss value is 0.6483, which means the model made some mistakes. The model's accuracy is about 70%, which means the predictions made by the model are correct more than half the time. Overall, this model can be considered a "moderately good" model; however, please note that what counts as "good" loss and accuracy values depends heavily on the type of model, the size of the dataset, and the purpose of the particular Machine Learning model.


And yes, we can and should improve the above metrics of the model with fine-tuning processes and better datasets. However, for the sake of this tutorial, let's stop here. If you would like a part two of this tutorial, please let me know!

Summary

In this tutorial, we built a TensorFlow Machine Learning Model that can predict the sentiment of a given text, after analyzing the sample dataset.


The complete code and the sample CSV file can be downloaded and viewed in the GitHub repository: GitHub - Buzzpy/Tensorflow-ML-Model