




Prompt Engineering
February 2025

Contents
  Prompt engineering
  LLM output configuration
    Output length
    Sampling controls
    Temperature
    Top-K and top-P
    Putting it all together
  Prompting techniques
    General prompting / zero shot
    One-shot & few-shot
    System, contextual and role prompting
      System prompting
      Role prompting
      Contextual prompting
    Step-back prompting
    Chain of Thought (CoT)
    ReAct (reason & act)
    Automatic Prompt Engineering
    Code prompting
      Prompts for writing code
      Prompts for explaining code
      Prompts for translating code
      Prompts for debugging and reviewing code
  Best Practices
    Provide examples
    Design with simplicity
    Be specific about the output
    Use Instructions over Constraints
    Control the max token length
    Use variables in prompts
    Experiment with input formats and writing styles
    For few-shot prompting with classification tasks, mix up the classes
    Adapt to model updates
    Experiment with output formats
    Working with Schemas
    Experiment together with other prompt engineers
    CoT Best practices
    Document the various prompt attempts

You don't need to be a data scientist or a machine learning engineer – everyone can write a prompt.
When thinking about a large language model's input and output, a text prompt (sometimes accompanied by other modalities such as image prompts) is the input the model uses to predict a specific output. You don't need to be a data scientist or a machine learning engineer – everyone can write a prompt. However, crafting the most effective prompt can be complicated. Many aspects of your prompt affect its efficacy: the model you use, the model's training data, the model configurations, your word choice, style and tone, structure, and context all matter. Therefore, prompt engineering is an iterative process. Inadequate prompts can lead to ambiguous, inaccurate responses and can hinder the model's ability to provide meaningful output.
When you chat with the Gemini chatbot,1 you basically write prompts; however, this whitepaper focuses on writing prompts for the Gemini model within Vertex AI or by using the API, because by prompting the model directly you will have access to configuration options such as temperature.

This whitepaper discusses prompt engineering in detail. We will look into the various prompting techniques to help you get started and share tips and best practices to become a prompting expert. We will also discuss some of the challenges you can face while crafting prompts.

Prompt engineering

Remember how an LLM works: it is a prediction engine. The model takes sequential text as an input and then predicts what the following token should be, based on the data it was trained on. The LLM is operationalized to do this over and over again, adding the previously predicted token to the end of the sequential text to predict the following token. The next token prediction is based on the relationship between what is in the previous tokens and what the LLM has seen during its training.

When you write a prompt, you are attempting to set up the LLM to predict the right sequence of tokens. Prompt engineering is the process of designing high-quality prompts that guide LLMs to produce accurate outputs. This process involves tinkering to find the best prompt, optimizing prompt length, and evaluating a prompt's writing style and structure in relation to the task. In the context of natural language processing and LLMs, a prompt is an input provided to the model to generate a response or prediction.
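To make the prediction-engine framing concrete, here is a minimal conceptual sketch of that loop in Python. The function predict_next_token and the value end_token are hypothetical placeholders standing in for the model itself; they are not part of any real SDK.

    # Conceptual sketch of the prediction loop described above. predict_next_token()
    # and end_token are hypothetical placeholders, not part of any real SDK.
    def generate(prompt_tokens, predict_next_token, end_token, max_tokens=256):
        """Repeatedly predict a token, append it, and feed the longer sequence back in."""
        sequence = list(prompt_tokens)
        for _ in range(max_tokens):
            next_token = predict_next_token(sequence)   # based on what the model saw in training
            if next_token == end_token:
                break
            sequence.append(next_token)                 # the prediction becomes part of the next input
        return sequence[len(prompt_tokens):]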
These prompts can be used to achieve various kinds of understanding and generation tasks such as text summarization, information extraction, question answering, text classification, language or code translation, code generation, and code documentation or reasoning. Please feel free to refer to Google's prompting guides2,3 for simple and effective prompting examples.

When prompt engineering, you will start by choosing a model. Prompts might need to be optimized for your specific model, regardless of whether you use Gemini language models in Vertex AI, GPT, Claude, or an open-source model like Gemma or LLaMA. Besides the prompt, you will also need to tinker with the various configurations of an LLM.

LLM output configuration

Once you choose your model you will need to figure out the model configuration. Most LLMs come with various configuration options that control the LLM's output. Effective prompt engineering requires setting these configurations optimally for your task.

Output length

An important configuration setting is the number of tokens to generate in a response. Generating more tokens requires more computation from the LLM, leading to higher energy consumption, potentially slower response times, and higher costs.
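As a rough illustration of where this limit is set in code, the sketch below assumes the Vertex AI Python SDK; the exact class, model, and parameter names (GenerativeModel, GenerationConfig, max_output_tokens) can differ between SDK versions and models.

    # Sketch only: assumes the Vertex AI Python SDK; class, model and parameter names
    # may differ by SDK version.
    import vertexai
    from vertexai.generative_models import GenerativeModel, GenerationConfig

    vertexai.init(project="your-project-id", location="us-central1")  # placeholder project

    model = GenerativeModel("gemini-pro")
    response = model.generate_content(
        "Summarize the plot of the movie 'Her' in one sentence.",
        # Caps how many tokens the model may generate; it does not make the text more succinct.
        generation_config=GenerationConfig(max_output_tokens=64),
    )
    print(response.text)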
Reducing the output length of the LLM doesn't cause the LLM to become more stylistically or textually succinct in the output it creates; it just causes the LLM to stop predicting more tokens once the limit is reached. If your needs require a short output length, you'll also possibly need to engineer your prompt to accommodate. Output length restriction is especially important for some LLM prompting techniques, like ReAct, where the LLM will keep emitting useless tokens after the response you want. Be aware that generating more tokens requires more computation from the LLM, leading to higher energy consumption and potentially slower response times, which leads to higher costs.

Sampling controls

LLMs do not formally predict a single token. Rather, LLMs predict probabilities for what the next token could be, with each token in the LLM's vocabulary getting a probability. Those token probabilities are then sampled to determine what the next produced token will be. Temperature, top-K, and top-P are the most common configuration settings that determine how predicted token probabilities are processed to choose a single output token.

Temperature

Temperature controls the degree of randomness in token selection. Lower temperatures are good for prompts that expect a more deterministic response, while higher temperatures can lead to more diverse or unexpected results. A temperature of 0 (greedy decoding) is deterministic: the highest probability token is always selected (though note that if two tokens have the same highest predicted probability, depending on how tiebreaking is implemented you may not always get the same output with temperature 0). Temperatures close to the max tend to create more random output. And as temperature gets higher and higher, all tokens become equally likely to be the next predicted token.

The Gemini temperature control can be understood in a similar way to the softmax function used in machine learning. A low temperature setting mirrors a low softmax temperature (T), emphasizing a single, preferred temperature with high certainty. A higher Gemini temperature setting is like a high softmax temperature, making a wider range of temperatures around the selected setting more acceptable. This increased uncertainty accommodates scenarios where a rigid, precise temperature may not be essential, for example when experimenting with creative outputs.

Top-K and top-P

Top-K and top-P (also known as nucleus sampling)4 are two sampling settings used in LLMs to restrict the predicted next token to come from tokens with the top predicted probabilities. Like temperature, these sampling settings control the randomness and diversity of generated text.

Top-K sampling selects the top K most likely tokens from the model's predicted distribution. The higher top-K, the more creative and varied the model's output; the lower top-K, the more restrictive and factual the model's output. A top-K of 1 is equivalent to greedy decoding.

Top-P sampling selects the top tokens whose cumulative probability does not exceed a certain value (P). Values for P range from 0 (greedy decoding) to 1 (all tokens in the LLM's vocabulary). The best way to choose between top-K and top-P is to experiment with both methods (or both together) and see which one produces the results you are looking for.

Putting it all together

Choosing between top-K, top-P, temperature, and the number of tokens to generate depends on the specific application and desired outcome, and the settings all impact one another. It's also important to make sure you understand how your chosen model combines the different sampling settings together.

If temperature, top-K, and top-P are all available (as in Vertex Studio), tokens that meet both the top-K and top-P criteria are candidates for the next predicted token, and then temperature is applied to sample from the tokens that passed the top-K and top-P criteria. If only top-K or top-P is available, the behavior is the same, but only the one top-K or top-P setting is used. If temperature is not available, whatever tokens meet the top-K and/or top-P criteria are then randomly selected from to produce a single next predicted token.

At extreme settings of one sampling configuration value, that one sampling setting either cancels out other configuration settings or becomes irrelevant.
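The following simplified Python sketch illustrates that combination order: top-K and top-P first restrict the candidate tokens, then temperature reweights the survivors before one token is sampled. Real decoders work on logits and differ in detail; this is only meant to make the interplay tangible.

    import random

    def sample_next_token(probs, temperature=1.0, top_k=None, top_p=None):
        """Illustrative sketch: filter candidates with top-K and top-P, then apply
        temperature to the surviving tokens and sample one. Not a real decoder."""
        # probs: dict mapping token -> predicted probability, e.g. from the model's last step.
        candidates = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)

        if top_k is not None:                      # keep only the K most likely tokens
            candidates = candidates[:top_k]

        if top_p is not None:                      # keep tokens inside cumulative probability P
            kept, cumulative = [], 0.0
            for token, p in candidates:
                kept.append((token, p))
                cumulative += p
                if cumulative >= top_p:
                    break
            candidates = kept

        if temperature == 0:                       # greedy decoding: always the top token
            return candidates[0][0]

        # Temperature rescales the distribution: low T sharpens it, high T flattens it.
        weights = [p ** (1.0 / temperature) for _, p in candidates]
        return random.choices([t for t, _ in candidates], weights=weights, k=1)[0]

    # Example with a toy next-token distribution.
    probs = {"cat": 0.5, "dog": 0.3, "bird": 0.15, "fish": 0.05}
    print(sample_next_token(probs, temperature=0.2, top_k=30, top_p=0.95))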
If you set temperature to 0, top-K and top-P become irrelevant – the most probable token becomes the next token predicted. If you set temperature extremely high (above 1 – generally into the 10s), temperature becomes irrelevant and whatever tokens make it through the top-K and/or top-P criteria are then randomly sampled to choose a next predicted token.

If you set top-K to 1, temperature and top-P become irrelevant. Only one token passes the top-K criteria, and that token is the next predicted token. If you set top-K extremely high, like to the size of the LLM's vocabulary, any token with a nonzero probability of being the next token will meet the top-K criteria and none are selected out.

If you set top-P to 0 (or a very small value), most LLM sampling implementations will then only consider the most probable token to meet the top-P criteria, making temperature and top-K irrelevant. If you set top-P to 1, any token with a nonzero probability of being the next token will meet the top-P criteria, and none are selected out.

As a general starting point, a temperature of .2, top-P of .95, and top-K of 30 will give you relatively coherent results that can be creative but not excessively so. If you want especially creative results, try starting with a temperature of .9, top-P of .99, and top-K of 40; if you want less creative results, try starting with a temperature of .1, top-P of .9, and top-K of 20. Finally, if your task always has a single correct answer (e.g., answering a math problem), start with a temperature of 0.

NOTE: With more freedom (higher temperature, top-K, top-P, and output tokens), the LLM might generate text that is less relevant.

WARNING: Have you ever seen a response ending with a large amount of filler words? This is also known as the "repetition loop bug", a common issue in large language models where the model gets stuck in a cycle, repeatedly generating the same (filler) word, phrase, or sentence structure, often exacerbated by inappropriate temperature and top-K/top-P settings. This can occur at both low and high temperature settings, though for different reasons. At low temperatures, the model becomes overly deterministic, sticking rigidly to the highest probability path, which can lead to a loop if that path revisits previously generated text. Conversely, at high temperatures, the model's output becomes excessively random, increasing the probability that a randomly chosen word or phrase will, by chance, lead back to a prior state, creating a loop due to the vast number of available options. In both cases, the model's sampling process gets "stuck," resulting in monotonous and unhelpful output until the output window is filled. Solving this often requires careful tinkering with temperature and top-K/top-P values to find the right balance between determinism and randomness.

Prompting techniques

LLMs are tuned to follow instructions and are trained on large amounts of data, so they can understand a prompt and generate an answer. But LLMs aren't perfect; the clearer your prompt text, the better it is for the LLM to predict the next likely text. Additionally, specific techniques that take advantage of how LLMs are trained and how LLMs work will help you get relevant results from LLMs.

Now that we understand what prompt engineering is and what it takes, let's dive into some examples of the most important prompting techniques.

General prompting / zero shot

A zero-shot5 prompt is the simplest type of prompt. It only provides a description of a task and some text for the LLM to get started with. This input could be anything: a question, the start of a story, or instructions. The name zero-shot stands for 'no examples'.
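Zero-shot prompts can also be sent programmatically rather than through a UI. The sketch below again assumes the Vertex AI Python SDK (and that vertexai.init() has been called as shown earlier); the review text is an illustrative placeholder.

    # Sketch: sending a zero-shot prompt via the Vertex AI Python SDK (names may vary by version).
    from vertexai.generative_models import GenerativeModel, GenerationConfig

    prompt = (
        "Classify movie reviews as POSITIVE, NEUTRAL or NEGATIVE.\n"
        "Review: I loved every minute of this film.\n"
        "Sentiment:"
    )

    model = GenerativeModel("gemini-pro")
    response = model.generate_content(
        prompt,
        generation_config=GenerationConfig(temperature=0.1),  # low temperature: no creativity needed
    )
    print(response.text)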
Let's use Vertex AI Studio (for Language) in Vertex AI,6 which provides a playground to test prompts. In Table 1, you will see an example zero-shot prompt to classify movie reviews.

The table format as used below is a great way of documenting prompts. Your prompts will likely go through many iterations before they end up in a codebase, so it's important to keep track of your prompt engineering work in a disciplined, structured way. More on this table format, the importance of tracking prompt engineering work, and the prompt development process can be found in the Best Practices section later in this chapter ("Document the various prompt attempts").

The model temperature should be set to a low number, since no creativity is needed, and we use the gemini-pro default top-K and top-P values, which effectively disable both settings (see 'LLM Output Configuration' above). Pay attention to the generated output. The words "disturbing" and "masterpiece" should make the prediction a little more complicated, as both words are used in the same sentence.
  Goal    Classify movie reviews as positive, neutral or negative
  Model   gemini-pro
  Prompt  Classify movie reviews as POSITIVE, NEUTRAL or NEGATIVE.
          Review: "Her" is a disturbing study revealing the direction humanity is headed
          if AI is allowed to keep evolving, unchecked. I wish there were more movies
          like this masterpiece.

Table 1. An example of zero-shot prompting

When zero-shot doesn't work, you can provide demonstrations or examples in the prompt, which leads to "one-shot" and "few-shot" prompting.

One-shot & few-shot

When creating prompts for AI models, it is helpful to provide examples. These examples can help the model understand what you are asking for. Examples are especially useful when you want to steer the model to a certain output structure or pattern.

A one-shot prompt provides a single example, hence the name one-shot. The idea is that the model has an example it can imitate to best complete the task.

A few-shot prompt7 provides multiple examples to the model. This approach shows the model a pattern that it needs to follow. The idea is similar to one-shot, but multiple examples of the desired pattern increase the chance that the model follows the pattern.
The number of examples you need for few-shot prompting depends on a few factors, including the complexity of the task, the quality of the examples, and the capabilities of the generative AI (gen AI) model you are using. As a general rule of thumb, you should use at least three to five examples for few-shot prompting. However, you may need to use more examples for more complex tasks, or you may need to use fewer due to the input length limitation of your model.

Table 2 shows a few-shot prompt example. Let's use the same gemini-pro model configuration settings as before, other than increasing the token limit to accommodate the need for a longer response.
  Goal    Parse pizza orders to JSON
  Model   gemini-pro
  Prompt  Parse a customer's pizza order into valid JSON:

          I want a small pizza with cheese, tomato sauce, and pepperoni.
          JSON Response:
          ...

          Can I get a large pizza with tomato sauce, basil and mozzarella.
          JSON Response:
          ...

          Now, I would like a large pizza, with the first half cheese and mozzarella.
          And the other tomato sauce, ham and pineapple.
          JSON Response:
  Output  ... "ham", "pineapple"]]

Table 2. An example of few-shot prompting

When you choose examples for your prompt, use examples that are relevant to the task you want to perform. The examples should be diverse, of high quality, and well written. One small mistake can confuse the model and result in undesired output.

If you are trying to generate output that is robust to a variety of inputs, then it is important to include edge cases in your examples. Edge cases are inputs that are unusual or unexpected, but that the model should still be able to handle.
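A practical pattern is to keep few-shot examples as data and assemble the prompt from them, which makes it easy to add, remove, or reorder examples while iterating. The sketch below follows the pizza-order task; the example JSON shapes are illustrative and are not the exact contents of Table 2.

    import json

    # Illustrative few-shot examples (not the exact Table 2 contents).
    EXAMPLES = [
        (
            "I want a small pizza with cheese, tomato sauce, and pepperoni.",
            {"size": "small", "ingredients": ["cheese", "tomato sauce", "pepperoni"]},
        ),
        (
            "Can I get a large pizza with tomato sauce, basil and mozzarella.",
            {"size": "large", "ingredients": ["tomato sauce", "basil", "mozzarella"]},
        ),
    ]

    def build_few_shot_prompt(order):
        """Assemble a few-shot prompt: task description, worked examples, then the new input."""
        parts = ["Parse a customer's pizza order into valid JSON:\n"]
        for text, parsed in EXAMPLES:
            parts.append(f"EXAMPLE: {text}\nJSON Response:\n{json.dumps(parsed, indent=2)}\n")
        parts.append(f"{order}\nJSON Response:\n")
        return "\n".join(parts)

    print(build_few_shot_prompt(
        "Now, I would like a large pizza, with the first half cheese and mozzarella. "
        "And the other tomato sauce, ham and pineapple."
    ))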
System, contextual and role prompting

System, contextual and role prompting are all techniques used to guide how LLMs generate text, but they focus on different aspects:

System prompting sets the overall context and purpose for the language model. It defines the 'big picture' of what the model should be doing, like translating a language, classifying a review, etc.

Contextual prompting provides specific details or background information relevant to the current conversation or task. It helps the model to understand the nuances of what's being asked and tailor the response accordingly.

Role prompting assigns a specific character or identity for the language model to adopt, so that its responses are consistent with the role's associated knowledge and behavior.

There can be considerable overlap between system, contextual, and role prompting. For example, a prompt that assigns a role to the system can also have a context. However, each type of prompt serves a slightly different primary purpose:

System prompt: defines the model's fundamental capabilities and overarching purpose.

Contextual prompt: provides immediate, task-specific information to guide the response. It is highly specific to the current task or input, which is dynamic.

Role prompt: frames the model's output style and voice. It adds a layer of specificity and personality.
Distinguishing between system, contextual, and role prompts provides a framework for designing prompts with clear intent, allowing for flexible combinations and making it easier to analyze how each prompt type influences the language model's output. Let's dive into these three different kinds of prompts.

System prompting

Table 3 contains a system prompt, where I specify additional information on how to return the output. I increased the temperature to get a higher creativity level, and I specified a higher token limit. However, because of my clear instruction on how to return the output, the model didn't return extra text.
  Goal    Classify movie reviews as positive, neutral or negative
  Model   gemini-pro
  Prompt  Classify movie reviews as positive, neutral or negative. Only return the label
          in uppercase.
          Review: "Her" is a disturbing study revealing the direction humanity is headed
          if AI is allowed to keep evolving, unchecked. It's so disturbing I couldn't
          watch it.

Table 3. An example of system prompting

System prompts can be useful for generating output that meets specific requirements. The name 'system prompt' actually stands for 'providing an additional task to the system'. For example, you could use a system prompt to generate a code snippet that is compatible with a specific programming language, or you could use a system prompt to return a certain structure. Have a look at Table 4, where I return the output in JSON format.
  Goal    Classify movie reviews as positive, neutral or negative, return valid JSON
  Model   gemini-pro
  Prompt  Classify movie reviews as positive, neutral or negative. Return valid JSON:
          Review: "Her" is a disturbing study revealing the direction humanity is headed
          if AI is allowed to keep evolving, unchecked. It's so disturbing I couldn't
          watch it.
          MOVIE:
          {
            ...
            "name": String
          }
          JSON Response:
  Output  {
            ...
            "name": "Her"
          }

Table 4. An example of system prompting with JSON format
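Because the prompt demands valid JSON, the response can be checked programmatically before it is used. The sketch below relies only on Python's standard json module; the required field names are an assumption about the schema used in the prompt.

    import json

    # Sketch: validate the JSON the model returns. The expected field names below
    # ("name", "sentiment") are assumptions about the schema used in the prompt.
    REQUIRED_FIELDS = {"name", "sentiment"}

    def parse_review_json(response_text):
        """Parse the model's reply; raise if it is not valid JSON or misses fields."""
        data = json.loads(response_text)          # raises json.JSONDecodeError on invalid JSON
        missing = REQUIRED_FIELDS - data.keys()
        if missing:
            raise ValueError(f"model output is missing fields: {sorted(missing)}")
        return data

    # Example with a hand-written reply string (not actual model output).
    print(parse_review_json('{"name": "Her", "sentiment": "NEGATIVE"}'))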
There are some benefits in returning JSON objects from a prompt that extracts data. In a real-world application I don't need to manually create this JSON format, I can return the data in a sorted order (very handy when working with datetime objects), but most importantly, by prompting for a JSON format it forces the model to create a structure and limit hallucinations.

System prompts can also be really useful for safety and toxicity. To control the output, simply add an additional line to your prompt like: 'You should be respectful in your answer.'

Role prompting

Role prompting is a technique in prompt engineering that involves assigning a specific role to the gen AI model. This can help the model to generate more relevant and informative output, as the model can craft its responses to the specific role that it has been assigned.

For example, you could role prompt a gen AI model to be a book editor, a kindergarten teacher, or a motivational speaker. Once the model has been assigned a role, you can then give it prompts that are specific to that role. For example, you could prompt a teacher to create a lesson plan that they can then review. Have a look at Table 5, where the model takes the role of a travel guide.
  Goal    Act as a travel guide and provide 3 travel suggestions
  Model   gemini-pro
  Prompt  I want you to act as a travel guide. I will write to you about my location and
          you will suggest 3 places to visit near me. In some cases, I will also give you
          the type of places I will visit.
          My suggestion: "I am in Amsterdam and I want to visit only museums."
  Output  1. Rijksmuseum: World-renowned museum, known for Dutch masterpieces like
             Rembrandt's "The Night Watch." Don't miss the incredible collection of
             paintings and artifacts showcasing Dutch art and history.
          2. Van Gogh Museum: Immerse yourself in the captivating works of Vincent van
             Gogh. The museum houses the largest collection of his paintings and
             drawings, including "The Starry Night" and "Sunflowers."
          3. Stedelijk Museum Amsterdam: Discover modern and contemporary art from around
             the world. The museum's collection includes works by Picasso, Kandinsky, and
             Rietveld, housed in a striking modern building.

Table 5. An example of role prompting
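In code, the role can simply be prepended to the prompt text; some SDK versions also accept a separate system instruction argument that serves the same purpose. A minimal sketch, again assuming the Vertex AI Python SDK:

    # Sketch: the role is prepended to the prompt text (some SDK versions also accept
    # a separate system_instruction argument, which serves the same purpose).
    from vertexai.generative_models import GenerativeModel

    role = (
        "I want you to act as a travel guide. I will write to you about my location "
        "and you will suggest 3 places to visit near me."
    )
    request = 'My suggestion: "I am in Amsterdam and I want to visit only museums."'

    response = GenerativeModel("gemini-pro").generate_content(f"{role}\n\n{request}")
    print(response.text)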
The example in Table 5 shows the model taking the role of a travel agent. If you changed the role to a teacher of geography, you would notice that you receive a different response.

Defining a role perspective for an AI model gives it a blueprint of the tone, style, and focused expertise you're looking for, to improve the quality, relevance, and effectiveness of your output.

Here are some styles you can choose from which I find effective: Confrontational, Descriptive, Direct, Formal, Humorous, Influential, Informal, Inspirational, Persuasive.

Let's change our prompt in Table 6 to include a humorous and inspirational style.
  Prompt  I want you to act as a travel guide. I will write to you about my location and
          you will suggest 3 places to visit near me in a humorous style.
          My suggestion: "I am in Manhattan."
  Output  Travel Suggestions:
          1. Behold the Empire State of Mind: Ascend to the dizzying heights of the
             Empire State Building and bask in the glory of Manhattan's skyline. Prepare
             to feel like King Kong atop the Big Apple, minus the giant ape-sized banana.
          2. Get Artsy-Fartsy at MoMA: Unleash your inner art aficionado at the Museum of
             Modern Art (MoMA). Gaze upon masterpieces that will boggle your mind and
             make you question whether your stick-figure drawings have any artistic merit.
          3. Shop 'Til You Drop on Fifth Avenue: Indulge in some retail therapy on the
             iconic Fifth Avenue. Brace yourself for sticker shock as you window-shop (or
             actually shop) at designer boutiques that will make your wallet cry. But
             hey, you're in Manhattan, so you might as well embrace the fabulousness!

Table 6. An example of role prompting with a humorous tone and style

Contextual prompting

By providing contextual prompts, you can help ensure that your AI interactions are as seamless and efficient as possible. The model will be able to more quickly understand your request and generate more accurate and relevant responses, as you can see in the example of Table 7.
  Goal    Suggest articles for a blog about retro games
  Model   gemini-pro
  Prompt  Context: You are writing for a blog about retro 80's arcade video games.
          Suggest 3 topics to write an article about with a few lines of description of
          what this article should contain.
  Output  1. **The Evolution of Arcade Cabinet Design:** This article would explore the
             evolution of arcade cabinet designs, from the early wood and metal cabinets
             of the 1970s to the sleek, neon-lit designs of the 1980s and beyond.
          2. **Blast From The Past: Iconic Arcade Games of The 80's:** This article would
             feature a glimpse into the history, impact, and legacy of some of the most
             iconic and influential arcade games of the 1980s, highlighting their
             innovations, popular mechanics, and enduring charm.
          3. **The Rise and Retro Revival of Pixel Art:** This article would delve into
             the evolution of pixel art as a defining visual style of early arcade games,
             tracing its roots, exploring its techniques, and discussing the resurgence
             of pixel art in modern games and digital art.

Table 7. An example of contextual prompting
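To close out this section, the sketch below shows one way to combine a role, task-specific context, and the current request into a single prompt string. The helper and its argument values are hypothetical and only mirror the structure of the examples above.

    # Hypothetical helper: combine a role, task-specific context, and the current request
    # into a single prompt string, mirroring the role/contextual split described above.
    def build_prompt(role, context, request):
        return f"{role}\n\nContext: {context}\n\n{request}"

    prompt = build_prompt(
        role="Act as a blog post writer.",
        context="You are writing for a blog about retro 80's arcade video games.",
        request=(
            "Suggest 3 topics to write an article about with a few lines of description "
            "of what this article should contain."
        ),
    )
    print(prompt)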