POPULARITY
Njuz POPkast #57: Series, films, books, music and gaming recommendations! Nenad reveals the hit game Clair Obscur, while the discussion turns to detectives (Ludwig), spies (Biró), bizarre killers and the new Snow White. In the newest, perhaps most chaotic episode yet, Jelisaveta shines with her memory (or is it clairvoyance?), Marko and Viktor still get lost in kinship terms (jetrva, pašenog, who could keep track?!), and we guide you through a labyrinth of new recommendations! Plus, everything about the Njuz shop and our upcoming live shows! We know you can't wait for the names and links, but you'll have to be patient for a bit and enjoy our guessing and occasional (okay, frequent) digressions!
Some of the remaining content highlights: - The Šmihel Primary School in Novo mesto is getting an annex with urgently needed additional classrooms and staff rooms - In Kočevje, the old library is going into renovation. - The Kranj utility company Komunala is starting construction of a modern administration and training centre - For Earth Day, a guided hike along the Mašun educational trail in the heart of the Snežnik-Javornik massif.
This episode of HoraBodo is a first-time project! Let's all make a radio drama together! The four actors (plus a few others) staying over at momi's place did an étude (improvised play) on a drunken impulse (+o+) Oh yes! ■ Members <in syllabary order> Board-game-loving woman and actor: Yu Uchida / Chagachaga Games and actor: Chihiro Enoki / Korokoro-do and actor: Yasuhiko Tanabe / Group SNE helper and actor: Kyoko Hira / Arclight editor: hashimoto / Host: momi. Even the members who looked utterly reluctant before recording got completely into it once things got going, and before we knew it..
In our country, more than 50% of drinking-water needs are met from karst aquifers. Research into the karst underground is therefore not merely a matter of scientific curiosity but, to a large degree, a vital interest of the region and its people. Physicist Dr. Franci Gabrovšek, a researcher at the Karst Research Institute ZRC SAZU in Postojna, has for many years been studying the "karst hinterland of the springs of the Ljubljanica". Covering roughly 1000 square kilometres (a twentieth of the country), the underground world of the river of seven names represents for karstologists, beyond its water-protection importance, a purely scientific challenge: a territory stretching from the slopes of Snežnik to the Ljubljana basin. For the programme, Gabrovšek summarised, as he did in a lecture on the topic a few years ago, the karstologists' fundamental findings from years of monitoring the underground flows, which rise a few kilometres from the Adriatic Sea and, via the Sava and the Danube, empty into the Black Sea. PHOTO: Part of the Ljubljanica river basin SOURCE: https://metadata.izrk.zrc-sazu.si/srv/api/records/c5105cb7-fed4-496a-8e73-96d5f43ab025
The Palestinian extremist movement Hamas has released three Israeli hostages in exchange for the release of 369 Palestinian prisoners held in Israeli jails. Despite the exchange, the situation remains tense. Other topics: - Peace talks on Ukraine expected to go ahead without the EU at the table - Serbian students have shut down Kragujevac; President Vučić calls on them to end the protests - Avalanches have caught four people; conditions in the mountains are very demanding
The European Union regrets the decision of the US authorities to impose tariffs on imports from Canada, Mexico and China. Brussels warns that the bloc will respond decisively if US President Donald Trump carries out his threat and imposes tariffs on imports from the Union. The French authorities, among others, have called for a firm European stance against Trump, while the British have said that Trump risks a genuinely damaging impact on the global economy. Other highlights of the programme: - Israeli Prime Minister Netanyahu heads to Washington for talks on the second phase of the Gaza ceasefire. - A fire-safety system will soon be installed in the Ivan Cankar student dormitory in Ljubljana, which has reopened its doors after the fire. - Will the outbuildings of Snežnik Castle finally get a tenant?
ENDLICH FREUtag
Empty nose syndrome (French: syndrome du nez vide, SNE) is a rare but debilitating condition that typically arises after nasal surgery, in particular a partial or complete turbinectomy performed to treat chronic nasal obstruction. The nasal turbinates, structures inside the nose, play an essential role in warming, humidifying and filtering inhaled air. Their excessive or inappropriate removal can disrupt these functions, giving rise to the syndrome. People affected report paradoxical symptoms: despite a structurally open, sometimes overly cleared nose, they experience a subjective feeling of nasal blockage or lack of air. This is explained by a dysfunction of the nose's sensory and nervous mechanisms, combined with disturbed airflow. In other words, the brain misperceives the air moving through the nasal passages, producing a feeling of suffocation or "air hunger". Common symptoms include intense nasal dryness, painful crusting, recurrent infections, impaired sense of smell and difficulty breathing, even in the absence of any physical obstruction. These problems often have significant psychological consequences, notably anxiety, depression and a marked reduction in quality of life. Diagnosis is complex, since it rests largely on the symptoms reported by the patient, which can seem subjective. Physical examinations or nasal scans sometimes show an anatomically normal or open nose, which can make recognition of the condition harder. Treatment options are limited and aim mainly to relieve symptoms. They include regular hydration of the nasal passages, lubricating sprays or gels, and sometimes surgery to rebuild or fill the nasal void with grafts or implants.
Drug treatments such as antidepressants or anxiolytics may be offered to help manage the psychological aspects of the illness. The syndrome highlights the importance of a cautious approach to nasal surgery and of careful evaluation of patients before any intervention. Raising awareness among health professionals and patients is essential to minimise the risks and better manage this disabling condition. Hosted by Acast. Visit acast.com/privacy for more information.
The snowman gives up his nose to a hungry little sparrow. Narrated by: Tomaž Pipan. Written by: Svetoslav Konstantinov Minkov. Translated by: France Bevk. Recorded in the studios of Radiotelevizija Ljubljana, 1983.
In line with Vzajemna's conversion into a joint-stock company, it will tomorrow begin informing its members about the calculation of their shares. This concerns more than 831,000 people, roughly half of whom are shareholders. The money is expected to be paid out by the end of the year. Also in the programme: - A ceasefire for Gaza is, according to all parties involved, only a step away. - Remediation of the Rakovnik landfill in the municipality of Šmartno pri Litiji will be completed by the end of May. - The outbuildings of Snežnik Castle, renovated 20 years ago, are still without a tenant.
Westmarkt: depending on your perspective, a 'safe space' or a 'trouble spot' district in a big city between the Rhine and the Ruhr. Nizar and Snežana made it out: he as a private investigator, she as a police officer. But the threads of Westmarkt's webs keep leading them back. Private investigator Nizar is actually supposed to be uncovering a darknet drug deal; a teenager has died. But the sudden appearance of his son throws him off course. | By Selim Özdogan and Denis Kundic | With Taner Sahintürk, Mateja Meded, Eko Fresh, Denis Moschitto and others | Directed by: Volkan T. Error | WDR 2024 | Podcast tip: Der Mann, der Hunde liebte: https://1.ard.de/der-mann-der-hunde-liebte
Was the old woman worried about Snežica, or was she just very curious? So curious that she broke her promise. What happened you will learn in the Japanese fairy tale The White Crane. A few possibly unfamiliar words: a crane is a large bird with a long neck, long bare legs and usually grey plumage, though in this fairy tale its plumage is snow-white. A sheaf is a larger bundle of harvested grain or straw; in this fairy tale it is a bundle of branches, as many as one can carry. A loom is a device for weaving cloth from yarn. Yarn is a thread spun from shorter or longer fibres of natural or artificial origin. Listen in. Source: Zlata ptica, Najlepša pahljača, Japanese fairy tales, translated by Zdenka Škerlj Jerman, Mladinska knjiga 1996, Ljubljana, read by Nataša Holy
BRASLOVČE, MURSKA SOBOTA, ILIRSKA BISTRICA, KOČEVJE, ŽIROVNICA, LJUBLJANA - In the second January edition of "Iz naših krajev" you could hear that construction of a health centre will begin in the municipality of Braslovče in the spring. We also reported on the construction of non-profit housing in the former premises of the school of economics in Murska Sobota, and that a civic initiative to revive the outbuildings of Snežnik Castle came to life with the new year. Among other things, we discussed plans for the further development of the Kočevje municipality and the special inclined wind tunnel being built in Žirovnica, and at the close of the programme we looked to Ljubljana, where an ice rink will open on Pogačarjev trg on Monday.
Following the death of former US President Jimmy Carter, expressions of condolence are pouring in, along with assessments of his complex legacy. As Carter's biographer Jonathan Alter put it, one must distinguish between his political and his humanitarian legacy. In the United States Carter failed politically, yet he has notable foreign-policy achievements, for example the peace agreement between Israel and Egypt. As Alter adds, Carter strove hard for lasting peace, in which, understandably, he did not always succeed. Other topics: - This year's inflation stands at 1.9 per cent. - The Snežnik civic initiative submits signatures against the installation of wind turbines in the municipality of Ilirska Bistrica. - A combined total of more than 17 years in prison for the parents of the child who killed 9 people in the school shooting in Belgrade.
Following the Vatican, the Church in Slovenia has also entered the holy year. In his message for the World Day of Peace, Pope Francis calls for cultural and structural changes in society. Annual inflation stands at 1.9 per cent this month; motor fuel prices rise at midnight. Former US President Carter has died; many remember him as a voice of peace and humanitarianism. Milanović and Primorac advance to the second round of the Croatian presidential election. The Snežnik civic initiative has handed the Ministry of Natural Resources and Spatial Planning a thousand signatures against wind farms in Ilirska Bistrica. In Gaza, people are facing severe cold on top of the war. The Krško nuclear power plant moves into a study of possibly extending operation to 80 years. SPORT: The start of the New Year's ski-jumping tournament belonged to the Austrians; Kraft won, Lanišek finished 15th. Weather: Tomorrow low cloud may form along the coast; elsewhere it will remain mostly clear.
Like any grumps, we often ruminate on the past and what it has (and hasn't) brought us. After the closing of the old White Wolf forums post-Time of Judgment, those in search of community found our way to various places on the internet, one of the most noteworthy of which for the two of us was Shadow'n'Essence, a message board forum with extensive space for each of the WoD lines (and more!) in the 2000s. You may have heard us wax poetic about it in our online games episode a couple weeks ago. The vibrant atmosphere and creative energy among the Changeling fans in particular is something still fondly remembered, and sometimes people turn up years later to talk about it—like Chris Bern (aka Swordsman), former coordinator (aka Vilicus) of the Changeling section of the site, who joins us for this minisode walk down memory lane. Tune in to hear us talk about things as they once were, and if you were on SnE, we'd be delighted to hear from you...! Chris' repository of Changeling bits from the days of yore can be found at: https://theshepherdsfreehold.wordpress.com/. And as for us, it's the usual mishmash of places: Discord: https://discord.me/ctp Email: podcast@changelingthepodcast.com Facebook: https://www.facebook.com/profile.php?id=100082973960699 Mastodon: https://dice.camp/@ChangelingPod Patreon: https://www.patreon.com/changelingthepodcast YouTube: https://www.youtube.com/@ChangelingThePodcast your hosts Josh Hillerup (zenten) absconded with a Glamour slushie machine and the gods of internet past can't have it back. Pooka G (AlecRavager) will use however many Chronos Unleashings it takes to stir the slumbering PHPbeast. Should auld acquaintance be forgot, And never brought to min'? Should auld acquaintance be forgot, And days o' auld lang syne? —traditional, adapted by Robert Burns
"Yes, our children really are inconvenient," say mothers raising children with severe multiple disabilities. "The word 'inconvenient' accompanies us every day and everywhere we go with our children, whatever we do. They are inconvenient to themselves, inconvenient to the people around them who see them, and our own lives are inconvenient too." Four mothers raising such children founded the fund "Nepatogūs" ("The Inconvenient"). Three of them, Henrika Varnienė, Snežana Glebovė and Laima Ingileikienė, take part in this programme marking the International Day of Persons with Disabilities. Severe multiple disability is a topic still little known to the wider public. The guests share their personal stories, describing the problems and joys they and their children encounter, and explain why it was important to establish a fund to care for a dignified and comfortable life for children with severe multiple disabilities once they have grown up, entirely regardless of their physical or intellectual abilities. "Our children also need communication, love, attention, friends. Our children need knowledge, education, discovery." "Every mother raising a child with a severe disability learns to adapt, and six hours of sleep a night is a treasure." "We invite other mothers raising 'inconvenient' children to join us; this really is a new service, not a science-fiction film," say the women taking part. Presenter: Žydrė Gedrimaitė
For the past decade or so, the comics (BD) group of the Syndicat national de l'édition (SNE) has been organising "BD en Régions" meetings across France. This new edition takes place in partnership with the City of Le Havre and Normandie Livre & Lecture, and with the support of SOFIA. On the occasion of Un Automne à buller*, the Oscar Niemeyer library will host four round tables on Friday 15 November 2024, intended to shed light on the specific role of comics and manga in access to reading and knowledge. Considered the favourite literary genres of young French readers, comics and manga offer a broad spectrum of narrative techniques and graphic universes that have now earned them a prominent place in the fields of literature and thought. The power of drawing in storytelling: Mayana Itoïz, comics author; Frédéric Lavabre, director of Éditions Sarbacane; Elvire de Cock, colourist; Nicolas Forsans, editor, Éditions Glénat. Moderator: Jean-Noël Lafargue, art-school lecturer
For the past decade or so, the comics (BD) group of the Syndicat national de l'édition (SNE) has been organising "BD en Régions" meetings across France. This new edition takes place in partnership with the City of Le Havre and Normandie Livre & Lecture, and with the support of SOFIA. On the occasion of Un Automne à buller*, the Oscar Niemeyer library will host four round tables on Friday 15 November 2024, intended to shed light on the specific role of comics and manga in access to reading and knowledge. Considered the favourite literary genres of young French readers, comics and manga offer a broad spectrum of narrative techniques and graphic universes that have now earned them a prominent place in the fields of literature and thought. How manga gets young people reading: Tiers Monde, manga author/mangaka; Pascal Lafine, editorial director, Éditions Delcourt; Fanny Berlin, cultural outreach officer, Pass Culture; Camille Vasseur, librarian. Moderator: Vincent Brunner, journalist
For the past decade or so, the comics (BD) group of the Syndicat national de l'édition (SNE) has been organising "BD en Régions" meetings across France. This new edition takes place in partnership with the City of Le Havre and Normandie Livre & Lecture, and with the support of SOFIA. On the occasion of Un Automne à buller*, the Oscar Niemeyer library will host four round tables on Friday 15 November 2024, intended to shed light on the specific role of comics and manga in access to reading and knowledge. Considered the favourite literary genres of young French readers, comics and manga offer a broad spectrum of narrative techniques and graphic universes that have now earned them a prominent place in the fields of literature and thought. Comics as an instrument of knowledge: Pascal Mériaux, co-creator of the Rendez-Vous de la Bande Dessinée d'Amiens and director of the pôle BD Hauts-de-France; Marie-Agnès Le Roux, editor, Éditions Futuropolis; Wilfrid Lupano, comics author; Sandrine Metterie, director of the Atelier Canopé du Havre. Moderator: Cécilia Le Métayer, head of the youth/comics department in Le Havre
Some of the programme's remaining content highlights: - Maribor's municipal budget is expected to reach record heights next year. - The first community solar power plant is being installed in Sežana. - Radovljica is increasingly establishing itself on the tourist map as a town of experiences. - The mountain hut on Snežnik is one of the few in the country whose supplies hikers carry up entirely on foot.
Game Market 2024 Autumn, held at Makuhari Messe for the first time, is coming up this weekend! In this last-minute episode of HoraBodo we unveil new releases! A two-player murder mystery from the circle "Curry Udon Club", a collaboration between Tanebukuro Renessa, who writes the original scenarios for Hana to Yume Web, and HoraBodo's own Chupami! On top of that, Subuta introduces the new Group SNE & B-CAFE releases! ■ Members Curry Udon Club: Chupami & Tanebukuro Renessa / Group SNE: Subuta / Host: momi ■ New releases ..
In this podcast episode, Roland and Anthony discuss the Godfather of AI, Geoffrey Hinton, who developed pivotal algorithms like backpropagation, contributed to neural visualization with t-SNE, and inspired a resurgence in neural networks with AlexNet's success. Hinton now voices concerns about AI's future, addressing both its potential benefits and its risks. They then turn to John von Neumann, the Godfather of Programming, whose vast impact spanned mathematics, the Manhattan Project, and game theory, but most importantly: the von Neumann computer hardware architecture. Read a transcript of this interview: https://bit.ly/3NGiTXb Subscribe to the Software Architects' Newsletter for your monthly guide to the essential news and experience from industry peers on emerging patterns and technologies: https://www.infoq.com/software-architects-newsletter Upcoming Events: QCon San Francisco (November 18-22, 2024) Get practical inspiration and best practices on emerging software trends directly from senior software developers at early adopter companies. https://qconsf.com/ QCon London (April 7-9, 2025) Discover new ideas and insights from senior practitioners driving change and innovation in software development. https://qconlondon.com/ Save the date: InfoQ Dev Summit Boston (June 9-10, 2025) Actionable insights on today's critical dev priorities. devsummit.infoq.com/conference/boston2025 The InfoQ Podcasts: Weekly inspiration to drive innovation and build great teams from senior software leaders. 
Listen to all our podcasts and read interview transcripts: - The InfoQ Podcast https://www.infoq.com/podcasts/ - Engineering Culture Podcast by InfoQ https://www.infoq.com/podcasts/#engineering_culture - Generally AI: https://www.infoq.com/generally-ai-podcast/ Follow InfoQ: - Mastodon: https://techhub.social/@infoq - Twitter: twitter.com/InfoQ - LinkedIn: www.linkedin.com/company/infoq - Facebook: bit.ly/2jmlyG8 - Instagram: @infoqdotcom - Youtube: www.youtube.com/infoq Write for InfoQ: Learn and share the changes and innovations in professional software development. - Join a community of experts. - Increase your visibility. - Grow your career. https://www.infoq.com/write-for-infoq
A tiny girl, so small and helpless that everyone tried to take advantage of her. But through her kindness she won a friend who helped her overcome her hardships and carried her off to a land where she could live happily. Do you know what a stubble field is? It is a field from which the grain has been harvested. Source: Hans Christian Andersen, Snežna kraljica in druge pravljice (The Snow Queen and Other Fairy Tales), translated by Rudolf Kresal, Mladinska knjiga, Ljubljana, 1953, read by Nataša Holy
We spoke with Sneška Quaedvlieg-Mihailović, secretary general of the Europa Nostra network, about the importance of the network and the upcoming European Heritage Summit, to be held in Bucharest from 6 to 8 October. And Tilla Rudel, director of the French Institute in Romania in Timișoara, told us about a new edition of the Night of Philosophy, taking place on Friday 20 September.
It may seem odd, since winter is still far off, but perhaps a little chill after a truly hot summer will do you good. This story by the famous Danish fairy-tale writer Hans Christian Andersen is dreamlike and thrilling, but also a little melancholy. The Snow Queen carries a boy off to her castle. His friend Gerda sets out to find him and bring him back. Director: Jože Vozny Translator, adapter and dramaturge: Djurdja Flere Sound engineer: Metka Rojc Composer of the original music: Marijan Vodopivec The Snow Queen – Štefka Drolc Kay – Iztok Čebular Gerda – Jana Osojnik Narrator – Rudi Kosmač Grandmother – Elvira Kralj Swallow – Mira Bedenk Old Woman – Duša Počkaj Rose Bush – Andrej Kurent Crow – Polde Bibič She-Crow – Iva Zupančič Prince – Danilo Benedičič Princess – Milena Zupančič Little Robber Girl – Alja Tkačev First Dove – Dušan Škedl Second Dove – Danilo Bezlaj Reindeer – Saša Miklavc Lapp Woman – Sava Sever Produced by the Drama Programme Editorial Office. Recorded in the studios of Radio Ljubljana, December 1968.
Slovenia has a very rich cave world. Part of it are snow caves, which hold ice several hundred or even a thousand years old. This trapped ice hides a wealth of data about the distant past: from the composition of the atmosphere to the types of pollen and the make-up of the soil. Dr. Jure Tičar, a research fellow at the Anton Melik Geographical Institute at ZRC SAZU, and his colleagues study numerous Slovenian caves, most recently Snežna jama on Raduha. Alongside traces of the past, the strong imprint of the present can also be felt there. Extreme summer downpours leave their mark underground, warm rain is melting the subterranean ice ever faster, and ubiquitous pollution can be felt in the caves as well. Three thousand of our caves, or 20 per cent, are already polluted.
The topic of this episode is setting healthy boundaries, which I discuss with my dear colleague Snežana, an expert on the subject. Almost always, when we feel excessive pressure or burnout, the cause is inadequately set boundaries. Snežana's Facebook profile: https://www.facebook.com/sneza.maksimovic.1 For personal or group coaching, get in touch via this link: https://prosperityplatform.my.canva.site/ #granice
SPECIAL OFFER: Join the BBZPoker Discord the day after the podcast goes live and get entered into a lottery for a free annual subscription! https://discord.gg/E9e7Jx49
Jordan "BBZ" Drummond has been a mainstay in the world of poker for over 15 years. He started playing poker in 2009 with $7, engaging in 10-cent 360-mans and 25-cent 45-mans, and impressively spun it up to $750k over the next five years by playing SNGs and grinding Supernova Elite four times. Simultaneously, he built a staking group that eventually included approximately 150 active players. Following the termination of SNE, Jordan focused on staking and coaching until 2019, when he began streaming. He played and streamed high-stakes MTTs for a year and a half to promote BBZPoker, a subscription-based coaching service, and obtained a sponsorship from partypoker, which he gave up when he decided to stop streaming full-time in 2021. As a seasoned professional, Jordan has not only achieved sustained success in high-stakes poker but has also made a lasting impact as a staker, coach, and mentor. His commitment to nurturing talent is evident in his mentorship of hundreds of students, guiding them to achieve their full potential in the competitive realm of poker. In 2012, Jordan founded BBZpoker.com, a platform that has grown into a leading provider of affordable and high-level poker training content. Under his leadership, BBZpoker.com plays a pivotal role in shaping the skills of countless poker enthusiasts, contributing to the overall growth and excellence of the poker community. Jordan "BBZ" Drummond's enduring influence in the poker world is a testament to his dedication, expertise, and passion for the game.
Here is what you can expect to hear in this week's episode:
0:00 Introduction
00:25 The Most Experienced Poker Coach
7:05 Easy to Choose Winners? Selecting Your Students
10:00 Helping with Life Leaks
12:57 Learning How a Spot Works – Eliminating the Tilt
16:26 Making Millionaires
24:58 Getting Screwed Over – A Predictable Pattern
32:02 Audit Mechanisms to Stop Cheating
43:53 Why Start Staking?
47:53 An Interest in Self-Improvement
57:47 Jordan's Reading List
1:03:12 Persistence and Durability – An Immense Skill for Poker and Life
1:07:59 Accountability Systems in Poker
Follow Jordan Drummond and BBZ Staking:
X (Formerly Twitter): @bbzpoker
Website: https://www.bbzpoker.com
YouTube: @BBZPoker
Join the Discord Server: https://discord.gg/E9e7Jx49
Follow "Jungleman" Dan Cates:
Instagram: @thedancates
X (Formerly Twitter): @junglemandan
Join Poker Academy today using this link: https://www.preflop.academy/?via=dan
Poker strategy tips, poker tournament highlights, poker player profiles, poker player rankings, poker coaching, poker mental game
The history of Serbs in Slovakia is tied to Austria-Hungary. In the 19th century, many Serbian scholars studied at the lyceums in Slovakia. A more distinct enclave formed here after the war in the Balkans, and today more than 1,000 inhabitants of the Slovak Republic declare Serbian nationality. The discussion meetings are organised by the civic association In Minorita in cooperation with the weekly .týždeň, with financial support from the Fund for the Support of the Culture of National Minorities. The discussion features Snežana Jović-Werner and Matea Bucalo, representatives of the Serbian national minority. Moderated by Zuza Kumanová and Michal Oláh.
Stacey and J Sbu recently received the following confession: "Hi Stacey and J Sbu, it's Sne. People are copying my mannerisms, words, outfits and even my major at university. It's unsettling and makes me want to withdraw from social media. This even happens at the office and my friend copies everything, including home decor." The dynamic duo asked KZN if they ever had any copycats and how they managed the situation. Locals phoned in, sent voice notes and messages, spilling all the tea about their copycats.
In Eurovision week, hosting of Radio Ga Ga - Nova generacija will be taken over by two legends of the domestic music scene, Jože Potrebuješ and Helena Blagne, who, alongside current events at home, will of course also comment on the performances on the biggest stage in the world. All the regular contributors will be with us too: from Franc and Rožmarin with traffic information, through Uroš Slak, who will talk about relationships with Dr. Snežič and Dr. Vodeb, to Stres and Kramberger, who will remember the collapsed Zvon in their spiritual reflection. In the new segment Šank Tank, well-known domestic businessmen will weigh up fresh business ideas, Robert will present a new tourist package for cruise lovers, we will learn everything about the Veselica Fajt Klub, and if any time remains, we will listen in on a meeting of the leadership of the new RTV project called "Sprava TV". All this, and probably more, on Friday at 10 a.m. on the First Programme of Radio Slovenia.
Ready to indulge in the culinary spectacle of MasterChef Australia? Then it's time to fire up the oven and don your kitchen apron, because among this year's 22 contestants is Snežana Čalić, proudly representing Serbian heritage. Nina Marković caught up with her to chat about the exciting journey ahead.
Snežana Barošević, long-time member and secretary of the choir "Sevdalinka", talks about their successful performances over the past year, the charity concert they are preparing for 20 April in Seaford, and their plans for the period ahead. The choir "Sevdalinka" carries across Australia the beauty of Bosnian-Herzegovinian folk song, the loveliest part of the priceless cultural treasure of the First Homeland. Withstanding time and all the trials its passage inevitably brings, the choir has not only survived but has improved its quality, turning itself into an institution of creativity and fellowship for its members. To all well-wishers the choir offers the true pleasure of good song as well as a friendly atmosphere. For many years, "Sevdalinka" was led by my late SBS colleague Amir Bukić.
Listen to the new episode of the podcast Iza vesti with Snežana Milivojević.
Difficulties with access to electricity have long been a problem in Ndjamena. Many feel the situation keeps getting worse, and fear difficult weeks ahead, as the hottest period of the year begins and Ramadan approaches. From our special correspondent in Ndjamena: The air stirred by the fan in Khonon's small living room is a relief, with the temperature above 40°C in the Madagascar district of Walia. This small comfort the karate teacher owes to no one but himself. He saved for a long time for a solar installation. "It's hard to get the money. Around here, work is not easy to find. Electricity here is already a luxury. Even if you have electricity, at the end of the month, say, you get a bill you can't even pay, and you still don't have power. So the best thing is to have panels; then you have peace of mind," explains Khonon. Also listen: In Ndjamena, power cuts are blighting residents' daily lives. No power despite the works: In the middle of the street stand poles and electric cables installed four months ago by the Société nationale d'électricité. The residents chipped in for the installation, but there is still no power, explains Jacques, Khonon's nephew, who lives just across the street. "The cables have been laid, but so far the SNE has not come back to connect each compound. We bent over backwards to raise tens of millions [of CFA francs] and put up dozens of poles, and we are still without power," Jacques notes. A student, he has managed to buy a panel with his odd jobs, but not yet the battery that would let him work after sunset. "[For] studies, you also have to do research on the side, write assignments, papers, presentations...
And electricity is a luxury, as we said. It's not easy to have electricity, even during the day. Even at the university there isn't electricity every day; you have to bring 100 or 200 francs. Charging at the kiosk is becoming too expensive and too much of a burden for me. I told myself I had to save little by little; we'll tighten our belts, we'll even go to bed hungry, just to have electricity," says the student. Also listen: Chad: overly frequent power cuts exasperate the population. Improvements in the coming weeks? Population growth and urban sprawl, too few generators, breakdowns and maintenance, poor management and fuel theft... the causes of the load-shedding are many. The authorities have been promising solutions for years. Louise Ndougonna Mbakasse Riradjim, Minister of Energy since January, acknowledges this, but she hopes for improvements in the coming weeks. "There are two big projects, one at Gassi and another at Djermaya. Together they are to produce 62 megawatts, a third of the city of Ndjamena's electricity needs. In March it is not possible to have them; it will be in April, in the middle of the heatwave," the minister says. According to her, several partnerships with private companies are being finalised to produce additional electricity, notably through solar farms.
Snowflakes like to do more than just fall thick and fast! Narrated by: Nina Valič. Written by: Alenka Juvan. Recorded in the studios of Radio Slovenija, 2006.
Sony recently shared their financial results in an earnings report. CEO Hiroki Totoki delineated his vision for the company during the call, affirming, "I would like to demonstrate leadership and create a harmonized approach towards achieving overall growth, sustainable profitability, and increasing margins. We are focused on optimizing our business, improving cost efficiency, and driving profitability in both the gaming and semiconductor segments. For gaming, we will continue to prioritize user engagement and strike a balance between hardware sales, network services, and first-party titles. In the semiconductor segment, we are committed to managing costs and investments in order to improve profit margins while meeting market demands for higher functionality and performance. Our goal is to drive profitability and deliver sustainable returns to our shareholders." An analysis of Sony's business nuances reflects a correlation between their recent operations and the principles emphasized in their latest earnings call. Sony's success seems to largely reside in the gaming sector, specifically tied to the increasing demand for PS5 games, which Sony emphasized during the call. Sony's development of a new PlayStation controller designed for gamers with physical disabilities demonstrates their commitment to enhancing accessibility and user engagement, which was previously highlighted as a strategic focus. Additionally, the introduction of a new home theater system is aligned with Sony's commitment to high-quality entertainment, enhancing their focus on content creation. Sony's strategic move into the mobile gaming sector with PlayStation Studios Mobile Division mirrors their ambition to explore new revenue opportunities and growth markets, as conveyed during the call. Similarly, their acquisition of Savage Game Studios points towards a focus on content development and a penchant towards the expanding mobile gaming industry. 
Later in the call, Totoki delved into Monthly Active User (MAU) numbers, attributing the increase to seasonality: "Regarding MAU, for one thing, there is seasonality. The third quarter is the holiday season. So there's seasonality factor. And then free-to-play titles, we had the big hits. So we are enjoying benefits from that. These are the 2 major factors drivers for increasing the MAU." Based on the information presented in the earnings call, Sony's recent business decisions appear strategically aligned with the broader vision the company conveyed. Its continued efforts to explore thriving gaming trends, enhance user engagement, and move into growth markets show an enterprise-wide commitment to innovation. These initiatives and future performance should nonetheless be weighed against ordinary market dynamics and company-specific risks, as their actual impact may vary. Ongoing scrutiny and thorough analysis of their execution should give a clearer trajectory for the company in the near future. The stated commitment to cost management, user engagement, and strategic growth, while promising, will need clean execution and market acceptance to fully contribute to Sony's goal of driving profitability and sustainable returns to shareholders. SNE Company info: https://finance.yahoo.com/quote/SNE/profile For more PSFK research: www.psfk.com This email has been published and shared for the purpose of business research and is not intended as investment advice.
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Investigating Bias Representations in LLMs via Activation Steering, published by DawnLu on January 15, 2024 on The AI Alignment Forum. Produced as part of the SPAR program (fall 2023) under the mentorship of Nina Rimsky. Introduction Given recent advances in the AI field, it's highly likely LLMs will be increasingly used to make decisions with broad societal impact - such as resume screening, college admissions, criminal justice, etc. It will therefore become imperative to ensure these models don't perpetuate harmful societal biases. One way we can evaluate whether a model is likely to exhibit biased behavior is via red-teaming. Red-teaming is the process of "attacking" or challenging a system from an adversarial lens with the ultimate goal of identifying vulnerabilities. The underlying premise is that if small perturbations of the model can result in undesired behaviors, then the model is not robust. In this research project, I evaluate the robustness of Llama-2-7b-chat along different dimensions of societal bias by using activation steering. This can be viewed as a diagnostic test: if we can "easily" elicit biased responses, then this suggests the model is likely unfit to be used for sensitive applications. Furthermore, experimenting with activation steering enables us to investigate and better understand how the model internally represents different types of societal bias, which could help in designing targeted interventions (e.g. fine-tuning signals of a certain type). Methodology & data Activation steering (also known as representation engineering) is a method used to steer an LLM's response towards or away from a concept of interest by perturbing the model's activations during the forward pass.
I perform this perturbation by adding a steering vector to the residual stream at some layer (at every token position after an initial prompt). The steering vector is computed by taking the average difference in residual stream activations between pairs of biased (stereotype) and unbiased (anti-stereotype) prompts at that layer. By taking the difference between paired prompts, we can effectively remove contextual noise and retain only the "bias" direction. This approach to activation steering is known as Contrastive Activation Addition [1]. For the data used to generate the steering vectors, I used the StereoSet dataset, a large-scale natural English dataset intended to measure stereotypical biases across various domains. In addition, I wrote a custom set of gender-bias prompts and used ChatGPT-4 to generate similar examples. I then re-formatted all these examples into multiple-choice A/B questions (gender data available here and StereoSet data here). In the example below, by appending (A) to the prompt, we can condition the model to behave in a biased way, and vice versa. A notebook to generate the steering vectors can be found here, and a notebook to get steered responses here. Activation clusters With the StereoSet data and custom gender-bias prompts, I was able to focus on three dimensions of societal bias: gender, race, and religion. The graphs below show a t-SNE projection of the activations for the paired prompts. We see relatively good separation between the stereotype & anti-stereotype examples, especially for gender and race. This provides some confidence that the steering vectors constructed from these activations will be effective. Notice that the race dataset has the largest sample size. Steered responses For the prompts used to evaluate the steering vectors, I chose this template, which was presented in a paper titled On Biases in Language Generation [2].
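The mean-difference construction described above can be sketched in a few lines of numpy. This is an illustrative sketch, not the author's notebook code: the function names, array shapes, and random "activations" are all assumptions made for the example.

```python
import numpy as np

def steering_vector(biased_acts, unbiased_acts):
    """Contrastive Activation Addition: average difference between
    residual-stream activations of paired biased/unbiased prompts.

    biased_acts, unbiased_acts: (n_pairs, d_model) arrays taken at the
    same layer and token position for each contrastive pair.
    """
    return np.mean(biased_acts - unbiased_acts, axis=0)

def steer(residual, vec, multiplier=1.0):
    """Add the steering vector at every token position of the residual
    stream; a negative multiplier steers away from the concept."""
    # residual: (seq_len, d_model)
    return residual + multiplier * vec

# Toy illustration with random stand-ins for real model activations.
rng = np.random.default_rng(0)
biased = rng.normal(size=(8, 16)) + 1.0   # shifted along some direction
unbiased = rng.normal(size=(8, 16))
vec = steering_vector(biased, unbiased)
steered = steer(rng.normal(size=(4, 16)), vec, multiplier=-1.0)
```

In a real run, the arrays would come from hooking the residual stream of Llama-2-7b-chat at the chosen layer; pairing prompts and differencing is what cancels the shared context and leaves the bias direction.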
For comparison purposes, I first obtained the original responses from Llama 2-7B (without any steering). There are two key callouts: (1) the model is already biased on the gender ...
In the first New Year's episode of Vertigo, the film podcast of the daily SME, we offer tips on films you can currently find in our cinemas and on streaming services. We review new releases such as Ferrari, Coup de Chance, Next Goal Wins and Club Zero, and also look at the Netflix titles Maestro and Society of the Snow. And of course we return to the Golden Globe awards. (00:00) Intro (00:32) Ferrari (08:13) The First Day of My Life / Il primo giorno della mia vita (12:27) Coup de Chance (18:31) Club Zero (26:07) Next Goal Wins (31:19) Maestro (Netflix) (37:32) Fargo, season 5 (Hulu) (48:11) Chicken Run: Dawn of the Nugget (Netflix) (53:21) The Golden Globe Awards (57:44) Outro _ If you'd like to write to us, reach out at vertigo@sme.sk _ Thank you for listening to the Vertigo podcast and taking an interest in the world of film
We are running an end-of-year listener survey! Please let us know any feedback you have, what episodes resonated with you, and guest requests for 2024! Survey link here. NeurIPS 2023 took place from Dec 10–16 in New Orleans. The Latent Space crew was onsite for as many of the talks and workshops as we could attend (and more importantly, hosted cocktails and parties after hours)! Picking from the 3586 papers accepted to the conference (available online, full schedule here) is an impossible task, but we did our best to present an audio guide with brief commentary on each. We also recommend MLContests.com's NeurIPS recap and Seb Ruder's NeurIPS primer, and we found the VizHub guide useful for a t-SNE clustering of papers. We'll start with the NeurIPS Best Paper Awards, then move to a selection of non-awarded but highly influential papers, and finish with arbitrary personal picks to round out the selection. Where we were able to do a poster session interview, please scroll to the relevant show notes for images of their poster for discussion. We give Chris Ré the last word, as the Mamba and StripedHyena state space models are drawing particular excitement but are still too early to assess for impact.
Timestamps:
* [0:01:19] Word2Vec (Jeff Dean, Greg Corrado)
* [0:15:28] Emergence Mirage (Rylan Schaeffer)
* [0:28:48] DPO (Rafael Rafailov)
* [0:41:36] DPO Poster Session (Archit Sharma)
* [0:52:03] Datablations (Niklas Muennighoff)
* [1:00:50] QLoRA (Tim Dettmers)
* [1:12:23] DataComp (Samir Gadre)
* [1:25:38] DataComp Poster Session (Samir Gadre, Alex Dimakis)
* [1:35:25] LLaVA (Haotian Liu)
* [1:47:21] LLaVA Poster Session (Haotian Liu)
* [1:59:19] Tree of Thought (Shunyu Yao)
* [2:11:27] Tree of Thought Poster Session (Shunyu Yao)
* [2:20:09] Toolformer (Jane Dwivedi-Yu)
* [2:32:26] Voyager (Guanzhi Wang)
* [2:45:14] CogEval (Ida Momennejad)
* [2:59:41] State Space Models (Chris Ré)
Papers covered:
* Distributed Representations of Words and Phrases and their Compositionality (Word2Vec). Tomas Mikolov · Ilya Sutskever · Kai Chen · Greg Corrado · Jeff Dean. The recently introduced continuous Skip-gram model is an efficient method for learning high-quality distributed vector representations that capture a large number of precise syntactic and semantic word relationships. In this paper we present several improvements that make the Skip-gram model more expressive and enable it to learn higher quality vectors more rapidly. We show that by subsampling frequent words we obtain significant speedup, and also learn higher quality representations as measured by our tasks. We also introduce Negative Sampling, a simplified variant of Noise Contrastive Estimation (NCE) that learns more accurate vectors for frequent words compared to the hierarchical softmax. An inherent limitation of word representations is their indifference to word order and their inability to represent idiomatic phrases. For example, the meanings of "Canada" and "Air" cannot be easily combined to obtain "Air Canada".
Motivated by this example, we present a simple and efficient method for finding phrases, and show that their vector representations can be accurately learned by the Skip-gram model.* Are Emergent Abilities of Large Language Models a Mirage? (Schaeffer et al.). Emergent abilities are abilities that are present in large-scale models but not in smaller models and are hard to predict. Rather than being a product of models' scaling behavior, this paper argues that emergent abilities are mainly an artifact of the choice of metric used to evaluate them. Specifically, nonlinear and discontinuous metrics can lead to sharp and unpredictable changes in model performance. Indeed, the authors find that when accuracy is changed to a continuous metric for arithmetic tasks where emergent behavior was previously observed, performance improves smoothly instead. So while emergent abilities may still exist, they should be properly controlled and researchers should consider how the chosen metric interacts with the model.* Direct Preference Optimization: Your Language Model is Secretly a Reward Model (Rafailov et al.)* While large-scale unsupervised language models (LMs) learn broad world knowledge and some reasoning skills, achieving precise control of their behavior is difficult due to the completely unsupervised nature of their training. Existing methods for gaining such steerability collect human labels of the relative quality of model generations and fine-tune the unsupervised LM to align with these preferences, often with reinforcement learning from human feedback (RLHF). However, RLHF is a complex and often unstable procedure, first fitting a reward model that reflects the human preferences, and then fine-tuning the large unsupervised LM using reinforcement learning to maximize this estimated reward without drifting too far from the original model. 
* In this paper, we leverage a mapping between reward functions and optimal policies to show that this constrained reward maximization problem can be optimized exactly with a single stage of policy training, essentially solving a classification problem on the human preference data. The resulting algorithm, which we call Direct Preference Optimization (DPO), is stable, performant, and computationally lightweight, eliminating the need for fitting a reward model, sampling from the LM during fine-tuning, or performing significant hyperparameter tuning. * Our experiments show that DPO can fine-tune LMs to align with human preferences as well as or better than existing methods. Notably, fine-tuning with DPO exceeds RLHF's ability to control sentiment of generations and improves response quality in summarization and single-turn dialogue while being substantially simpler to implement and train.* Scaling Data-Constrained Language Models (Muennighoff et al.)* The current trend of scaling language models involves increasing both parameter count and training dataset size. Extrapolating this trend suggests that training dataset size may soon be limited by the amount of text data available on the internet. Motivated by this limit, we investigate scaling language models in data-constrained regimes. Specifically, we run a large set of experiments varying the extent of data repetition and compute budget, ranging up to 900 billion training tokens and 9 billion parameter models. We find that with constrained data for a fixed compute budget, training with up to 4 epochs of repeated data yields negligible changes to loss compared to having unique data. However, with more repetition, the value of adding compute eventually decays to zero. We propose and empirically validate a scaling law for compute optimality that accounts for the decreasing value of repeated tokens and excess parameters. 
Finally, we experiment with approaches mitigating data scarcity, including augmenting the training dataset with code data or removing commonly used filters. Models and datasets from our 400 training runs are freely available at https://github.com/huggingface/datablations.* QLoRA: Efficient Finetuning of Quantized LLMs (Dettmers et al.). * This paper proposes QLoRA, a more memory-efficient (but slower) version of LoRA that uses several optimization tricks to save memory. They train a new model, Guanaco, fine-tuned on only a single GPU for 24h, that outperforms previous models on the Vicuna benchmark. Overall, QLoRA enables fine-tuning LLMs with far less GPU memory. Concurrently, other methods such as 4-bit LoRA quantization have been developed that achieve similar results.* DataComp: In search of the next generation of multimodal datasets (Gadre et al.)* Multimodal datasets are a critical component in recent breakthroughs such as CLIP, Stable Diffusion and GPT-4, yet their design does not receive the same research attention as model architectures or training algorithms. To address this shortcoming in the machine learning ecosystem, we introduce DataComp, a testbed for dataset experiments centered around a new candidate pool of 12.8 billion image-text pairs from Common Crawl. Participants in our benchmark design new filtering techniques or curate new data sources and then evaluate their new dataset by running our standardized CLIP training code and testing the resulting model on 38 downstream test sets. * Our benchmark consists of multiple compute scales spanning four orders of magnitude, which enables the study of scaling trends and makes the benchmark accessible to researchers with varying resources. Our baseline experiments show that the DataComp workflow leads to better training sets.
Our best baseline, DataComp-1B, enables training a CLIP ViT-L/14 from scratch to 79.2% zero-shot accuracy on ImageNet, outperforming OpenAI's CLIP ViT-L/14 by 3.7 percentage points while using the same training procedure and compute. We release DataComp and all accompanying code at www.datacomp.ai.* Visual Instruction Tuning (Liu et al)* Instruction tuning large language models (LLMs) using machine-generated instruction-following data has improved zero-shot capabilities on new tasks, but the idea is less explored in the multimodal field. In this paper, we present the first attempt to use language-only GPT-4 to generate multimodal language-image instruction-following data. * By instruction tuning on such generated data, we introduce LLaVA: Large Language and Vision Assistant, an end-to-end trained large multimodal model that connects a vision encoder and LLM for general-purpose visual and language understanding.* Our early experiments show that LLaVA demonstrates impressive multimodal chat abilities, sometimes exhibiting the behaviors of multimodal GPT-4 on unseen images/instructions, and yields an 85.1% relative score compared with GPT-4 on a synthetic multimodal instruction-following dataset. When fine-tuned on Science QA, the synergy of LLaVA and GPT-4 achieves a new state-of-the-art accuracy of 92.53%. We make GPT-4 generated visual instruction tuning data, our model and code base publicly available.* Tree of Thoughts: Deliberate Problem Solving with Large Language Models (Yao et al)* Language models are increasingly being deployed for general problem solving across a wide range of tasks, but are still confined to token-level, left-to-right decision-making processes during inference. This means they can fall short in tasks that require exploration, strategic lookahead, or where initial decisions play a pivotal role.
* To surmount these challenges, we introduce a new framework for language model inference, Tree of Thoughts (ToT), which generalizes over the popular Chain of Thought approach to prompting language models, and enables exploration over coherent units of text (thoughts) that serve as intermediate steps toward problem solving. * ToT allows LMs to perform deliberate decision making by considering multiple different reasoning paths and self-evaluating choices to decide the next course of action, as well as looking ahead or backtracking when necessary to make global choices.* Our experiments show that ToT significantly enhances language models' problem-solving abilities on three novel tasks requiring non-trivial planning or search: Game of 24, Creative Writing, and Mini Crosswords. For instance, in Game of 24, while GPT-4 with chain-of-thought prompting only solved 4% of tasks, our method achieved a success rate of 74%. * Code repo with all prompts: https://github.com/princeton-nlp/tree-of-thought-llm.* Toolformer: Language Models Can Teach Themselves to Use Tools (Schick et al)* LMs exhibit remarkable abilities to solve new tasks from just a few examples or textual instructions, especially at scale. They also, paradoxically, struggle with basic functionality, such as arithmetic or factual lookup, where much simpler and smaller specialized models excel. * In this paper, we show that LMs can teach themselves to use external tools via simple APIs and achieve the best of both worlds. * We introduce Toolformer, a model trained to decide which APIs to call, when to call them, what arguments to pass, and how to best incorporate the results into future token prediction. * This is done in a self-supervised way, requiring nothing more than a handful of demonstrations for each API. We incorporate a range of tools, including a calculator, a Q&A system, a search engine, a translation system, and a calendar. 
* Toolformer achieves substantially improved zero-shot performance across a variety of downstream tasks, often competitive with much larger models, without sacrificing its core language modeling abilities.* Voyager: An Open-Ended Embodied Agent with Large Language Models (Wang et al)* We introduce Voyager, the first LLM-powered embodied lifelong learning agent in Minecraft that continuously explores the world, acquires diverse skills, and makes novel discoveries without human intervention. Voyager consists of three key components: * 1) an automatic curriculum that maximizes exploration, * 2) an ever-growing skill library of executable code for storing and retrieving complex behaviors, and * 3) a new iterative prompting mechanism that incorporates environment feedback, execution errors, and self-verification for program improvement. * Voyager interacts with GPT-4 via blackbox queries, which bypasses the need for model parameter fine-tuning. The skills developed by Voyager are temporally extended, interpretable, and compositional, which compounds the agent's abilities rapidly and alleviates catastrophic forgetting. Empirically, Voyager shows strong in-context lifelong learning capability and exhibits exceptional proficiency in playing Minecraft. It obtains 3.3x more unique items, travels 2.3x longer distances, and unlocks key tech tree milestones up to 15.3x faster than prior SOTA. Voyager is able to utilize the learned skill library in a new Minecraft world to solve novel tasks from scratch, while other techniques struggle to generalize.Voyager discovers new Minecraft items and skills continually by self-driven exploration, significantly outperforming the baselines.* Evaluating Cognitive Maps and Planning in Large Language Models with CogEval (Momennejad et al)* Recently an influx of studies claims emergent cognitive abilities in large language models (LLMs). 
Yet, most rely on anecdotes, overlook contamination of training sets, or lack systematic evaluation involving multiple tasks, control conditions, multiple iterations, and statistical robustness tests. Here we make two major contributions. * First, we propose CogEval, a cognitive science-inspired protocol for the systematic evaluation of cognitive capacities in LLMs. The CogEval protocol can be followed for the evaluation of various abilities. * Second, here we follow CogEval to systematically evaluate cognitive maps and planning ability across eight LLMs (OpenAI GPT-4, GPT-3.5-turbo-175B, davinci-003-175B, Google Bard, Cohere-xlarge-52.4B, Anthropic Claude-1-52B, LLaMA-13B, and Alpaca-7B). We base our task prompts on human experiments, which offer both established construct validity for evaluating planning, and are absent from LLM training sets.* We find that, while LLMs show apparent competence in a few planning tasks with simpler structures, systematic evaluation reveals striking failure modes in planning tasks, including hallucinations of invalid trajectories and falling into loops. These findings do not support the idea of emergent out-of-the-box planning ability in LLMs. This could be because LLMs do not understand the latent relational structures underlying planning problems, known as cognitive maps, and fail at unrolling goal-directed trajectories based on the underlying structure. Implications for application and future directions are discussed.* Mamba: Linear-Time Sequence Modeling with Selective State Spaces (Albert Gu, Tri Dao)* Foundation models, now powering most of the exciting applications in deep learning, are almost universally based on the Transformer architecture and its core attention module.
Many subquadratic-time architectures such as linear attention, gated convolution and recurrent models, and structured state space models (SSMs) have been developed to address Transformers' computational inefficiency on long sequences, but they have not performed as well as attention on important modalities such as language. We identify that a key weakness of such models is their inability to perform content-based reasoning, and make several improvements. * First, simply letting the SSM parameters be functions of the input addresses their weakness with discrete modalities, allowing the model to selectively propagate or forget information along the sequence length dimension depending on the current token. * Second, even though this change prevents the use of efficient convolutions, we design a hardware-aware parallel algorithm in recurrent mode. We integrate these selective SSMs into a simplified end-to-end neural network architecture without attention or even MLP blocks (Mamba). * Mamba enjoys fast inference (5x higher throughput than Transformers) and linear scaling in sequence length, and its performance improves on real data up to million-length sequences. As a general sequence model backbone, Mamba achieves state-of-the-art performance across several modalities such as language, audio, and genomics. On language modeling, our Mamba-1.4B model outperforms Transformers of the same size and matches Transformers twice its size, both in pretraining and downstream evaluation.* Get full access to Latent Space at www.latent.space/subscribe
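The "selective" recurrence summarized in the Mamba abstract above can be illustrated with a toy scan in numpy. This is a deliberately simplified sketch under stated assumptions: real Mamba discretizes continuous-time SSM parameters and uses a hardware-aware parallel scan over many channels; here we just show the key idea that the decay and update terms are functions of the current input, so the model can propagate or forget state per token.

```python
import numpy as np

def selective_ssm(x, d_state=4, seed=0):
    """Toy input-dependent (selective) state-space recurrence for a
    single scalar input channel: h_t = a(x_t) * h_{t-1} + b(x_t),
    y_t = C . h_t, scanned left to right in linear time."""
    rng = np.random.default_rng(seed)
    W_a = rng.normal(scale=0.5, size=d_state)  # controls forgetting
    W_b = rng.normal(scale=0.5, size=d_state)  # controls writing
    C = rng.normal(size=d_state)               # state -> output readout
    h = np.zeros(d_state)
    ys = []
    for x_t in x:
        a_t = 1.0 / (1.0 + np.exp(-W_a * x_t))  # input-dependent decay
        b_t = W_b * x_t                          # input-dependent update
        h = a_t * h + b_t                        # selective recurrence
        ys.append(C @ h)
    return np.array(ys)

y = selective_ssm(np.array([1.0, 0.0, -1.0, 2.0]))
```

Because the scan touches each token once with constant-size state, cost grows linearly in sequence length, in contrast to attention's quadratic pairwise interactions.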
A week of dramatic turnarounds and tense negotiations is behind us. At a landmark summit, after Hungary's withdrawal, the European Union opened accession negotiations with Ukraine and Moldova. In Dubai, in overtime negotiations, the countries of the world adopted a climate agreement that does not deliver a phase-out but merely a transition away from fossil fuels. So who is the winner and who the loser, and what do the commitments actually mean for our planet, which has just been through the hottest period on record? The weather's consequences were felt in Slovenia as well. Recovery from the catastrophic floods should be accelerated by the reconstruction act adopted this week. Why the emergency healthcare act also raised so much dust, what the new draft media act brings, and who the winners of this year's Sakharov Prize for Freedom of Thought are, in a critical review of the week with Snežana Ilijaš.
Today, a word about Rok Snežič. About Rok Snežič, the greatest tax expert that the sacred Styrian soil has ever produced, journalists do not like to write. Because the moment he dislikes something that has been written, Rok Snežič sues the journalist, the newsroom, or the outlet. He sues so thickly that some media fall into serious trouble after Snežič's lawsuits. We have no lawsuits to fear, because our media house is already in serious trouble anyway. That is why journalists fear Snežič, especially since the day he declared himself Martin Krpan. A scribbler can somehow be dealt with; a mythical hero, hardly. But let us go in order, as the Snežičes go in Banja Luka. A few days ago there was a hearing at the Maribor court, because Snežič had sued a Bosnian journalist who had written about his unique technique of tax evasion. The trial ended quickly, since Snežič supposedly knows all the Maribor judges, either because he was their classmate or because he prepared their tax returns, so the Bosnian journalist demanded the judge's recusal. The two parties then briefly met in front of the courthouse doors, and Snežič heroically flung open his jacket: printed on the T-shirt underneath was a scene from Martin Krpan. And precisely the one in which the heroic man of Inner Carniola arrives before the emperor on his little mare, with a butcher's blade and a club on his shoulder, while a servant in the background carries Brdavs's severed head on a platter. Snežič then lectured the Bosnian journalist that Krpan had defended Slovenia and Austria-Hungary against the Muslims by cutting off their heads, and that he too was Martin Krpan, defender of Slovenia. And finally recommended that the journalist read some literature. The first problem we analyze in our modest literary-historical and tax-evasion programme is of a legal nature. Namely: Levstik's tale was given a graphically brilliant rendering by Tone Kralj. Tone Kralj is the painter who, under fascism in occupied Primorska, painted around fifty churches with anti-fascist imagery hidden in biblical motifs.
What we mean to say is that Tone Kralj did more to defend Slovenia with a single brushstroke than Rok Snežič will do in his entire pathetically patriotic life. And if we know even a little about the work, fate, and character of Tone Kralj, we sincerely doubt he would approve of his art being copied onto the heroic chests of Styrian peacocks. Finally: in law, of which Snežič holds a doctorate, there is a concept called copyright, and as far as we know, Tone Kralj still has legal successors. Now onward. It may seem self-evident that the modern Slovenian hero Rok Snežič chose Martin Krpan, of all figures, as his superheroic alter ego. A few other names on that shelf would seem suitable: Peter Klepec, General Maister and the like. But we reckon that with Krpan and Snežič it is not only a matter of superheroic attachment. The two men are also of one mind when it comes to dodging taxes: Snežič through consulting, Krpan through smuggling. Yet the most interesting part of the symbolism in Tone Kralj's scene, as Snežič interprets it, is the beheading of Muslims and thereby the defence of Slovenia and even Austria-Hungary, as Snežič snapped at the ignorant journalist. First, a small historical intervention. Although Levstik gives no date, other than that it was winter and snow lay about at the time of the meeting between the emperor and Krpan, from the other historical circumstances described we may justifiably conclude that the story is set not in the time of Austria-Hungary but more likely in the time of the Habsburg monarchy. We are being pedantic only because it is wise to learn the facts before sending someone off to read literature while, as a doctor of science, failing to distinguish basic historical concepts ... And now to the cutting off of heads, or decapitation, with which we in the 21st century are thoroughly acquainted above all through the actions of extremist Islamic terrorist groups. That it was the Slovenian national hero who started it is nothing to go around proclaiming, especially since he cut off Brdavs's head slowly; but let literary history be the judge of that.
Since then we Slovenians have fortunately renounced beheading, and since we assume that by the defence of Slovenia against Muslims Snežič means the refugee crisis, we publicly declare our support for razor wire, which still seems a milder punishment for migrants than beheading. For the sake of truth-seeking, we must next transcribe precisely the sentences the emperor speaks beneath Kralj's image, the one Rok Snežič copied onto his T-shirt. Thus speaks the Habsburg (and not Austro-Hungarian) monarch: "I shall give you what you wish, for you have overcome so great an enemy and delivered the land and the city from great affliction and misfortune. There is nothing in my empire of which I would say: I shall not give it to you if you want it; even Jerica, my only daughter, is yours for the asking, if you are not yet married." In short: the hero Snežič takes himself for is in fact a mercenary who is offered whatever he desires. The empress later waters down the offer, but Krpan's demands for his payment go in an unexpected direction: he asks for nothing but a letter and a seal with which to make his contraband, his illegal business, his tax dodging, legal. Minister Gregor looked sourly on the idea that an orderly empire should start handing tax evaders blank cheques for their mischief, but the emperor's will prevailed. Krpan became a legal smuggler, and when Rok Snežič says that he himself is the reincarnation of the Slovenian hero Martin Krpan, he could not be more right.
To believe is to bring a theory into existence in order to answer the inexplicable. In a society thought to be losing its spirituality, new mystical movements are appearing. Neo-shamanism, neo-druidism, neo-paganism, eco-spirituality... These movements, always tied to the world's new concerns such as feminism, ecology or degrowth, see themselves as progressive and have enjoyed genuine enthusiasm since the Covid crisis. In 2020, the French publishers' association Syndicat national de l'édition (SNE) recorded 13% growth in the sector of books devoted to esotericism. So why does society produce these dogmas, with their sometimes blurry boundaries? What do they say about our times? With: • Thierry Jobard, author of Je crois donc je suis (Rue de l'échiquier, 2023) • Damien Karbovnik, historian of religions, researcher at the ARCHE laboratory of the Université de Strasbourg and lecturer in the master's programme in spiritualities, religions, utopias and symbolic worlds at Université Paul Valéry-Montpellier 3 • Dodji Amouzouvi, sociologist specialising in religions, professor at the Université d'Abomey-Calavi in Benin. Musical programming: ► Danger – Onipa ► Magie – Sofiane Pamart and Yg Pablo.
Meta's Senior Research Director, Dr. Laurens van der Maaten, takes center stage to unravel the captivating realm of AI innovation. Learn about his groundbreaking contributions, including pioneering the t-SNE dimensionality reduction technique and harnessing AI for novel protein synthesis, climate change mitigation, and wearable materials simulation. Join us to explore the transformative power of AI across diverse domains and gain a glimpse into its future societal implications. This episode is brought to you by AWS Inferentia (https://go.aws/3zWS0au), by Modelbit (https://modelbit.com), for deploying models in seconds, and by Grafbase (https://grafbase.com), the unified data layer. Interested in sponsoring a SuperDataScience Podcast episode? Visit JonKrohn.com/podcast for sponsorship information. In this episode you will learn: • Large-scale learning of image recognition models on web data [05:05] • Evolutionary Scale Modeling protein models [16:45] • Fighting climate change by building an A.I. model [29:49] • The CrypTen privacy-preserving ML framework [38:36] • Concerns about adversarial examples [53:25] • Laurens' t-SNE algorithm [58:56] • How to make a big impact [1:07:25] Additional materials: www.superdatascience.com/709
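Since the episode centers on van der Maaten's t-SNE technique, here is a minimal numpy sketch of its two core ingredients: Gaussian conditional probabilities over pairwise distances in the high-dimensional space, and heavy-tailed Student-t similarities in the low-dimensional embedding. This is an illustration only, not the full algorithm — real t-SNE tunes a per-point bandwidth via a perplexity parameter and fits the embedding by gradient descent on the KL divergence between the two distributions; the fixed `sigma` here is a simplifying assumption.

```python
import numpy as np

def gaussian_affinities(X, sigma=1.0):
    """High-dimensional conditional probabilities p_{j|i}.
    Simplified: one fixed bandwidth instead of per-point sigmas."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    P = np.exp(-sq_dists / (2 * sigma ** 2))
    np.fill_diagonal(P, 0.0)  # a point is never its own neighbour
    return P / P.sum(axis=1, keepdims=True)  # each row sums to 1

def student_t_similarities(Y):
    """Low-dimensional similarities q_{ij}: the heavy-tailed Student-t
    kernel that lets dissimilar points sit far apart in the map."""
    sq_dists = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    Q = 1.0 / (1.0 + sq_dists)
    np.fill_diagonal(Q, 0.0)
    return Q / Q.sum()  # normalized over all pairs

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))   # 20 points in 5 dimensions
Y = rng.normal(size=(20, 2))   # a candidate 2-D embedding
P = gaussian_affinities(X)
Q = student_t_similarities(Y)
```

The optimizer would then move the rows of `Y` to make `Q` match `P`; the mismatch in tail weight between the two kernels is what produces t-SNE's characteristic well-separated clusters.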
Robert's guest in this episode is psychologist Snežana Mrvić, who works at the IMZ. In the conversation you will hear why a clinical psychologist is an essential member of the team caring for our mental health, what the battery of tests a psychologist administers consists of, and how film therapy affects us.
AI Today Podcast: Artificial Intelligence Insights, Experts, and Opinion
For a number of reasons, it can be important to reduce the number of variables or identified features in input training data, making machine learning models faster to train and more accurate. But what are the techniques for doing this? In this episode of the AI Today podcast, hosts Kathleen Walch and Ron Schmelzer define the terms Feature Reduction, Principal Component Analysis (PCA), and t-SNE, explain how they relate to AI, and discuss why it's important to know about them.
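Of the techniques the hosts define, PCA is the most compact to demonstrate: it projects the data onto the directions of greatest variance, reducing the feature count while keeping as much information as possible. A minimal numpy sketch (the function name `pca_reduce` and the random data are illustrative, not from the episode):

```python
import numpy as np

def pca_reduce(X, k):
    """Project X (n_samples x n_features) onto its top-k principal components."""
    Xc = X - X.mean(axis=0)  # center each feature at zero
    # SVD of the centered data: the rows of Vt are the principal
    # directions, ordered by decreasing explained variance
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T     # coordinates in the k-dimensional subspace

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))  # 100 samples, 10 features
Z = pca_reduce(X, 2)            # reduced to 2 features
```

In practice one would use a library implementation (e.g. scikit-learn's `PCA`), which also reports the variance explained by each component so you can choose `k` on principle rather than by guesswork.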
A longtime member of the choir "Sevdalinka", Mrs Snežana Barošević, talks about what it means to be in a choir that has existed for three decades, how they have managed since the premature passing of the choir's conductor and leader, my late colleague Amir Bukić, and about their plans and upcoming concerts.