In this episode of the Product Thinking podcast, host Melissa Perri is joined by Alex Wettrelos, Chief Product Officer at Sunday. Join them as they explore the challenges of rapid scaling, and then descaling, during the 2022 tech crisis. They discuss how Sunday, a fintech company innovating payment methods across the hospitality industry, went from an idea on a slide deck to 450 employees with $124 million in seed funding. They also touch on the differences between product management in Europe and the US.
Minister of Law Datuk Seri Azalina Othman has said that an online system that allows children to report crimes is in the pipeline for next year, so “those who feel ‘more afraid’ to make a report in person can do so online.” So, in this episode, we discuss the problems children of today face in the digital space, the need for a reporting system like this, and how this system should function.
Thousands of high schoolers will sit their NCEA level two English and level one History exams Wednesday morning - but it's not just students feeling the pressure. A glitch in the system for school exams last week left thousands unable to log in and complete their level one English exam online. It's a problem that's left educators feeling nervous just a couple of hours out from today's exams. Papatoetoe High School principal and president of the Secondary Principals' Association Vaughan Couillaut spoke to Corin Dann.
The online system for school exams faces a test of its own Wednesday morning when thousands of teenagers try to log in to the level two English exam. Teachers say it is critical there is no repeat of last week's failure, when the Qualifications Authority stopped perhaps a couple of thousand students from joining the level one exam because the system wasn't coping with the volume of log-ins. That glitch came just a week after a similar problem with an online literacy and numeracy test. Education correspondent John Gerritsen reports. NZQA has been approached for comment.
Selling internally. When does successful selling end? When the order has been landed? Or only later? Is it enough to fill out your "order form," leave the rest of the internal processing to others, and turn to the next customer? Or is this exactly the moment when top salespeople should skillfully and powerfully use their true influence to gain strategic advantages? What role does tactical internal negotiation play within a company? These are questions many salespeople ask themselves. What power does internal selling have? Wolfgang S. is a salesman. An extremely successful salesman. He lands one order after another. Wolfgang regularly returns from his field-sales trips with his chest puffed out in pride. He strides energetically to his office and turns on his computer. He enters all the relevant data for the newly won orders so that the inside sales team can take them over and process them. For Wolfgang, the order is then history. He set up the deal; others should handle the rest. Mentally, Wolfgang is already with the next customer. He never takes the time to chat or exchange ideas with colleagues. Wolfgang sees himself as a top salesman whose time is best invested out "on the sales front." What happens elsewhere in the company barely concerns him. "On the road again" is Wolfgang's personal anthem. His colleagues know the drill. Everyone expects Wolfgang to take home the "Most Successful Salesperson of the Year" award at the end of the fiscal year. The surprise is enormous when, during the year-end meeting, Alexander B. is named most successful salesperson of the year. Wolfgang S. is stunned. It can't be: this colleague, who apparently spends far less time "out there" with customers, beat him on total revenue? How can that be? Wolfgang's world falls apart.
What did he do wrong? Top salespeople must also sell internally - internal sales. The explanation is simple. The "high-flyer" Wolfgang, focused only on his customers, completely overlooked the fact that he also needs intensive support from inside the company. He neglected to build a working internal network to inside sales, to shipping, and to other key functions: precisely the people who make sure the ordered products reach his customers reliably and in high quality. Special delivery times and unusual requirements he merely entered into his online system. He neglected to talk about them on a human level, to build networks, and thereby to become effective internally with his projects and ideas. Nor does he otherwise keep himself informed about new products, their development, or internal processes. Alexander B. operates quite differently. He spends much more time in the company and talks with employees from various departments. He knows all the news. After every big order he personally informs his colleagues in inside sales, tells them about the negotiation, what was special about it, and what the customer wants, whether a particularly short delivery time or some other individual service. He lets his colleagues share in his joy over the deal and thus brings them on board emotionally. Naturally, the staff in order processing and dispatch then want to support him as well as they can, and make many a seemingly impossible thing possible. That is exactly what happened at the end of the year, when only a few orders could still be processed in time and the rest had to wait until January. Quite instinctively, the staff pushed Alexander's orders through first. They knew the story behind the business and were glad to support him.
And so Wolfgang's orders no longer counted toward that year, and he finished only second. It may sound somewhat unbelievable, but this is exactly what happens in many companies every day. Personal relationships and active "internal selling" are the keys to long-term success, especially in sales. The salesperson: an immensely important interface for internal sales.
The online booking system for Parks Victoria has crashed, leaving those who used the site concerned about a potential data breach.
With a new online system, the city of Essen now wants to prevent waiting times at the vehicle registration office. The sports clubs also want to step up their fight against sexualized violence in children's and youth programs. And: in Essen today, a young mother was convicted after nearly shaking her baby to death.
The Mogollon culture was an indigenous culture in the Western United States and Mexico that ranged from New Mexico and Arizona to Sonora, Mexico and out to Texas. They flourished from around 200 CE until the Spanish showed up and claimed their lands. The cultures that pre-existed them date back thousands more years, although archaeology has yet to pinpoint exactly how those evolved. Like many early cultures, they farmed and foraged. As they farmed more, their homes became more permanent, and around 800 CE they began to create more durable homes that helped protect them from wild swings in the climate. We call those homes adobes today, and the people who lived in those pueblos and irrigated their crops, often moving higher into the mountains, we call the Puebloans - or Pueblo Peoples. Adobe homes are similar to those found in ancient cultures in what we call Turkey today. It's an independent evolution. Adobe Creek was once called Arroyo de las Yeguas by the monks from Mission Santa Clara and then renamed to San Antonio Creek by a soldier, Juan Prado Mesa, when the land around it was given to him by the governor of Alta California at the time, Juan Bautista Alvarado. That's the same Alvarado as the street, if you live in the area. The creek runs for over 14 miles north from Black Mountain and through Palo Alto, California. The ranchers built their adobes close to the creeks. American settlers led the Bear Flag Revolt in 1846 and took over the garrison of Sonoma, establishing the California Republic - which covered much of the lands of the Puebloans. There were only 33 of them at first, but after John Fremont (yes, he after whom that street is named as well) encouraged the Americans, they raised an army of over 100 men and Fremont helped them march on Sutter's fort, now with the flag of the United States, thanks to Joseph Revere of the US Navy (yes, another street in San Francisco bears his name). James Polk had pushed to expand the United States. Manifest Destiny.
Remember the Alamo. Etc. The fort at Monterey fell, and the army marched south. Admiral Sloat got involved. They named a street after him. General Castro surrendered - he got a district named after him. Commodore Stockton announced the US had taken all of California soon after that. Manifest destiny was nearly complete. He's now basically the patron saint of a city, even if few there know who he was. The forts along the El Camino Real that linked the 21 Spanish Missions, a 600-mile road once walked by their proverbial father, Junípero Serra, following the Portolá expedition of 1769, fell. Stockton took each, moving into Los Angeles, then San Diego. Practically all of Alta California fell with few shots. This was nothing like the battles for the independence of Texas, like when Santa Anna reclaimed the Alamo Mission. Meanwhile, the waters of Adobe Creek continued to flow. The creek was renamed in the 1850s after Mesa built an adobe on the site. Adobe Creek it was. Over the next 100 years, the area evolved into a paradise with groves of trees and then groves of technology companies. The story of one begins a little beyond the borders of California. Utah was initially explored by Francisco Vázquez de Coronado in 1540 and settled by Europeans in search of furs, and by others who colonized the desert, including those who established the Church of Jesus Christ of Latter-day Saints, or the Mormons - who settled there in 1847, just after the Bear Flag Revolt. The United States officially acquired the territory in 1848, and Utah became a territory and, after a number of map changes where the territory got smaller, was finally made a state in 1896. The University of Utah had been founded all the way back in 1850, though - and was re-established in the 1860s. 100 years later, the University of Utah was a hotbed of engineers who pioneered a number of graphical advancements in computing. John Warnock went to grad school there and then went on to co-found Adobe and help bring us PostScript.
Historically, PS, or postscript, was a message placed at the end of a letter, following the signature of the author. The PostScript language was a language to describe a page of text computationally. It was created at Adobe by Warnock, Doug Brotz, Charles Geschke, Bill Paxton (who worked on the Mother of All Demos with Doug Engelbart during the development of the oNLine System, or NLS, in the late 1960s and then at Xerox PARC), and Ed Taft. Warnock invented the Warnock algorithm while working on his PhD and went to work at Evans & Sutherland with Ivan Sutherland, who effectively created the field of computer graphics. Geschke got his PhD at Carnegie Mellon in the early 1970s and then went off to Xerox PARC. They worked with Paxton at PARC, and before long these PhDs and mathematicians had worked out the algorithms, and then the languages, to display images on computers while working on Interpress graphics at Xerox. Geschke left Xerox and started Adobe. Warnock joined him, and they took their Interpress work to market as PostScript, which became a foundation for the Apple LaserWriter to print graphics. Not only that, PostScript could be used to define typefaces programmatically and, later, to display any old image. Those technologies became the foundation for the desktop publishing industry. Apple released the Mac in 1984, and other vendors brought in PostScript to describe graphics in their proprietary fashion; Adobe released PostScript Level 2 in 1991 and then PostScript 3 in 1997. Other vendors made their own or furthered standards in their own ways, and Adobe could have faded off into the history books of computing. But Adobe didn't create one product, they created an industry, and the company they created to support that young industry created more products in that mission. Steve Jobs tried to buy Adobe before that first Mac was released, for $5,000,000. But Warnock and Geschke had a vision for an industry in mind.
They had a lot of ideas, but development was fairly capital intensive, as were go-to-market strategies. So they went public on the NASDAQ in 1986. They expanded their PostScript distribution and sold it to companies like Texas Instruments for their laser printer, and to other companies who made IBM-compatible computers. They got up to $16 million in sales that year. Warnock's wife was a graphic designer. This is where we see a diversity of ideas help us think about more than math. He saw how she worked and could see a world where Ivan Sutherland's Sketchpad could be much more, given how far CPUs had come since the TX-0 days at MIT. So Adobe built and released Illustrator in 1987. By 1988 they broke even on sales, and it raked in $19 million in revenue. Sales were strong in the universities, but PostScript was still the hot product, selling to printer companies, typesetters, and other places where Adobe signed license agreements. At this point we see the math at work: Cartesian coordinates, drawn by geometric algorithms, put pixels where they should be. But while this was far more efficient than just drawing a dot at each coordinate for larger images, drawing a dot in a pixel location was still the easier technology to understand. They created Adobe Streamline in 1989 and Collector's Edition to create patterns. They listened to graphic designers and built what they heard humans wanted.

Photoshop

Nearly every graphic designer raves about Adobe Photoshop. That's because Photoshop is the best-selling graphics editing tool, one that has matured far beyond most other traditional solutions and now has thousands of features that allow users to manipulate images in practically any way they want. Adobe Illustrator was created in 1987 and quickly became the de facto standard in vector-based graphics. Photoshop began life in 1987 as well, when Thomas and John Knoll wanted to build a simpler tool to create graphics on a computer. Rather than vector graphics, they created a raster graphical editor.
They made a deal with Barneyscan, a well-known scanner company, that distributed over two hundred copies of Photoshop with their scanners, and Photoshop became a hit as it was the first editing software many people had heard about. Vector images are typically generated with Cartesian coordinates based on geometric formulas and so scale more easily. Raster images are comprised of a grid of dots, or pixels, and can be more realistic. Great products are rewarded with competition. CorelDRAW was created in 1989, when Michel Bouillon and Pat Beirne built a tool to create vector illustrations. Sales got slim after other competitors entered the market, and the Knoll brothers got in touch with Adobe and licensed the product through them. The software was then launched as Adobe Photoshop 1 in 1990. They released Photoshop 2 in 1991. By now they had support for paths and, given that Adobe also made Illustrator, EPS and CMYK rasterization, still features in Photoshop. They launched Adobe Photoshop 2.5 in 1993, the first version that could be installed on Windows. This version came with a toolbar for filters and 16-bit channel support. Photoshop 3 came in 1994, and Thomas Knoll created what was probably the most important feature ever added, one that's become a standard in graphical applications since: layers. Now a designer could create a few layers that each had their own elements, and hide layers or make layers more transparent. These could separate the subject from the background and led to entire new capabilities, like an almost faux three-dimensional appearance of graphics. Then came version 4 in 1996, one of the more widely distributed versions and very stable.
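The layer idea described above boils down to compositing: each layer's pixels are blended onto whatever is below, weighted by transparency. A minimal sketch in Python, assuming simple RGBA tuples and the standard "over" operator (all names here are illustrative, not Photoshop's internals):

```python
# Minimal sketch of layer compositing: each layer is one (R, G, B, A)
# pixel with channels in 0.0-1.0. Layers later in the list sit on top.

def over(top, bottom):
    """Porter-Duff 'over': composite one RGBA pixel onto another."""
    tr, tg, tb, ta = top
    br, bg, bb, ba = bottom
    a = ta + ba * (1 - ta)          # resulting alpha
    if a == 0:
        return (0.0, 0.0, 0.0, 0.0)
    blend = lambda t, b: (t * ta + b * ba * (1 - ta)) / a
    return (blend(tr, br), blend(tg, bg), blend(tb, bb), a)

def flatten(layers):
    """Composite a stack of layers, bottom first, into one pixel."""
    result = (0.0, 0.0, 0.0, 0.0)   # start fully transparent
    for layer in layers:
        result = over(layer, result)
    return result

# A red background with a half-transparent white layer on top:
background = (1.0, 0.0, 0.0, 1.0)
overlay = (1.0, 1.0, 1.0, 0.5)
print(flatten([background, overlay]))  # pinkish: (1.0, 0.5, 0.5, 1.0)
```

Hiding a layer is just leaving it out of the stack; making it "more transparent" is scaling its alpha before flattening.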
They added automation, and this was later considered part of becoming a platform - open up a scripting language or a subset of a language so others can build tools that integrate with or sit on top of a product, thus locking people into the product once they've automated tasks to increase their efficiency. Adobe Photoshop 5.0 added editable type; before that, text was rasterized as soon as it was placed. Keep in mind that Adobe owned technology like PostScript and so could bring technology from Illustrator to Photoshop or vice versa, and integrate with other products - like exporting to PDF by then. They also added a number of undo options, a magnetic lasso, and improved color management, and it was now a great tool for more advanced designers. Then in 5.5 they added a Save for Web feature, in a sign of the times. Users could create vector shapes, and the user interface continued to improve. Photoshop 5 was also a big jump in complexity. Layers were easy enough to understand, but Photoshop was meant to be a subset of Illustrator's features and had become far more than that. So in 2001 they released Photoshop Elements. By now they had a large portfolio of products, and Elements was meant to appeal to the original customer base - the ones who were beginners and maybe not professional designers. By now, some people spent 40 or more hours a week in tools like Photoshop and Illustrator.

Adobe Today

Adobe had released PostScript, Illustrator, and Photoshop. But they have one of the most substantial portfolios of products of any company. They also released Premiere in 1991 to get into video editing. They acquired Aldus Corporation to get into more publishing workflows with PageMaker. They used that acquisition to get into motion graphics with After Effects. They acquired dozens of companies and released their products as well. Adobe also released the PDF format, to describe full pages of information (or files that spread across multiple pages), in 1993, and Adobe Acrobat to use those.
Acrobat became the de facto standard for page distribution so people didn't have to download fonts to render pages properly. They dabbled in audio editing when they acquired Cool Edit Pro from Syntrillium Software, and so now sell Adobe Audition. Adobe's biggest acquisition was Macromedia in 2005. Here, they added a dozen new products to the portfolio, which included Flash, Fireworks, the WYSIWYG web editor Dreamweaver, ColdFusion, Flex, and Breeze, which is now called Adobe Connect. By now, they'd also created what we call Creative Suite: packages of applications that could be used for given tasks. Creative Suite also signaled a transition into a software-as-a-service, or SaaS, mindset. Now customers could pay a monthly fee for a user license rather than buy large software packages each time a new version was released. Adobe had always been a company that made products to create graphics. They expanded into online marketing and web analytics when they bought Omniture in 2009 for $1.8 billion. These products are now normalized into the naming convention used for the rest, as Adobe Marketing Cloud. Flash fell by the wayside, and so the next wave of acquisitions was for more mobile-oriented products. This began with Day Software and then Nitobi in 2011. And they furthered their Marketing Cloud support by acquiring one of the larger competitors, Marketo, in 2018, and then Workfront in 2020. Given how many people started working from home, they also extended their offerings into pure-cloud video tooling with the acquisition of Frame.io in 2021. And here we see a company started by a bunch of true computer scientists from academia in the early days of the personal computer that has become far more. They could have been rolled into Apple, but had a vision of a creative suite of products that could be used to make the world a prettier place. Creative Suite, and then Creative Cloud, shows a move of the same tools into a more online delivery model.
Other companies come along to do similar tasks, like the infinite digital whiteboard Miro - so they have to innovate to stay marketable. They have to continue to increase sales, so they expand into other markets, like the most adjacent, Marketing Cloud. At 22,500+ employees and with well over $12 billion in revenue, they have a lot of families dependent on maintaining that growth rate. And so the company becomes more than the culmination of their software. They become more than graphic design, web design, video editing, animation, and visual effects. Because in software, if revenues don't grow at a rate greater than 10 percent per year, the company simply isn't outgrowing the size of the market and likely won't be able to justify stock prices at an inflated price-to-earnings ratio that implies explosive growth. And yet once a company saturates sales in a given market, they have shareholders to justify their existence to. Adobe has survived many an economic downturn and boom time with smart, measured growth and is likely to continue doing so for a long time to come.
Gutenberg shipped the first working printing press around 1450, and the typeface was born. Before then most books were handwritten, often in blackletter calligraphy. And they were expensive. The next few decades saw Nicolas Jensen develop the Roman typeface, and Aldus Manutius and Francesco Griffo create the first italic typeface. This represented a period where people were experimenting with making type that would save space. The 1700s saw the start of a focus on readability. William Caslon created the Old Style typeface in 1734. John Baskerville developed Transitional typefaces in 1757. And Firmin Didot and Giambattista Bodoni created two typefaces that would become the Modern family of serifs. Then slab serif, which we now call Antique, came in 1815, ushering in an era of experimenting with using type for larger formats, suitable for advertisements in various printed materials. These were necessary as more presses were printing more books, and made possible by new levels of precision in metal casting. People started experimenting with various forms of typewriters in the mid-1860s, and by the 1920s we got Frederic Goudy, the first real full-time type designer. Before him, it was part of a job. After him, it was a job. And we still use some of the typefaces he crafted, like Copperplate Gothic. And we saw an explosion of new fonts, like Times New Roman in 1931. At the time, most typewriters used typefaces on the end of a metal shaft. Hit a key, the shaft hammers onto a strip of ink and leaves a letter on the page. Kerning, or the space between characters, and letter placement were often there to reduce the chance that those metal hammers jammed. And replacing a font would have meant replacing tons of precision parts. Then came the IBM Selectric typewriter in 1961. Here we saw precision parts that put all those letters on a ball. Hit a key, the ball rotates and presses the ink onto the paper. And the ball could be replaced.
A single document could now have multiple fonts without a ton of work. Xerox had exploded onto the scene with the Xerox 914, one of the most successful products of all time. Now we could type amazing documents with multiple fonts in the same document quickly - and photocopy them. And some of the numbers on those fancy documents were being spat out by those fancy computers, with their tubes. But as computers became transistorized heading into the 60s, it was only a matter of time before we put fonts on computer screens. Here, we initially used bitmaps to render letters onto a screen. By bitmap we mean that an array of pixels on a screen is a map of bits, recording where each should be displayed. We used to call these raster fonts, but the drawback was that to make characters bigger, we needed a whole new map of bits. To go to a bigger screen, we probably needed a whole new map of bits. As people thought about things like bold, underline, and italics, guess what - also a new file. But through the 50s and 60s, transistor counts weren't nearly high enough to do something different from bitmaps; they rendered very quickly, and, you know, displays weren't very high quality, so who could tell the difference anyway? Whirlwind was the first computer to project real-time graphics on the screen, and its characters were simple blocky letters. But as the resolution of screens and the speed of interactivity increased, so did what was possible with drawing glyphs on screens. Rudolf Hell was a German engineer experimenting with using cathode ray tubes to project a CRT image onto photosensitive paper, and thus print using a CRT. He designed a simple font called Digital Grotesk in 1968. It looked good on the CRT and on paper. That font, loosely based on Neuzeit Book, would go on to be used to digitize typesetting. And we quickly realized bitmaps weren't efficient for drawing fonts to screen, and by 1974 moved to outline, or vector, fonts.
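A bitmap font really is just a map of bits, and that's easy to see in code. A minimal sketch in Python, with the glyph data invented for illustration (this is not any real font's data):

```python
# A bitmap glyph is literally an array of bits: one hex byte per row,
# each bit saying whether that pixel is on. This 8x8 'A' shape is
# made up for illustration, not taken from any real font.
GLYPH_A = [0x18, 0x24, 0x42, 0x42, 0x7E, 0x42, 0x42, 0x00]

def render(rows, on="#", off="."):
    """Expand hex rows into lines of characters, MSB = leftmost pixel."""
    lines = []
    for row in rows:
        bits = format(row, "08b")              # e.g. 0x18 -> '00011000'
        lines.append("".join(on if b == "1" else off for b in bits))
    return lines

for line in render(GLYPH_A):
    print(line)
```

The drawback described above falls out directly: a bigger size, a bolder weight, or an italic slant each needs an entirely new table of rows; nothing here can be scaled without redrawing.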
Here a Bézier curve was drawn onto the screen using an algorithm that created the character, or glyph, using an outline and then filling in the space between. These took up less memory and so drew on the screen faster. Those could be defined in an operating system, and were used not only to draw characters but also by some game designers to draw entire screens of information, by defining a character as a block and so taking up less memory to do graphics. These were scalable, and by 1979 another German, Peter Karow, used spline algorithms to write Ikarus, software that allowed a person to draw a shape on a screen and rasterize it. Now we could graphically create fonts that were scalable. In the meantime, the team at Xerox PARC had been experimenting with different ways to send pages of content to the first laser printers. Bob Sproull and Bill Newman created the Press format for the Star. But this wasn't incredibly flexible like what Karow would create. John Gaffney, who was working with Ivan Sutherland at Evans & Sutherland, had been working with John Warnock on an interpreter that could pull information from a database of graphics. When he went to Xerox, he teamed up with Martin Newell to create JaM, which harnessed the latest chips to process graphics and character type onto printers. As it progressed, they renamed it to Interpress. Chuck Geschke started the Imaging Sciences Laboratory at Xerox PARC and eventually left Xerox with Warnock to start a company called Adobe in Warnock's garage, named after the creek behind his house. Bill Paxton had worked on “The Mother of All Demos” with Doug Engelbart at Stanford, where he got his PhD, and then moved to Xerox PARC. There he worked on bitmap displays, laser printers, and GUIs - and so he joined Adobe as a co-founder in 1983 and worked on the font algorithms and helped ship a page description language, along with Chuck Geschke, Doug Brotz, and Ed Taft. Steve Jobs tried to buy Adobe in 1982 for $5 million.
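The Bézier outlines mentioned above can be sketched in a few lines. This toy example, assuming quadratic curves (the TrueType-style building block; cubics work the same way with one more control point), evaluates points along a curve and flattens it into line segments the way a rasterizer might:

```python
# Evaluating a quadratic Bézier curve, the building block of outline
# fonts: a glyph is a closed path of such segments, later filled in.

def quad_bezier(p0, p1, p2, t):
    """Point at parameter t in [0, 1] on the curve
    B(t) = (1-t)^2*p0 + 2(1-t)t*p1 + t^2*p2."""
    x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
    y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
    return (x, y)

def flatten_curve(p0, p1, p2, steps=8):
    """Approximate the curve with straight segments for rasterizing."""
    return [quad_bezier(p0, p1, p2, i / steps) for i in range(steps + 1)]

# Endpoints lie on the curve; the control point pulls it toward itself:
pts = flatten_curve((0, 0), (50, 100), (100, 0))
print(pts[0], pts[-1])   # (0.0, 0.0) (100.0, 0.0)
```

Because the glyph is stored as control points rather than pixels, scaling it is just multiplying coordinates, which is exactly why outline fonts beat bitmaps.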
But instead they sold him just shy of 20% of the company and gave him a five-year license for PostScript. This allowed them to focus on making the PostScript language more extensible, and on creating the Type 1 fonts. These had two parts: one was a set of bitmaps, and the other was a font file that could be used to send the font to a device. We see this time and time again: the simpler an interface and the more down-market the science gets, the faster we see innovative industries come out of the work done. There were lots of fonts by now. The original 1984 Mac saw Susan Kare work with Jobs and others to ship a bunch of fonts named after cities, like Chicago and San Francisco. She would design the fonts on paper and then conjure up the hex (that's hexadecimal) for the graphics and fonts, manually typing the hexadecimal notation for each letter of each font. Previously, custom fonts were reserved for high-end marketing and industrial designers. Apple considered licensing existing fonts but decided to go their own route. She painstakingly created new fonts and gave them the names of train stops around Philadelphia, where she grew up. Steve Jobs went for the city approach but insisted they be cool cities. And so the Chicago, Monaco, New York, Cairo, Toronto, Venice, Geneva, and Los Angeles fonts were born - with her personally developing Geneva, Chicago, and Cairo. And she did it in 9 x 7. I can still remember the magic of sitting down at a computer with a graphical interface for the first time. I remember opening MacPaint and changing between the fonts, marveling at the typefaces. I'd certainly seen different fonts in books. But never had I made a document and been able to set my own typeface! Not only that, they could be in italics, outline, and bold. Those were all her. And she inspired a whole generation of innovation.
Here we see a clean line from Ivan Sutherland and the pioneering work done at MIT, to the University of Utah, to Stanford through the oNLine System (or NLS), to Xerox PARC, and then to Apple. But then came the rise of Windows and other graphical operating systems. As Apple's five-year license for PostScript came and went, they started developing their own font standard as a competitor to Adobe's, which they called TrueType. Here we saw Times Roman, Courier, and symbol fonts that could replace the PostScript fonts, along with updates to Geneva, Monaco, and others. They may not have gotten along with Microsoft, but they licensed TrueType to them nonetheless to make sure it was more widely adopted. And in exchange they got a license for TrueImage, a page description language that was compatible with PostScript. Given how high-resolution screens had gotten, it was time for the birth of anti-aliasing. Here we could clean up the blocky “jaggies,” as the gamers call them. Vertical and horizontal lines in the 8-bit era looked fine but distorted at higher resolutions, and so spatial anti-aliasing, and then post-processing anti-aliasing, were born. By the 90s, Adobe was looking for the answer to TrueImage. So 1993 brought us PDF, now an international standard in ISO 32000-1:2008. Acrobat Reader and other tools were good to Adobe for many years, along with Illustrator and then Photoshop and then the other products in the Adobe portfolio. By this time, even though Steve Jobs was gone, Apple was hard at work on new font technology that resulted in Apple Advanced Typography, or AAT. AAT gave us ligature control, better kerning, and the ability to write characters on different axes. But negotiations between Apple and Microsoft to license AAT broke down. They were bitter competitors, and Windows 95 wasn't even out yet. So Microsoft started work on OpenType, their own font standard, in 1994, and Adobe joined the project to ship the next generation in 1997.
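Spatial anti-aliasing, as mentioned above, can be sketched as supersampling: test several sub-pixel sample points per pixel and use the fraction covered as a gray level instead of a hard on/off decision. A toy Python example (the shape and sample counts are illustrative):

```python
# Toy spatial anti-aliasing by supersampling: estimate how much of a
# shape covers each pixel by testing a grid of sub-samples, then use
# that coverage fraction as the pixel's gray level instead of pure 0/1.

def coverage(inside, px, py, samples=4):
    """Fraction of sub-sample points of pixel (px, py) inside the shape."""
    hits = 0
    for i in range(samples):
        for j in range(samples):
            x = px + (i + 0.5) / samples   # sub-pixel sample position
            y = py + (j + 0.5) / samples
            if inside(x, y):
                hits += 1
    return hits / (samples * samples)

# A diagonal half-plane edge, y < x: the pixel straddling the edge gets
# a gray value instead of the hard 'jaggies' of one sample per pixel.
edge = lambda x, y: y < x
row = [coverage(edge, px, 0) for px in range(-2, 3)]
print(row)   # [0.0, 0.0, 0.375, 1.0, 1.0]
```

One sample per pixel would have made that middle value 0 or 1; the intermediate gray is exactly what smooths the staircase on a diagonal.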
And that would evolve into an open standard by the mid-2000s. And once a standard is open, it sometimes becomes the de facto standard, as opposed to those that need to be licensed. By then the web had become a thing. Early browsers, and the wars between them to add features, meant developers had to build and test on potentially 4 or 5 different computers and often be frustrated by the results. So the W3C began standardizing how a lot of elements worked, in Extensible Markup Language, or XML. Images, layouts, colors, even fonts. SVGs are XML-based vector images; in other words, the browser interprets a language that describes the image. Fonts got a web format of their own: the Web Open Font Format, or WOFF 1, was published in 2009 with contributions by Dutch educator Erik van Blokland, Jonathan Kew, and Tal Leming. This built on the CSS font styling rules that had shipped in Internet Explorer 4 and would slowly be added to every browser shipped, including Firefox since 3.6, Chrome since 6.0, Internet Explorer since 9, and Apple's Safari since 5.1. Then WOFF 2 added Brotli compression to get sizes down and render faster. WOFF has been part of the W3C open web standard since 2011. Out of Apple's TrueType came TrueType GX, which added variable fonts. Here, a single font file could contain a number or range of variants of the initial font. So a family of fonts could be in a single file. OpenType added variable fonts in 2016, with Apple, Microsoft, and Google all announcing support. And of course the company that had been there since the beginning, Adobe, jumped on board as well. Fewer font files, faster page loads. So here we've looked at the progression of fonts from the printing press, becoming more efficient to conserve paper, through the advent of the electric typewriter, to the early bitmap fonts for screens, to the vectorization led by Adobe into the Mac and then Windows.
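The variable-font idea described above, one file containing a whole range of variants, comes down to interpolating outline points between master designs. A simplified sketch (real variable fonts store per-axis delta sets rather than whole masters, and these stem coordinates are invented for illustration):

```python
# Simplified variable-font idea: store two 'master' outlines (say, the
# lightest and heaviest weight) and interpolate point by point for any
# weight in between, instead of shipping a separate file per weight.

def interpolate_outline(light, heavy, t):
    """Blend two outlines; t=0.0 gives light, t=1.0 gives heavy."""
    return [
        ((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
        for (x0, y0), (x1, y1) in zip(light, heavy)
    ]

# Invented vertical-stem outlines: the heavy master's stem is wider.
light_stem = [(10, 0), (12, 0), (12, 100), (10, 100)]
heavy_stem = [(6, 0), (16, 0), (16, 100), (6, 100)]

medium = interpolate_outline(light_stem, heavy_stem, 0.5)
print(medium[0], medium[1])   # (8.0, 0.0) (14.0, 0.0)
```

The payoff is exactly the one the episode names: one file, any weight on the axis, so fewer font files and faster page loads.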
We also see rethinking the font entirely so multiple scripts and character sets and axes can be represented and rendered efficiently. I am now converting all my user names into pig Latin for maximum security. Luckily those are character sets that are pretty widely supported. And since OpenType-SVG supports color, I can even add spiffy color to my pig Latin glyphs. It makes us wonder what's next for fonts. Maybe being able to design our own, or more to the point, customize those developed by others to make them our own. We didn't touch on emoji yet. But we'll just have to save the evolution of character sets and emoji for another day. In the meantime, let's think on the fact that fonts are such a big deal because Steve Jobs took a calligraphy class from a Trappist monk named Robert Palladino while enrolled at Reed College. Today we can painstakingly choose just the right font with just the right meaning because Palladino left the monastic life to marry and have a son. He taught Jobs about serifs and sans serifs and kerning and the art of typography. That style and attention to detail was one aspect of the original Mac that taught the world that computers could have style and grace as well. It's not hard to imagine a world where entire computers still supported only one font, or even one font per document. Palladino never owned or used a computer, though; his influence can be felt through his pupil Jobs. And it's actually amazing how many people who had such dramatic impacts on computing never really used one, because so many smaller evolutions came after them. What evolutions do we see on the horizon today? And how many who put a snippet of code on a service like GitHub may never know the impact they have on so many?
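For what it's worth, the pig Latin quip above is easy enough to automate. A toy Python sketch — and, to be clear, a joke, not security advice:

```python
def pig_latin(word):
    """Convert a lowercase word to pig Latin: a word starting with a
    vowel gets "way" appended; otherwise the leading consonant
    cluster moves to the end, followed by "ay"."""
    vowels = "aeiou"
    if word[0] in vowels:
        return word + "way"
    for i, letter in enumerate(word):
        if letter in vowels:
            return word[i:] + word[:i] + "ay"
    return word + "ay"  # no vowels at all

print(pig_latin("steve"))  # evestay
print(pig_latin("apple"))  # appleway
```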
(3/8/23) - In today's Federal Newscast: GAO promises more details about TSP's new website. Lawmakers look to ban agency use of Biometric Technology. And the Army's new advertising slogan is an old one. Learn more about your ad choices. Visit megaphone.fm/adchoices
AP correspondent Norman Hall reports: Asylum Ban-Exemptions
Guest: Panyaza Lesufi. See omnystudio.com/listener for privacy information.
The Federal Retirement Thrift Investment Board rolled out a major update for the Thrift Savings Plan on June 1. That includes a new interface for the online “My Account” website, a mobile app and new investment options for mutual funds. But some TSP participants are expressing their frustrations with technical issues, bugs in the system and much more. Joining the Federal Drive with the details, Federal News Network's Drew Friedman.
Guest: Martin Mafojane, Chief Master of the High Court. See omnystudio.com/listener for privacy information.
Java, Ruby, PHP, Go. These are languages behind web applications that dynamically generate content, which is then interpreted by a web browser. That content is rarely static these days, and the power of the web is that an app or browser can reach out and obtain some data, get back some XML or JSON or YAML, and provide an experience to a computer, mobile device, or even embedded system. The web is arguably the most powerful, transformational technology in history. But the story of the web begins in philosophies that far predate its inception. It goes back to a file, which we can think of as a document, on a computer that another computer reaches out to and interprets. A file composed of hypertext. Ted Nelson coined the term hypertext. Plenty of others put the concepts of linking objects into the mainstream of computing. But he coined the term that he's barely connected to in the minds of many. Why is that? Tim Berners-Lee invented the World Wide Web in 1989. Elizabeth Feinler developed a registry of names that would evolve into DNS, so we could find computers online and access web sites without typing in impossible-to-remember numbers. Vint Cerf and Bob Kahn were instrumental in the Internet Protocol, which allowed all those computers to be connected together, providing the schemes for those numbers and building on Leonard Kleinrock's work on packet switching. Some will know these names; most will not. But a name that probably doesn't come up enough is Ted Nelson. His tale is one of brilliance in the early days of computing and the spread of BASIC and an urge to do more. It's a tale of the hacker ethic. And yet, it's also a tale of irreverence - to be used as a warning for those with aspirations to be remembered for something great. Or is it? Steve Jobs famously said “real artists ship.” Ted Nelson did ship. Until he didn't. Let's go all the way back to 1960, when he started Project Xanadu. Actually, let's go a little further back first.
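That exchange — an app reaching out, getting back some JSON, and turning it into an experience — takes only a few lines in most languages. A minimal Python sketch, with a hard-coded response body standing in for a real HTTP call (the episode data is invented for illustration):

```python
import json

# In a real app this string would come back from an HTTP GET
# (urllib.request, requests, etc.). Here it is hard-coded.
body = '{"episodes": [{"title": "The Evolution of Fonts", "minutes": 18}]}'

data = json.loads(body)  # text on the wire -> native Python objects
for episode in data["episodes"]:
    print(f'{episode["title"]} ({episode["minutes"]} min)')
```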
Nelson was born to TV director Ralph Nelson and Celeste Holm, who won an Academy Award for her role in Gentleman's Agreement in 1947, took home another pair of nominations over her career, and was the original Ado Annie in Oklahoma!. His dad worked on The Twilight Zone - so of course he majored in philosophy at Swarthmore College and then went off to the University of Chicago and then Harvard for graduate school, taking a stab at film after he graduated. But he was meant for an industry that didn't exist yet but would some day eclipse the film industry: software. While in school he got exposed to computers and started to think about this idea of a repository of all the world's knowledge. And it's easy to imagine a group of computing aficionados sitting in a drum circle, smoking whatever they were smoking, and having their minds blown by that very concept. And yet, it's hard to imagine anyone in that context doing much more. And yet he did. Nelson created Project Xanadu in 1960. As we'll cover, he did a lot of projects during the remainder of his career. The journey is what is so important, even if we never get to the destination, because sometimes we influence the people who get there. And the history of technology is as much about failed or incomplete evolutions as it is about those that become ubiquitous. It began with a project while he was enrolled in Harvard grad school. Other word processors were at the dawn of their existence. But he began thinking through and influencing how they would handle information storage and retrieval. Xanadu was supposed to be a computer network that connected humans to one another. It was supposed to be simple and a scheme for world-wide electronic publishing. Unlike the web, which would come nearly three decades later, it was supposed to be bidirectional, with broken links self-repairing, much as routes on the ARPAnet did. His initial proposal was a program in machine language that could store and display documents.
Being before the advent of Markdown, ePub, XML, PDF, RTF, or any of the other common open formats we use today, it was rudimentary and would evolve over time. Keep in mind, it was for documents, and as Nelson would say later, the web - which began as a document tool - was a fork of the project. The term Xanadu was borrowed from Samuel Taylor Coleridge's Kubla Khan, itself written after some opium-fueled dreams about a garden in Kublai Khan's Shangdu, or Xanadu. Coleridge explained that the rivers in the poem supply “a natural connection to the parts and unity to the whole,” describing a “stream, traced from its source in the hills among the yellow-red moss and conical glass-shaped tufts of bent, to the first break or fall, where its drops become audible, and it begins to form a channel.” Connecting all the things was the goal, and so Xanadu was the name. He gave a talk and presented a paper called “A File Structure for the Complex, the Changing and the Indeterminate” at the Association for Computing Machinery in 1965 that laid out his vision. This was the dawn of interactivity in computing. Digital Equipment had launched just a few years earlier and brought the PDP-8 to market that same year. The smell of change was in the air and Nelson was right there. After that, he started to see all these developments around the world. He worked on a project at Brown University to develop a word processor with many of his ideas in it. But the output of that project, as with most word processors since, was to get things printed. He believed content was meant to be created and live its entire lifecycle in digital form. This would provide perfect forward and reverse citations, text enrichment, and change management. And maybe, if we all stand on the shoulders of giants, it would allow us to avoid rewriting or paraphrasing the works of others to include them in our own writings. We could do more without that tedious regurgitation.
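Nelson's alternative to quoting by copy — including a live span of another document by reference, an idea he would come to call transclusion — can be sketched in a few lines. This is a toy illustration, not Nelson's actual design; the document store and the `transclude` helper are invented for the example:

```python
# A toy transclusion store: documents embed spans of other documents
# by reference rather than by copy, so an edit to the source is
# reflected everywhere the span is read.
documents = {
    "kubla-khan": "In Xanadu did Kubla Khan a stately pleasure-dome decree",
}

def transclude(doc_id, start, end):
    """Resolve a referenced span at read time (a toy 'Xanalink')."""
    return documents[doc_id][start:end]

essay = "Coleridge opens: " + transclude("kubla-khan", 0, 24)
print(essay)  # Coleridge opens: In Xanadu did Kubla Khan
```

Because nothing is ever copied, re-reading the essay after the source document changes would show the updated text — the property the web's one-way, copy-prone links gave up.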
He furthered his counter-culture credentials by going to Woodstock in 1969. Probably not for that reason, but it happened nonetheless. And he traveled and worked with more and more people and companies, learning and engaging and enriching his ideas. And then he shared them. Computer Lib/Dream Machines was a paperback book. Or two. It had a cover on each side. Originally published in 1974, it was one of the most important texts of the computer revolution. Steven Levy called it an epic. It's rare to find it for less than a hundred bucks on eBay at this point because of how influential it was and what an amazing snapshot in time it represents. Xanadu was to be a hypertext publishing system in the form of Xanadocs, or files that could be linked to from other files. A Xanadoc used Xanalinks to embed content from other documents into a given document. These spans of text were called transclusions, and they changed in every document that included them when they changed in the source document. The iterations towards working code were slow and the years ticked by. That talk in 1965 gave way to the 1970s, then the 80s. Some thought him brilliant. Others didn't know what to make of it all. But many knew of his ideas for hypertext, and once the idea was known, it seemed inevitable. Byte Magazine published many of his thoughts in a 1988 piece called “Managing Immense Storage,” and by then the personal computer revolution had come in full force. Tim Berners-Lee put the first node of the World Wide Web online the next year, using a protocol he called Hypertext Transfer Protocol, or http. Yes, the hypertext philosophy was almost a means of paying homage to the hard work and deep thinking Nelson had put in over the decades. But not everyone saw it as though Nelson had made great contributions to computing. “The Curse of Xanadu” was an article published in Wired Magazine in 1995.
In the article, the author points out that the web had come along using many of the ideas Nelson and his teams had worked on over the years but had actually shipped - whereas Nelson hadn't. Once shipped, the web rose in popularity, becoming the ubiquitous technology it is today. The article looked at Xanadu as vaporware. But there is a deeper, much more important meaning to Xanadu in the history of computing. Perhaps inspired by the Wired article, the group released an incomplete version of Xanadu in 1998 (https://www.youtube.com/watch?v=72M5kcnAL-4). But by then, other formats - including PDF, which was invented in 1993, and .doc for Microsoft Word - were the primary mechanisms by which we stored documents, and first gopher and then the web were spreading to interconnect humans with content. The Xanadu story isn't a tragedy. Would we have had hypertext as a part of Douglas Engelbart's oNLine System without it? Would we have object-oriented programming or later the World Wide Web without it? The very word hypertext is almost an homage, even if its users don't know it, to Nelson's work. And the look and feel of his work lives on in places like GitHub, whether directly influenced or not, where we can see changes in code side-by-side with actual production code, changes that are stored and can be rolled back forever. Larry Tesler coined the term Cut and Paste. While Nelson calls him a friend in Werner Herzog's Lo and Behold, Reveries of the Connected World, he also points out that Tesler's term is flawed. And I think this is where we as technologists have to sometimes trim down our expectations of how fast evolutions occur. We take tiny steps because as humans we can't keep pace with the rapid rate of technological change. We can look back and see a two-steps-forward, one-step-back approach since the dawn of written history. Nelson still doesn't think the metaphors that harken back to paper have any place in the online written word.
Here's another important trend in the history of computing. As we've transitioned to more and more content living online exclusively, the content has become diluted. One publisher I wrote online pieces for asked that they all be +/- 700 words, that paragraphs be no more than 4 sentences long (preferably 3), and that the sentences be written at about a 5th or 6th grade level. Maybe Nelson would claim that this de-evolution of writing is due to search engine optimization gamifying the entirety of human knowledge, and that a tool like Xanadu would have been the fix. After all, if we could borrow the great works of others we wouldn't have to paraphrase them. But I think, as with most things, it's much more nuanced than that. Our always-online, always-connected brains can only accept smaller snippets. So that's what we gravitate towards. Actually, we have plenty of capacity for whatever we actually choose to immerse ourselves in. But we have more options than ever before, and of course we immerse ourselves in video games or other less literary pursuits. Or are they more literary? Some generations thought books to be dangerous. As do all oppressors. So who am I to judge where people choose to acquire knowledge or what kind they indulge themselves in? Knowledge is power and I'm just happy they have it. And they have it in part because others were willing to water down the concepts to ship a product. Because the history of technology is about evolutions, not revolutions. And those often take generations. And Nelson is responsible for some of the evolutions that brought us the ht in http and html. And for that we are truly grateful! As with the great journey from Lord of the Rings, rarely is greatness found alone.
The Xanadu adventuring party included Cal Daniels, Roger Gregory, Mark Miller, Stuart Greene, Dean Tribble, and Ravi Pandya. The project became a part of Autodesk in the 80s, got rewritten in Smalltalk, and was considered a rival to the web, but really it was more of an evolutionary step on that journey. If anything, it's a divergence from and then convergence back to Vannevar Bush's Memex. So let me ask this as a parting thought: are the places where you are not willing to sacrifice any of your core designs or beliefs worth the price being paid? Are they worth someone else ending up with a place in the history books, where (like with this podcast) we oversimplify complex topics to make them digestible? Sometimes it's worth it. In no way am I in a place to judge the choices of others. Only history can really do that - but when it happens it's usually an oversimplification anyways… So the building blocks of the web lie in irreverence - in hypertext. And while some grew out of irreverence and diluted their vision after an event like Woodstock, others like Nelson and his friend Douglas Engelbart forged on. And while their visions didn't come with commercial success, as integral building blocks of the modern connected world they represent as great a mind as practically anyone else in computing.
Hello everyone! How are you? Have you managed to adapt to the online world, with studying and work all happening online? Well, in episode 02 of the #maritongcerita podcast, we cover a few apps that can help you stay productive while studying and working online.
Apple found massive success on the back of the Apple II. They went public like many of the late-70s computer companies, and the story could have ended there, as it did for many computer companies of the era who were potentially bigger, had better technology or go-to-market strategies, or were even far more innovative. But it didn't. The journey to the next stage began with the Apple IIc, Apple IIgs, and other incrementally better, faster, or smaller models. Those funded the research and development of a number of projects. One was a new computer: the Lisa. I bet you thought we were jumping into the Mac next. Getting there. But twists and turns, as the title suggests. The success of the Apple II led to many of the best and brightest minds in computers wanting to go work at Apple. Jobs came to be considered a visionary. The pressure to actually become one has been the fall of many a leader. And Jobs almost succumbed to it as well. Some go down due to a lack of vision, others because they don't have the capacity for executional excellence. Some lack lieutenants they can trust. The story isn't clear with Jobs. He famously sought perfection. And sometimes he got close. The Xerox Palo Alto Research Center, or PARC for short, had been a focal point of raw research and development since 1970. They inherited many great innovations, outlandish ideas, amazing talent, and decades of research from academia and Cold War-inspired government grants. Ever since Sputnik, the National Science Foundation and the US Advanced Research Projects Agency had funded raw research. During Vietnam, that funding dried up and private industry moved in to take products to market. Arthur Rock had come into Xerox in 1969, on the back of an investment into Scientific Data Systems. While on the board of Xerox, he got to see the advancements being made at PARC.
PARC hired some of the oNLine System (NLS) team, who helped ship the Xerox Alto in 1973; a couple thousand were made. They followed that up with the Xerox Star in 1981, selling about 20,000. And PARC had been at it the whole time, inventing all kinds of goodness. So, always thinking of the next computer, Apple started the Lisa project in 1978, the year after the release of the Apple II, when profits were just starting to roll in. Story has it that Steve Jobs secured a visit to PARC and made out the back with the idea for a windowing personal computer GUI complete with a desktop metaphor. But not so fast. Apple had already begun the Lisa and Macintosh projects before Jobs visited Xerox. And the Alto had been shown off internally at Xerox in 1977, complete with Mother of All Demos-esque theatrics on stage using remote computers. They had the GUI, the mouse, and networking - while the other computers released that year, the Apple II, Commodore, and TRS-80, were still doing what Dartmouth, the University of Illinois, and others had been doing since the 60s - just at home instead of on time-sharing computers. In other words, enough people in computing had seen the oNLine System from the Stanford Research Institute. The graphical interface was coming and wouldn't be stopped. The mouse had been written about in scholarly journals. But it was all pretty expensive. The visits to PARC, and hiring some of the engineers, helped the teams at Apple figure out some of the problems they didn't even know they had. They helped make things better and they helped the team get there a little quicker. But by then the coming evolution in computing was inevitable. Still, the Xerox Star was considered a failure. But Apple said “hold my beer” and got to work on a project that would become the Lisa. It started off simply enough: some ideas from Apple executives like Steve Jobs, and then 10 people, led by Ken Rothmuller, to develop a system with windows and a mouse.
Rothmuller got replaced with John Couch, Apple's 54th employee. Trip Hawkins got a great education in marketing on that team. He would later found Electronic Arts, one of the biggest video game publishers in the world. Larry Tesler, from the Stanford AI Lab and then Xerox PARC, joined to run the system software team. He'd been on the ARPANET since writing PUB, an early markup language, and was instrumental in the Gypsy word processor, Smalltalk, and inventing copy and paste. Makes you feel small to think of some of this stuff. Bruce Daniels, one of the Zork creators from MIT, joined the team from HP as the software manager. Wayne Rosing, formerly of Digital and Data General, was brought in to design the hardware. He'd later lead the SPARC team and then become a VP of Engineering at Google. The team grew. They brought in Bill Dresselhaus as a principal product designer for the look, use, design, and even packaging. They started with a user interface and then created the hardware and applications. Eventually there would be nearly 100 people working on the Lisa project and it would run over $150 million in R&D. After 4 years, they were still facing delays, and while Jobs had been becoming more and more involved, he was removed from the project. Though the personal accounts I've heard sound closer to other large, out-of-control projects I've seen at companies. The Apple II used that MOS 6502 chip, and life was good. The Lisa used the Motorola 68000 at 5 MHz. This was a new architecture to replace the 6800. It was time to go 32-bit. The Lisa was supposed to ship with between 1 and 2 megabytes of RAM. It had a built-in 12-inch screen that was 720 x 364. They got to work building applications, releasing LisaWrite, LisaCalc, LisaDraw, LisaGraph, LisaGuide, LisaList, LisaProject, and LisaTerminal. They translated it to British English, French, German, Italian, and Spanish. All the pieces were starting to fall into place. But the project kept growing.
And the delays kept coming. Jobs got booted from the Lisa project amidst concerns it was bloated, behind schedule, wasting company resources, and that Jobs' perfectionism was going to result in a product that could never ship. The cost of the machine was over $10,000. Thing is, as we'll get into later, every project went over budget and ran into delays for the next decade. Great ideas could then be capitalized on by others - even if a bit watered down. Some projects need to teach us how not to do projects - to improve our institutional knowledge about the project or product discipline. That didn't exactly happen with Lisa. We see times in the history of computing, and technology for that matter, when a product is just too far advanced for its time. That would be the Xerox Alto. As costs come down, we can then bring ideas to a larger market. That should have been the Lisa. But it wasn't. While nearly half the cost of a Xerox Star, less than half the number of units were sold. Following the release of the Lisa, we got other desktop metaphors and graphical interfaces: the Agat out of the Soviet Union, SGI, Visi (makers of VisiCalc), GEM from Digital Research, DeskMate from Tandy, Amiga Intuition, the Acorn Master Compact, Arthur for Acorn's ARM machines, and the initial releases of Microsoft Windows. By the late 1980s the graphical interface was ubiquitous, and computers were easier to use for the novice than they'd ever been before. But developers didn't flock to the system as they'd done with the Apple II. You needed a specialized development workstation, so why would they? People didn't understand the menuing system yet. As someone who's written command line tools, sometimes they're just easier than burying buttons in complicated graphical interfaces. “I'm not dead yet… just… badly burned. Or sick, as it were.” Apple released the Lisa 2 in 1984. It went for about half the price and was a little more stable.
One reason was that the Twiggy disk drives Apple built for the Lisa were replaced with Sony microfloppy drives. This looked much more like what we'd get with the Mac, only with expansion slots. The end of the Lisa project was more of a fizzle. After the original Mac was released, the Lisa shipped as the Macintosh XL, for $4,000. Sun Remarketing built MacWorks to emulate the Macintosh environment, and that became the main application of the Macintosh XL. Sun Remarketing bought 5,000 of the Mac XLs and improved them somewhat. The last 2,700 Lisa computers were buried in a landfill in Utah in 1989. Like the whole project, they ended up being a write-off. Apple traded them out for a deep discount on the Macintosh Plus. By then, Steve Jobs was long gone, Apple was all about the Mac, and the next year General Magic would begin ushering in the era of mobile devices. The Lisa was a technical marvel at the time and a critical step in the evolution of the desktop metaphor, then nearly twenty years old, beginning at Stanford on NASA and ARPA grants, evolving further at PARC when members of the team went there, and continuing on at Apple. The lessons learned in the Lisa project were immense and helped inform the evolution of the next project, the Mac. But might the product have actually gained traction in the market if Steve Jobs had not been telling people within Apple and outside it that the Mac was the next thing, while the Apple II line was still accounting for most of the revenue of the company? There's really no way to tell. The Mac used a newer Motorola 68000 at nearly 8 megahertz, so it was faster; the OS was cleaner; the machine was prettier. It was smaller and boxier, like the newer Japanese cars at the time. It was just better. But it probably couldn't have been if not for the Lisa. The Lisa was slower than it was supposed to be. The operating system tended to be fragile. There were recalls.
Steve Jobs was never afraid to cannibalize a product to make the next awesome thing. He did so with the Lisa. If we step back and look at the Lisa as an R&D project, it was a resounding success. But as a public company, the shareholders didn't see it that way at the time. So next time there's an R&D project running amok, think about this. The Lisa changed the world, ushering in the era of the graphical interface, all for the low cost of $50 million after sales of the device are taken out of it. But they had to start anew with the Mac and only bring in the parts that worked. They had built up too much technical debt while developing the product to do anything else. While it can be painful, sometimes it's best to start with a fresh circuit board and a blank command line editor. Then we can truly step back and figure out how we want to change the world.
Pippa speaks to the Mayoral Committee Member for Safety and Security at the City of Cape Town. See omnystudio.com/listener for privacy information.
As Washington state tries to move along its Covid vaccination program, it has launched an online questionnaire to help people figure out whether they’re eligible for the next round.
Georg Merkscha and Philipp Puregger are the founders of RealTalk, one of the largest Austrian event series on personal development. Together with their team and external partners, Georg and Philipp have released their first product: the Mutbox, a box consisting of 52 challenges, quotes, and an online system that gives you one challenge per week to step out of your comfort zone bit by bit. You'll learn from them what courage actually is. What the comfort zone is, that it has its good sides, but why you shouldn't stay in it all the time. You'll learn why challenges are one of the best ways to leave and expand your own comfort zone. You'll learn how closely courage is tied to a need for security, why complaining about things you can't influence gets you nowhere, and so much more. ------ Thanks to Coworking zu Geidorf for sponsoring today's episode https://coworking-geidorf.at/ ------ Support bestimmt kreativ on Steady https://steadyhq.com/de/bestimmtkreativpodcast ------ Books https://www.bestimmtkreativ.com/buecher ------ bestimmt kreativ #10 - with RealTalk: https://bestimmtkreativ.com/realtalk/ ------ RealTalk on the web Order the Mutbox: https://www.realtalk.at/produkt/mutbox/ Website: https://www.realtalk.at/ Blog: https://www.realtalk.at/blog/ Podcast: https://www.realtalk.at/podcast/ facebook: https://www.facebook.com/realtalkaustria/ Instagram: https://www.instagram.com/realtalkaustria/ ---------- Become part of the bestimmt kreativ collective on Instagram: https://www.instagram.com/bestimmtkreativ facebook: http://facebook.com/bestimmtkreativ ---------- Send WhatsApp voice messages to the bestimmt kreativ phone +43 677 63 711 875 ---------- Subscribe to bestimmt kreativ now on Apple Podcasts: https://podcasts.apple.com/at/podcast/bestimmt-kreativ/id1522274165 Spotify: https://open.spotify.com/show/7sghLSgSIfBWncGrd1MJKV?si=braxB_-bR9uxllVLprzt5Q YouTube: https://www.youtube.com/channel/UCtEeUPkxIKaq2KbDzRJfgLw Or any podcast app of your choice
6 Figures For The Rest Of Us – https://www.marketingsharks.com/6-figures-for-the-rest-of-us-special-coaching-dfy-make-money-online-system/ LET’S RECAP EVERYTHING YOU ARE RECEIVING TODAY: Chad’s personal custom plugins (the plugins that generate his $1,000 weekly paycheck) ($2,997 value) Our step-by-step training (Chad loves to spoon feed) ($3,997 value) A one-on-one training chat with Chad ($1,000 value) 24/7 LIFETIME access to our Coaching Group ($4,997 value) Our ‘everything you need for GUARANTEED success, or Chad will personally work with you one on one till you’re successful, guarantee’ ($3,997 value) 6 Figures For The Rest Of Us – https://www.marketingsharks.com/6-figures-for-the-rest-of-us-special-coaching-dfy-make-money-online-system/
On this episode we talk about my Brand New Make Money Online System That You Can Use Absolutely Free To Help You Make Money Online. You Can Get Free Access At https://TheVipSystem.com This Show is brought to you by: http://www.TheFreeFunnel.com A Free Sales Funnel System to help you maximize your earnings. Just follow the simple steps and you'll have an automated online business with me, LJ Aviles, doing a lot of the work for you. Check Out My #1 Recommended Way to Make Money Online and get a FREE Complimentary Vacation Stay of up to 7 nights on me: http://www.LJAviles.com/?src=Podcast See the Video Podcast and Subscribe to my YouTube Channel: http://www.LifeCoachLJ.com/YouTube Get episode show notes and other shows here: http://www.LifeCoachLJ.com/Podcast Send me a Facebook Message and let me know what you think: http://m.me/LifeCoachLJ Connect with me on your fav social platform and send me a message telling me what you like best about the show. http://www.Facebook.com/LifeCoachLJ http://www.Instagram.com/LifeCoachLJ http://www.LifeCoachLJ.com/YouTube http://www.Twitter.com/LifeCoachLJ --- This episode is sponsored by · Anchor: The easiest way to make a podcast. https://anchor.fm/app Support this podcast: https://anchor.fm/lifecoachlj/support
Online System for Temple Prayer Rolls
How much is your silence worth? Could Mercedes-Benz buy yours for $500? Or would you leak evidence of their frankly grubby behaviour to someone such as me? In this report, let’s see what’s behind door number two. Save thousands on any new car (Australia-only): https://autoexpert.com.au/contact ACCC complaint process: https://www.accc.gov.au/consumers/complaints-problems/make-a-consumer-complaint AutoExpert discount roadside assistance package: https://247roadservices.com.au/autoexpert/ Did you like this report? You can help support the channel, securely via PayPal: https://www.paypal.com/cgi-bin/webscr?cmd=_s-xclick&hosted_button_id=DSL9A3MWEMNBW&source=url Philip Lakic made a mistake in 2017. He bought a Mercedes-Benz A-Class. No - I don’t know why either. It’s kinda like getting hitched on the spur of the moment, in Las Vegas. This story is like a fling with Tiffany from the office. Everything was new and exciting for a little while, and then the three-pronged-suppository Death Star management turned off their outdated Windows-based connectivity tech, which they had talked up heavily in the brochure, calling it COMAND Online. “I thought you might be interested in an update from Mercedes-Benz Australia Pacific RE COMAND Online MB Apps failures. I submitted a letter of demand for a fix or a refund, which was not met (surprise), but did get back this pitiful letter of offer from MBAuP, which I rejected.” - Philip Lakic I cannot be the only person here seeing the massive disconnect between the company’s position publicly, and the one taking place under the table, metaphorically. And let’s not forget that Mr Lakic’s A-Class is only three years old. It’s hardly a relic. I am also - frankly - gobsmacked that Daimler and Microsoft (two of the world’s biggest corporations) were unwilling to get together and come up with a wireless browsing system that would endure for - I dunno - 10 years of operation.
I’m further gobsmacked that their disregard for current owners is so profound that they simply couldn’t be arsed coming up with a patch for this problem. Last word here to Mr Lakic: "I will now start pursuing a course of action through the ACCC and if that fails, I'll have to start lining some lawyer's pockets. I truly expected more from Mercedes-Benz as our family bought three AMG vehicles which were collectively worth approximately $300k when bought new." "I had to experience that Mercedes-Benz customer service feeling for myself, before I truly realised how shit Mercedes-Benz really are. In the end you were right. I wish you weren't, but you are. I should have bought a BMW. I hope that your YouTube channel serves to educate more people on the practices of brands within the Australian Car industry." I did not elicit that commentary from Mr Lakic - it just lobbed without warning in my inbox, as these things often do. I did not go trawling for endorsement, nor did I seek brand-trashing experiences upon which to get my rocks off. I do not recommend Mercedes-Benz, and Mr Lakic’s unsolicited experience is emblematic of why. Buying Mercedes-Benz is like entering into a dysfunctional relationship with a profound power imbalance. It seems to me that Mercedes-Benz customers - at least in Australia - care somewhat more about the brand than the brand fundamentally cares about them.
A geekfest on the new online system coming to take and track permits and other paperwork from development projects across the county. Our guest is planning and development director John Zeanah.
In a world of rapidly changing technologies, few have lasted as long in as unaltered a fashion as the mouse. The party line is that the computer mouse was invented by Douglas Engelbart in 1964 and that it was a one-button wooden device that had two metal wheels. Those used an analog-to-digital conversion to input a location to a computer. But there's a lot more to tell. Engelbart had read an article in 1945 called “As We May Think” by Vannevar Bush. He was in the Philippines working as a radio and radar tech. He returned home, got his degree in electrical engineering, then went to Berkeley for first his masters and then a PhD, still in electrical engineering. At the time there were a lot of military grants in computing floating around, and a Navy grant saw him work on a computer called CALDIC, short for the California Digital Computer. By the time he completed his PhD he was ready to start a computer storage company, but ended up at the Stanford Research Institute in 1957. He published a paper in 1962 called Augmenting Human Intellect: A Conceptual Framework. That paper would guide the next decade of his life and help shape nearly everything in computing that came after. Keeping with the theme of “As We May Think,” Engelbart was all about supplementing what humans could do. The world of computer science had been interested in selecting things on a computer graphically for some time. And Engelbart would have a number of devices that he wanted to test in order to find the best possible device for humans to augment their capabilities using a computer. He knew he wanted a graphical system and wanted to be deliberate about every aspect, in a very academic fashion. And a key aspect was how people that used the system would interact with it. The keyboard was already a mainstay, but he wanted people pointing at things on a screen. While Engelbart would invent the mouse, pointing devices certainly weren't new.
Pilots had been using the joystick for some time, but an electrical joystick had been developed at the US Naval Research Laboratory in 1926, with the concept of unmanned aircraft in mind. The Germans would end up building one in 1944 as well. But it was Alan Kotok who brought the joystick to the computer game in the early 1960s to play Spacewar on minicomputers. And Ralph Baer brought it into homes in 1967 with an early video game system that became the Magnavox Odyssey. Another input device that had come along was the trackball. Ralph Benjamin of the British Royal Navy's Scientific Service invented the trackball, or ball tracker, for radar plotting on the Comprehensive Display System, or CDS. The computers were analog at the time, but they could still use the X-Y coordinates from the trackball, which they patented in 1947. Tom Cranston, Fred Longstaff, and Kenyon Taylor had seen the CDS trackball and used that as the primary input for DATAR, a radar-driven battlefield visualization computer. The trackball stayed in radar systems into the 60s, when Orbit Instrument Corporation made the X-Y Ball Tracker and then Telefunken turned it upside down to control the TR 440, making an early mouse type of device. The last of the options Engelbart decided against was the light pen. Light guns had shown up in the 1930s when engineers realized that a vacuum tube was light-sensitive. You could shoot a beam of light at a tube and it could react. Robert Everett worked with Jay Forrester to develop the light pen, which would allow people to interact with a CRT using light sensing to cause an interrupt on a computer. This would move to the SAGE computer system from there and sneak into the IBM mainframes in the 60s. While the technology used to track the coordinates is not even remotely similar, think of this as conceptually similar to the styluses used with tablets and on Wacom tablets today.
Paul Morris Fitts had built a model in 1954, now known as Fitts's Law, to predict the time that's required to move things on a screen. He defined an index of difficulty as a function of the ratio between the distance to the target and the width of the target. If you listen to enough episodes of this podcast, you'll hear a few names repeatedly. One of those is Claude Shannon. He brought a lot of the math to computing in the 40s and 50s and helped with the Shannon-Hartley Theorem, which defined information transmission rates over a given medium. So these were the main options at Engelbart's disposal to test when he started ARC. But in looking at them, he had another idea. He'd sketched out the mouse in 1961 while sitting in a conference session about computer graphics. Once he had funding he brought in Bill English to build a prototype in 1963. The first model used two perpendicular wheels attached to potentiometers that tracked movement. It had one button to select things on a screen. It tracked x,y coordinates as previous devices had. NASA funded a study to really dig in and decide which was the best device. He, Bill English, and an extremely talented team spent two years researching the question, publishing a report in 1965. They really had the blinders off, too. They looked at the DEC Grafacon, joysticks, light pens, and even what amounts to a mouse that was knee operated. Two years of what we'd call UX research or User Research today. Few organizations would dedicate that much time to study something. But the result would be patenting the mouse in 1967, an innovation that would last for over 50 years. I've heard Engelbart criticized for taking so long to build the oNLine System, or NLS, which he showcased at the Mother of All Demos. But it's worth thinking of his research as academic in nature. It was government funded. And it changed the world. His paper on Computer-Aided Display Controls was seminal.
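Fitts's Law is still used in interface research today. In its common Shannon formulation, predicted movement time grows linearly with the index of difficulty, log2(D/W + 1). Here's a minimal sketch in Python; the constants a and b are illustrative placeholders, not measured values (in practice they're fit to data per device):

```python
import math

def fitts_movement_time(distance, width, a=0.1, b=0.15):
    """Predict pointing time in seconds using Fitts's law (Shannon form).

    a and b are device-specific constants normally found by regression;
    the defaults here are illustrative only.
    """
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# A target twice as far away (or half as wide) is exactly one bit harder:
near = fitts_movement_time(distance=100, width=50)
far = fitts_movement_time(distance=200, width=50)
```

This is why, for instance, screen-edge targets are fast to hit: the edge stops the pointer, making the effective target width large.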
Vietnam caused a lot of those government-funded contracts to dry up. From there, Bill English and a number of others from the Stanford Research Institute, which ARC was a part of, moved to Xerox PARC. English and Jack Hawley iterated and improved the technology of the mouse, ditching the analog-to-digital converters, and over the next few years we'd see some of the most substantial advancements in computing. By 1981, Xerox had shipped the Alto and the Star. But while Xerox would be profitable with their basic research, they would miss something that a sandal-clad hippy wouldn't. In 1979, Xerox let Steve Jobs make three trips to PARC in exchange for the opportunity to buy 100,000 shares of Apple stock pre-IPO. The mouse by then had evolved to a three-button mouse that cost $300. It didn't roll well and had to be used on pretty specific surfaces. Jobs would call Dean Hovey, a co-founder of IDEO, and demand they design one that would work on anything, including, quote, “blue jeans.” Oh, and he wanted it to cost $15. And he wanted it to have just one button, which would be an Apple hallmark for the next 30ish years. Hovey-Kelley would move to optical encoder wheels, freeing the tracking ball to move however it needed to, and then use injection-molded frames. And thus make the mouse affordable. It's amazing what can happen when you combine all that user research and academic rigor from Engelbart's team and engineering advancements documented at Xerox PARC with world-class industrial design. You see this trend played out over and over with the innovations in computing that are built to last. The mouse would ship with the Lisa and then with the 1984 Mac. Logitech had shipped a mouse in 1982 for $300. After leaving Xerox, Jack Hawley founded a company to sell a mouse for $400 the same year. Microsoft released a mouse for $200 in 1983. But Apple changed the world when Steve Jobs demanded the mouse ship with all Macs.
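Encoder wheels like these report motion as two offset on/off signals whose Gray-code sequence reveals direction; the same quadrature idea carried through to later ball mice and scroll wheels. A minimal decoder sketch in Python, assuming idealized (A, B) phase samples with no missed transitions:

```python
# Direction lookup: (previous state, current state) -> step.
# States are (A, B) readings from the two offset sensors; valid
# transitions follow the Gray-code cycle 00 -> 01 -> 11 -> 10 -> 00.
_TRANSITIONS = {
    ((0, 0), (0, 1)): +1, ((0, 1), (1, 1)): +1,
    ((1, 1), (1, 0)): +1, ((1, 0), (0, 0)): +1,
    ((0, 0), (1, 0)): -1, ((1, 0), (1, 1)): -1,
    ((1, 1), (0, 1)): -1, ((0, 1), (0, 0)): -1,
}

def decode(samples):
    """Accumulate a signed position count from a stream of (A, B) samples."""
    position = 0
    prev = samples[0]
    for cur in samples[1:]:
        # Unknown pairs (noise, or a skipped state) contribute 0.
        position += _TRANSITIONS.get((prev, cur), 0)
        prev = cur
    return position

# One full forward cycle advances four counts; reversed, it subtracts four.
forward = [(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]
```

Because only one bit changes per valid step, a single misread sample shifts the count by at most one, which is part of why the scheme proved so durable.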
The IBM PC would use a mouse, and from there it would become ubiquitous in personal computing. Desktops would ship with a mouse. Laptops would have a funny little button that could be used as a mouse when the actual mouse was unavailable. The mouse would ship with extra buttons that could be mapped to additional workflows or macros. And during the rise of the large server farms that would run the upcoming dot-com revolution, even servers were outfitted with KVM switches, devices that switched the keyboard, video, and mouse between machines. Trays would be put into most racks, with a single U, or unit, of the rack being used to see what you're working on, especially after Windows or windowing servers started to ship. As various technologies matured, other innovations came along in input devices. The mouse would go optical in 1980 and ship with early Xerox Star computers, but what we think of as an optical mouse wouldn't really ship until 1999, when Microsoft released the IntelliMouse. Some of that tech came to them via Hewlett-Packard through the acquisition of DEC, and some of those same engineers had come from the original mainstreamer of the mouse, PARC, when Bob Taylor started DEC's Systems Research Center. The LED sensor on the mouse stuck around. And thus ended the era of the mouse pad, once a hallmark of many a marketing giveaway. Finger-tracking devices came along in 1969 but were far too expensive to produce at the time. As capacitive touch-sensitive pads, or trackpads, came down in price and the technology matured, those began to replace the previous mouse types of devices. The 1982 Apollo computers were the first to ship with a touchpad, but it wasn't until Synaptics launched the TouchPad in 1992 that they began to become common, showing up in 1995 on Apple laptops and then becoming ubiquitous over the coming years.
In fact, the IBM ThinkPad and many others shipped laptops with little red nubs in the keyboard for people that didn't want to use the TouchPad for a while as well. Some advancements in the mouse didn't work out. Apple released the hockey-puck-shaped mouse in 1998, when they released the iMac. It was USB, which replaced the ADB interface. USB lasted. The shape of the mouse didn't. Apple would go to the monolithic surface mouse in 2000, go wireless in 2003, and then release the Mighty Mouse in 2005. The Mighty Mouse would have a capacitive touch sensor, and since people wanted to hear a click, it would produce one with a little speaker. This also signified the beginning of Bluetooth as a means of connecting a mouse. Laptops began to replace desktops for many, and so the mouse itself isn't as dominant today. And with mobile and tablet computing, touchscreens rose to replace many uses for the mouse. But even today, when I edit these podcasts, I often switch over to a mouse simply because other means of dragging around timelines simply aren't as graceful. And using a pen, as Engelbart's research from the 60s indicated, simply gets fatiguing. Whether or not it's always obvious, we have an underlying story we're often trying to tell with each of these episodes. We obviously love unbridled innovation and a relentless drive towards a technologically utopian multiverse. But taking a step back during that process and researching what people want means less work and faster adoption. Doug Engelbart was a lot of things, but one net-new point we'd like to make is that he was possibly the most innovative in harnessing user research to make sure that his innovations would last for decades to come. Today, we'd love to research every button, heat-map everything, and track eyeballs.
But remembering, as he did, that our job is to augment human intellect is best done when we make our advances useful; it helps keep us, and the forks that occur in technology after us, from having to backtrack decades of work in order to take the next jump forward. We believe in the reach of your innovations. So next time you're working on a project, save yourself time, save your code a little cyclomatic complexity, and save users the frustration of having to relearn a whole new thing: research what you're going to do first. Because you never know. Something you engineer might end up being touched by nearly every human on the planet the way the mouse has. Thank you, Engelbart. And thank you to NASA and Bob Roberts from ARPA for funding such important research. And thank you to Xerox PARC, for carrying the torch. And to Steve Jobs for making the mouse accessible to everyday humans. As with many an advance in computing, there are a lot of people that deserve a little bit of the credit. And thank you, listeners, for joining us for another episode of the History of Computing podcast. We're so lucky to have you. Now stop consuming content and go change the world.
Explore how to manage a centralized admission process (CAP), along with online counseling of students, for multiple institutes and thousands of students. The Online System works seamlessly to simplify processes for students, colleges, registrars, universities, and admission committees. For More Details Visit
Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today we're going to cover a special moment in time. Picture this if you will. It's 1968. A collection of some 1,000 of the finest minds in computing is sitting in the audience of the San Francisco Civic Center. They're at a joint conference of the Association for Computing Machinery and the IEEE, or the Institute of Electrical and Electronics Engineers: the Fall Joint Computer Conference in San Francisco. They're waiting to see a session called A Research Center for Augmenting Human Intellect. Many had read Vannevar Bush's “As We May Think” Atlantic article of 1945, the turning point that inspired so many achievements over the previous 20 years. Many had witnessed the evolution from the mainframe to the transistorized computer to timesharing systems. The presenter for this session would be Douglas Carl Engelbart. ARPA had strongly recommended he come to finally make a public appearance. Director Bob Taylor, in fact, was somewhat adamant about it. The talk was six years in the making, and ARPA and NASA were ready to see what they had been investing in. ARPA had funded his Augmentation Research Center lab at SRI, the Stanford Research Institute. That great instigator J.C.R. Licklider had started the funding in 1963, back when DARPA was still just called ARPA, based on a paper Engelbart published in 1962. But it had really been going since Engelbart got married in 1950 and realized computers could be used to improve human capabilities, to harness the collective intellect, to facilitate truly interactive computing, and to ultimately make the world a better place. Engelbart was 25 then. He was from Oregon, where he got his Bachelors in '48 after serving in World War II as a radar tech. He then came to Berkeley in '53 for his Masters, staying through 1955 to get his PhD. He ended up at Stanford's SRI.
There, he hired people like Don Andrews, Bill Paxton, Bill English, and Jeff Rulifson. And today Engelbart was ready to show the world what his team had been working on. The computer was called the oNLine System, or NLS. Bill English would direct things onsite. Because check this out: not all presenters were onsite on that day in 1968. Instead, some were at ARC in Menlo Park, 30 miles away. To communicate with the venue they used two 1200 baud modems connecting over a leased line to their office. But they would also use two microwave links. And that was for something crazy: video. The lights went dark. The oNLine System was projected onto a 22-foot-high screen using an Eidophor video projector. Bill English would flip the screen up as the lights dimmed. The audience was expecting a tall, thin man to come out to present. Instead, they saw Doug Engelbart on the screen in front of them. The one behind the camera, filming Engelbart, was Stewart Brand, the infamous editor of the Whole Earth Catalog. It seems Engelbart was involved in more than just computers. But people destined to change the world have always travelled in the same circles, I suppose. Engelbart's face came up on the screen, streaming in from all those miles away. And the screen they would switch back and forth to: that was the oNLine System, or NLS for short. The camera would come in from above Engelbart's back, and the video would be transposed with the text being entered on the screen. This was already crazy. But when you could see where he was typing, there was something, well, extra. He was using a pointing device in his right hand. This was the first demo of a computer mouse, which he had applied for a patent for in 1967. He called it that because it had a tail, which was the cable that connected the wooden contraption to the computer.
Light pens had been used up to this point, but this was the first demonstration of a mouse, and the team had actually considered mounting it under the desk and using a knee to move the pointer. But they decided that would just be too big a gap for normal people to imagine, and that the mouse would be simpler. Engelbart also used a device we might think of more like a macro pad today. It was modeled after piano keys. We'd later move this type of functionality onto the keyboard using various keystrokes, F keys, and, in the case of Apple, command keys. He then opened a document on his screen. Now, people didn't do a lot of document editing in 1968. Really, computers were pretty much used for math at that point. At least, until that day. That document he opened: he used hyperlinks to access content. That was the first real demo of clickable hypertext. He also copied text in the document. And one of the most amazing aspects of the presentation was that you kinda' felt like he was only giving you a small peek into what he had. You see, before the demo, they thought he was crazy. Many were probably only there to see a colossal failure of a demo. But instead they saw pure magic. Inspiration. Innovation. They saw text highlighted. They saw windows on screens that could be resized. They saw the power of computer networking. Video conferencing. A stoic Engelbart was clearly pleased with his creation. Bill Paxton and Jeff Rulifson were on the other side, helping with some of the text work. His style worked well with the audience, and of course, it's easy to win over an audience when they have just been wowed by your tech. But more than that, his inspiration was so inspiring that you can feel it just watching the videos, all these decades later. Engelbart and the team would receive a standing ovation. And to show it wasn't smoke and mirrors, ARC let people actually touch the systems, and Engelbart took questions.
Many people involved would later look back as though it was an unfinished work. And it was. Andy van Dam would later say, "Everybody was blown away and thought it was absolutely fantastic and nothing else happened. There was almost no further impact. People thought it was too far out and they were still working on their physical teletypes, hadn't even migrated to glass teletypes yet." But that's not really fair, or telling the whole story. In 1969 we got the Mansfield Amendment, which slashed military funding for pure scientific research. After that, the budget was cut and the team began to disperse, as was happening with a lot of the government-backed research centers. Xerox was lucky enough to hire Bob Taylor, and many others migrated to Xerox PARC, the Palo Alto Research Center, which was able to take the concept and actually ship a device in 1973, although not one as mass-marketable as later devices would be. That was the Alto. The Alto would be the machine that inspired the Mac and therefore Windows, so his ideas live on today. His own team got spun out of Stanford and sold, becoming Tymshare and then part of McDonnell Douglas. He continued to have more ideas, but his concepts were rarely implemented at McDonnell Douglas, so he finally left in 1986, starting the Bootstrap Alliance, which he founded with his daughter. But he succeeded. He wanted to improve the plight of man, and he did. Hypertext and movable screens directly influenced a young Alan Kay, who was in the audience and was inspired to write Smalltalk. The demo also inspired Andy van Dam, who built the FRESS hypertext system based on many of the concepts from the talk. It also did multiple windows, version control on documents, intradocument hypertext linking, and more. But it was hard to use. Users needed to know complex commands just to get into the GUI screens.
He was also still really into minicomputers and timesharing, and kinda' missed that the microcomputer revolution was about to hit hard. The hardware hacker movement that was going on all over the country, but most concentrated in the Bay Area, was about to start the long process of putting a computer, and now a mobile device, in every home in the world. With smaller and smaller and faster chips, the era of the microcomputer would transition into the era of the client and server. And that was the research we were transitioning to as we moved into the 80s. Charles Irby was a presenter as well, being a designer of NLS. He would go on to lead the user interface design work on the Xerox Star before founding a company, then moving on to VP of development for General Magic, a senior leader at SGI, and then the leader of the engineering team that developed the Nintendo 64. Bob Sproull was in the audience watching all this and would go on to help design the Xerox Alto and the first laser printer, and write the Principles of Interactive Computer Graphics, before becoming a professor at Carnegie Mellon and then ending up helping create Sun Microsystems Laboratories, becoming its director and helping design asynchronous processors. Butler Lampson was also there, a founder of Xerox PARC, where the Alto was built, and co-creator of Ethernet. Bill Paxton (not the actor) would join him at PARC and later go on to be an early founder of Adobe. In 2000, Engelbart would receive the National Medal of Technology for his work. He also got the Turing Award in 1997 and the Lovelace Medal in 2001. He would never lose his belief in the collective intelligence. He wrote Boosting Our Collective IQ in 1995. Engelbart passed away in 2013. He will forever be known as the inventor of the mouse. But he gave us more. He wanted to augment the capabilities of humans, allowing us to do more, rather than replace us with machines.
This was in contrast to SAIL and the MIT AI Lab, where they were just doing research for the sake of research. The video of his talk is on YouTube, so click on the link in the show notes if you'd like to watch it and learn more about such a great innovator. He may not have brought a mass-produced system to market, but as with Vannevar Bush's article 20 years before, the research done is a turning point in history; a considerable milestone on the path to the gleaming world we live in today. The NLS teaches us that while you might not achieve commercial success with years of research, if you are truly innovative, you might just change the world. Sometimes the two simply aren't mutually exclusive. And when you're working on a government grant, they really don't have to be. So until next time, dare to be bold. Dare to change the world, and thank you for tuning in to yet another episode of the History of Computing Podcast. We're so lucky to have you. Have a great day! https://www.youtube.com/watch?v=yJDv-zdhzMY
This is the best way I know to take control of your freelance career and your clients.
Where does successful selling end? When the order has been landed? Or only later? Is it enough to fill in your "order form," leave the rest to internal processing, and turn to the next customer? Or is this exactly the moment when top salespeople should deploy their true influence, skillfully and powerfully, to secure strategic advantages? What role does tactical internal negotiation play within a company? These are questions many salespeople ask themselves. What power does internal sales have? Author: Ulrike Knauer. More about Knauer Training – sales training. Wolfgang S. is a salesman. An extremely successful salesman. He lands one order after another. Wolfgang regularly returns from his field trips with his chest puffed with pride. He strides briskly to his office and switches on his computer. He enters all the relevant data for the newly won orders so that inside sales can take them over and process them. For Wolfgang, the order is then history. He set up the deal; others can handle the rest. Mentally, Wolfgang is already with the next customer. He never takes the time to chat with colleagues or compare notes. Wolfgang sees himself as a top salesman whose time is best invested out on the "sales front." What else goes on in the company barely concerns him. "On the road again" is Wolfgang's personal anthem. His colleagues know the routine. Everyone expects Wolfgang to take home the "Most Successful Salesperson of the Year" award at the end of the financial year. So the surprise is enormous when, at the year-end meeting, Alexander B. is named most successful salesperson of the year. Wolfgang S. is stunned. It can't be: this colleague, who apparently spends far less time "out there" with customers, has topped him in total revenue? How can that be?
For Wolfgang, a world collapses. What did he do wrong? Top salespeople must also sell internally; that is internal sales. The explanation is very simple. Focused solely on his customers, the "high flyer" Wolfgang completely overlooked the fact that he also needs intensive support from within the company. He failed to build a working internal network with inside sales, shipping, and the other key functions: exactly the people who make sure that ordered products reach his customers reliably and in high quality. Special delivery dates and unusual requirements he merely typed into his online system. He neglected to also talk about them on a human level, to build those networks, and thereby to become effective internally with his projects and ideas. Nor does he otherwise keep himself informed about new products, their development, or internal processes. Alexander B. operates quite differently. He spends far more time in the company and talks with people from various departments. He knows all the news. After every big order he personally briefs his colleagues in inside sales: he tells them about the negotiation, what was special about it, what the customer wants, whether a particularly short delivery time or some other individual service. He lets his colleagues share in his joy over the deal and thereby brings them on board emotionally. Naturally, the people in order processing and dispatch then want to support him as well as they can, and make many a seemingly impossible thing possible. That is exactly what happened at the end of the year, when there was only time to process a few more orders and the rest had to wait until January. Quite instinctively, the staff pushed Alexander's orders through first.
They knew the story behind the business and were glad to be able to support him. And so it came about that Wolfgang's orders no longer counted for that year, and he finished only second. It may sound a little unbelievable, but this is exactly how things play out in many companies every day. Personal relationships and active "internal sales" are the keys to long-term success, especially in sales. The salesperson as an immensely important interface: internal sales. Top salespeople represent the company not only to the customer but also in their own house. They concern themselves with products, their quality, their further development, and above all with the service provided by other parts of the company outside their own sales unit. Anyone who relies only on outward impact, on communicative and sales skill directed at the outside world, without extending it inward, cannot claim the title of "top salesperson." Only those who think across departments, bring their influence to bear with the customer and the market, and skillfully build a bridge between all units of the company and the customer will be successful in the long run. Internal sales, then, as a success factor. Many salespeople, however, are often not even aware of the power of this interface function, which ultimately also means an information advantage over everyone involved. And yet only they hear their customers' wishes first-hand, and only they can carry those wishes into the company. If salespeople do not pass this knowledge on, valuable information is lost forever.
In this episode, Troy Howard, in partnership with his father and brother, Ron and Trevor Howard, at SoTellUs.com, discusses the world's only review platform that lets businesses instantly collect video, audio, and written reviews from customers using an app on a smartphone or tablet. The entire review takes 30 seconds or less to collect. Troy noted that every review is verified in seconds by SoTellUs, and if it is a valid review, it is automatically marketed online for the business through their website and social media sites. This new system of automated, verified reviews is changing the review world. Troy goes on to say that social proof is the most important element of all of today’s marketing strategies. People believe more about what others have to say about a business than what a business has to say about itself. Online reviews transform and build businesses like nothing else. Troy Howard's passion for combining technology and marketing has been a driving force in his life, leading him from selling yellow page ads to being an executive of a publicly traded company, and ultimately an entrepreneur and champion of small businesses. Having the foresight to see the change the Internet would bring to marketing, Troy left the yellow page industry to work as a project manager for YP.com, one of the largest online directories, producing nearly $100 million in revenue annually. Over the next few years Troy moved up in the company to Vice President of Operations and Marketing, managing a 300-person sales force that generated over 8,000 new clients each month. Eventually Troy left the corporate world and started several successful marketing companies, including WebForce Pro and onTop local, which have helped thousands of small business owners around the world dominate their competition online and drive record growth in their businesses by disrupting the idea of traditional search engine optimization.
Troy and his team made this possible by developing proprietary marketing software that reduced 3 - 5 hours of SEO work into a click of a button, giving their customers the competitive edge online.Troy is so confident that his review system will help any business that he has offered a 30-day free trial to listeners. Visit this link for the free trial: https://webforcepro.isrefer.com/go/sale/jhelfer. Once on the site, click “Get Started” at the top, which will take you to the order form. Complete the order form, and in the promo code section, enter 30free and Apply. That will remove the charge and setup fee and give listeners 30 days free to try out this system and start getting reviews.Main Street Mavericks Radio with Joel Helferhttp://businessinnovatorsradio.com/main-street-mavericks-radio-with-joel-helfer/
Recapping the previous episode, the pirates gave an update on the not-so-positive state of electronic voting in the country. But so as not to neglect the classic characters, they introduced us to Douglas Engelbart, the man who invented the mouse. An engineer and inventor, he championed the link between the intuitive capabilities of the human mind and the processing power of computers; The Mother of All Demos was one of his great milestones. To close, they said goodbye, and farewell forever, to one of the best private trackers for downloading torrents.
In network marketing and considering a "system" for your teammates? This will help.