{"id":2120,"date":"2017-03-31T15:20:39","date_gmt":"2017-03-31T13:20:39","guid":{"rendered":"https:\/\/blog.mi.hdm-stuttgart.de\/?p=2120"},"modified":"2023-08-06T21:50:52","modified_gmt":"2023-08-06T19:50:52","slug":"livestreaming-with-libav_-tutorial-part-1","status":"publish","type":"post","link":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/2017\/03\/31\/livestreaming-with-libav_-tutorial-part-1\/","title":{"rendered":"Livestreaming with libav* &#8211; Tutorial (Part 1)"},"content":{"rendered":"<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone\" src=\"https:\/\/upload.wikimedia.org\/wikipedia\/commons\/a\/af\/Green_screen_live_streaming_production_at_Mediehuset_K%C3%B8benhavn.jpg\" alt=\"Green screen live streaming production at Mediehuset K\u00f8benhavn. Author: Rehak\" width=\"1920\" height=\"1080\"><\/p>\n<p>Lifestreaming is the real deal of video today, however&nbsp;there aren&#8217;t that many content creation tools to choose from.&nbsp;YouTube, Facebook and Twitter are pushing hard to enable their users to stream vlogging-style content live from their phones with proprietary Apps, and <a href=\"https:\/\/obsproject.com\/\">OBS<\/a> is used for Let&#8217;s Plays and Twitch streams. But when you want to stream events or lectures you are pretty much on your own.<\/p>\n<p>In this series&nbsp;of posts I want to share the&nbsp;experiences I gained over the past couple of weeks while writing an application that captures video and audio and creates a simple livestream. This application is designed to be the basis of a simple streaming desktop application.&nbsp;<!--more-->This series&nbsp;is also supposed to help people to&nbsp;better understand the ffmpeg libraries for creating videos. While there are some great tutorials on how to build video players, almost no one troubles himself on writing tutorials for creating\/encoding videos.<br \/>\nThe&nbsp;posts are&nbsp;intended for people who already have a little&nbsp;experience with video creation. 
The first part of the series&nbsp;covers some background; the second post will mainly focus on building an application with the ffmpeg libraries.<\/p>\n<p>Assuming that you want&nbsp;to embed your livestream in a web page and play it with common browsers, one of the first things to do as a content creator is to think&nbsp;about which codecs and container formats to use. This greatly influences the experience of your users.<\/p>\n<p>Not all operating systems and browsers support the same set of technologies. Therefore I want to give a very quick overview of the most common codecs and containers. This overview is mainly based on playback support for the codec rather than quality, because playback support will be your first concern when starting with livestreaming.&nbsp;<a title=\"Ronald S. Bultje\" href=\"https:\/\/blogs.gnome.org\/rbultje\/\" target=\"_blank\" rel=\"home noopener\">Ronald S. Bultje<\/a>&nbsp;provided an awesome quality- and performance-oriented<a href=\"https:\/\/blogs.gnome.org\/rbultje\/2015\/09\/28\/vp9-encodingdecoding-performance-vs-hevch-264\/\" target=\"_blank\" rel=\"noopener\">&nbsp;comparison of modern&nbsp;video codecs<\/a>.<\/p>\n<h2>Video and audio formats<\/h2>\n<p>First of all, you have to look at your users: Which operating systems and devices do they use?<\/p>\n<p>While on PC almost any codec and container you can think of is supported in some library, it is a very different story on mobile devices.<\/p>\n<p>Because video playback (decoding) uses a lot of computing power and therefore drains your battery, mobile operating systems try to use hardware acceleration as much as possible. But this also means that not all codecs and containers are supported for livestreaming. 
Here is a support list of the most common operating systems and browsers.<\/p>\n<pre class=\"prettyprint\" data-start-line=\"1\" data-visibility=\"visible\" data-highlight=\"\" data-caption=\"\">                           vp8\/9+webm  h264+mp4\/DASH   h264+mkv\n                            \nChrome (+Opera, Vivaldi...)     x               x           -\nFirefox                         x               x           -\nSafari                          -               x           -\nEdge                            vp8             x           -\niOS (all browsers)              -               x           -\nAndroid (Chrome)                x               x           -<\/pre>\n<p>Sources:&nbsp;<a href=\"http:\/\/caniuse.com\/#search=video\" target=\"_blank\" rel=\"noopener\">caniuse.com<\/a>, <a href=\"https:\/\/developer.mozilla.org\/en-US\/docs\/Web\/HTML\/Supported_media_formats\" target=\"_blank\" rel=\"noopener\">MDN<\/a>, <a href=\"https:\/\/developer.android.com\/guide\/topics\/media\/media-formats.html\" target=\"_blank\" rel=\"noopener\">Android developer guide<\/a>, <a href=\"https:\/\/developer.apple.com\/library\/content\/documentation\/Miscellaneous\/Conceptual\/iPhoneOSTechOverview\/MediaLayer\/MediaLayer.html\" target=\"_blank\" rel=\"noopener\">iOS developer guide<\/a>, <a href=\"https:\/\/msdn.microsoft.com\/en-us\/library\/mt599587(v=vs.85).aspx\" target=\"_blank\" rel=\"noopener\">Microsoft developer guide<\/a><\/p>\n<p>Even on PCs, hardware-accelerated video playback is worth thinking about, since it relieves the CPU of a lot of work and lets the machine run quieter.<\/p>\n<p>When you want to livestream to mobile devices and the most common browsers on PC, there are basically three options to go with:<\/p>\n<h3>h264 + rtmp<\/h3>\n<p><a href=\"http:\/\/www.itu.int\/rec\/T-REC-H.264\" target=\"_blank\" rel=\"noopener\">h264<\/a> is an older codec and therefore well supported on almost all devices, most of the time even with hardware 
acceleration.&nbsp;At lower bitrates you get the characteristic block artefacts, but all in all this codec offers good performance and efficiency.&nbsp;If you want to archive or postprocess your videos, h264\/5 is a good basis. Keep in mind that the use of both h264&nbsp;and h265&nbsp;may be associated with license costs if you plan on using the codecs commercially.<\/p>\n<p>The <a href=\"https:\/\/wwwimages2.adobe.com\/content\/dam\/Adobe\/en\/devnet\/rtmp\/pdf\/rtmp_specification_1.0.pdf\" target=\"_blank\" rel=\"noopener\">rtmp<\/a>&nbsp;streaming protocol is probably the most widely used delivery format, with&nbsp;Twitch being one of its biggest users.<br \/>\nBut rtmp is a proprietary technology owned by Adobe. The company&#8217;s support (or rather lack thereof) for some of its products should make you think twice about whether to bet on such a technology.<\/p>\n<h3>h264\/5 + mp4<\/h3>\n<p><a href=\"http:\/\/www.itu.int\/rec\/T-REC-H.265\" target=\"_blank\" rel=\"noopener\">h265<\/a> (HEVC) is merely an iteration of h264. Because it is relatively new, not all hardware supports this codec (roughly Intel Skylake or newer, AMD Carrizo or newer, AMD Radeon RX 400 series or newer, Nvidia GeForce 900 series or newer).<\/p>\n<p>Unfortunately the mp4 container is not usable for livestreaming because it needs to know the exact length of the video in advance (which is kind of hard for a livestream). <a href=\"http:\/\/www-itec.aau.at\/dash\/\" target=\"_blank\" rel=\"noopener\">MPEG-DASH<\/a>&nbsp;can be used as a workaround for this problem. It basically splits the video of unknown size into many small chunks of known size. This also allows you to change the resolution of the stream during playback (like YouTube does).<\/p>\n<h3>h264\/5 + mkv<\/h3>\n<p>Because of browser limitations this combination cannot be used to stream directly to the user. 
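<\/p>
<p>As a rough sketch of how this combination can be used as an ingest format, an ffmpeg invocation pushing an h264\/Matroska stream to a distribution server could look like the following. The input file, bitrates and server URL are placeholder assumptions for illustration, not a tested production setup:<\/p>

```shell
# Sketch: encode to h264 + AAC and push a Matroska stream to an
# intermediate server over HTTP (URL and input are placeholders).
ffmpeg -re -i input.mp4 \
  -c:v libx264 -preset veryfast -tune zerolatency -b:v 2500k \
  -c:a aac -b:a 128k \
  -f matroska http://streaming-server.example:8080/live
```

<p>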
But if you are using a server to distribute the stream to all the users (what you will most certainly do), you can use this combination to stream against the server as&nbsp;an intermediate step in your lifestreaming pipeline.<br \/>\nThe open-source container <a href=\"https:\/\/www.matroska.org\/technical\/whatis\/index.html\" target=\"_blank\" rel=\"noopener\">matroska<\/a> is probably the best choice when it comes to streaming h264\/5, because unlike mp4 it doesn&#8217;t need to know how long the video is. Since it is supposed to be streamed via HTTP you also won&#8217;t run into any issues with restrictive firewalls.<\/p>\n<p>Because&nbsp;webm is a subset of matroska, you can very simply convert the container formats and transcode the video to vp8\/9 on a server and stream it to your users from there.<\/p>\n<h3>vp8\/9 + webm<\/h3>\n<p>Wanting to avoid license costs for h264\/5, Google implemented the open and royalty free <a href=\"https:\/\/www.webmproject.org\/\" target=\"_blank\" rel=\"noopener\">VP8\/9<\/a> codecs as well as&nbsp;the <a href=\"https:\/\/www.webmproject.org\/\" target=\"_blank\" rel=\"noopener\">webm<\/a> container. This combination shines especially in combination with the <a href=\"https:\/\/www.matroska.org\/technical\/whatis\/index.html\" target=\"_blank\" rel=\"noopener\">HTML5&nbsp;video-tag<\/a>. Being an HTML standard most common browsers (Chrome, Firefox, Edge) implement the codec and container. Building a player is as simple as inserting the video-tag into your page. VP9 offers far better efficiency with better image quality (especially when using low bitrates)&nbsp;than h264 while being resource-friendly on the decoding side. A&nbsp;very good tutorial on setting up VP9&nbsp;can be found in the <a href=\"https:\/\/sites.google.com\/a\/webmproject.org\/wiki\/ffmpeg\/vp9-encoding-guide\" target=\"_blank\" rel=\"noopener\">webm-wiki<\/a>.<\/p>\n<p>A real issue is Apple&#8217;s refusal to support this combination in Safari and on iOS. 
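<\/p>
<p>To make the VP9\/Opus setup concrete, here is a minimal, hedged ffmpeg sketch that encodes a synthetic test source to a webm file. The lavfi inputs, bitrates and speed settings are illustrative assumptions; for real tuning, follow the webm-wiki guide mentioned above:<\/p>

```shell
# Sketch: encode a synthetic test source to VP9 + Opus in a webm container.
# testsrc2/sine are ffmpeg's built-in test generators; -deadline realtime
# and -cpu-used trade quality for encoding speed, as needed for live use.
ffmpeg -f lavfi -i testsrc2=size=1280x720:rate=30 \
       -f lavfi -i sine=frequency=440 \
       -c:v libvpx-vp9 -b:v 1M -deadline realtime -cpu-used 5 \
       -c:a libopus -b:a 96k \
       -t 5 -y out.webm
```

<p>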
While on macOS you can simply switch to another browser, on iOS this combination isn&#8217;t supported in any browser at all.<\/p>\n<h3>Audio<\/h3>\n<p>So far we haven&#8217;t talked about audio a lot. But fortunately this is a very easy choice:<\/p>\n<ul>\n<li>If you are using h264\/5 in an mp4 or rtmp container, you have to use either AAC or mp3.<\/li>\n<li>If you are using vp8\/9, you should use the open-source, royalty-free codec <a href=\"https:\/\/opus-codec.org\/\" target=\"_blank\" rel=\"noopener\">Opus<\/a>. It uses two components to be able to encode all audio situations with great efficiency: <a href=\"https:\/\/tools.ietf.org\/html\/rfc6716#page-8\" target=\"_blank\" rel=\"noopener\">SILK<\/a> is used for speech-oriented audio and <a href=\"https:\/\/tools.ietf.org\/html\/rfc6716#page-8\" target=\"_blank\" rel=\"noopener\">CELT<\/a> for the rest (such as music).<\/li>\n<\/ul>\n<p>These&nbsp;three audio codecs <a href=\"http:\/\/caniuse.com\/#search=audio\" target=\"_blank\" rel=\"noopener\">are supported<\/a> on almost all operating systems and browsers (Safari being once again the great exception). If you want to be 100% sure to cover all platforms, use mp3.<\/p>\n<h2>A video codec to rule them all?<\/h2>\n<p>In the (hopefully) very near future a new video codec will arise and lay waste to all codecs that existed before. Because of compatibility issues, but mainly for legal reasons, the <a href=\"http:\/\/aomedia.org\/\" target=\"_blank\" rel=\"noopener\">Alliance for Open Media<\/a>, including but not limited to Amazon, AMD, ARM, Broadcom, Cisco, Google, Intel, Microsoft, Mozilla, and Netflix (basically all the ginormous&nbsp;IT companies), decided to develop a new codec that will be completely open-source, patent-free and royalty-free: the <a href=\"https:\/\/aomedia.googlesource.com\/aom\/+\/master\/README\" target=\"_blank\" rel=\"noopener\">AV1 codec<\/a>. 
<a href=\"https:\/\/groups.google.com\/a\/webmproject.org\/forum\/#!topic\/codec-devel\/CD921b8brEk\" target=\"_blank\" rel=\"noopener\">Early tests have already shown great efficiency<\/a> and&nbsp;an image quality that may even surpass h265.<\/p>\n<h2>tl;dr<\/h2>\n<p>If you just want a simple and free solution, use&nbsp;the webm container with the vp9 video codec and Opus. Ignore iOS devices, Apple has to learn at some point. Also pray for AV1 to be usable.<\/p>\n<figure style=\"width: 250px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/giphy.com\/gifs\/justin-hope-hoping-fingers-crossed-l0NwNrl4BtDD7JCx2\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/blog.mi.hdm-stuttgart.de\/wp-content\/uploads\/2023\/08\/giphy.gif\" alt=\"\" width=\"250\" height=\"141\"><\/a><figcaption class=\"wp-caption-text\"><a href=\"https:\/\/giphy.com\/gifs\/justin-hope-hoping-fingers-crossed-l0NwNrl4BtDD7JCx2\" target=\"_blank\" rel=\"noopener\">https:\/\/giphy.com\/gifs\/justin-hope-hoping-fingers-crossed-l0NwNrl4BtDD7JCx2<\/a> via <a href=\"https:\/\/giphy.com\/\" target=\"_blank\" rel=\"noopener\">GIPHY<\/a><\/figcaption><\/figure>\n<p>&nbsp;<\/p>\n<h5>Image sources:<\/h5>\n<ul>\n<li>title image: <a href=\"https:\/\/commons.wikimedia.org\/wiki\/File:Green_screen_live_streaming_production_at_Mediehuset_K%C3%B8benhavn.jpg\" target=\"_blank\" rel=\"noopener\">https:\/\/commons.wikimedia.org\/wiki\/File:Green_screen_live_streaming_production_at_Mediehuset_K%C3%B8benhavn.jpg<\/a>, Author: Rehak<\/li>\n<li>gif:<br \/>\n<a href=\"https:\/\/giphy.com\/gifs\/justin-hope-hoping-fingers-crossed-l0NwNrl4BtDD7JCx2\" target=\"_blank\" rel=\"noopener\">https:\/\/giphy.com\/gifs\/justin-hope-hoping-fingers-crossed-l0NwNrl4BtDD7JCx2<\/a>, via <a href=\"https:\/\/giphy.com\/\" target=\"_blank\" rel=\"noopener\">GIPHY<\/a><\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>Lifestreaming is the real deal of video today, however&nbsp;there aren&#8217;t that many content 
creation tools to choose from.&nbsp;YouTube, Facebook and Twitter are pushing hard to enable their users to stream vlogging-style content live from their phones with proprietary Apps, and OBS is used for Let&#8217;s Plays and Twitch streams. But when you want to stream [&hellip;]<\/p>\n","protected":false},"author":5,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[649,653,22,657],"tags":[4,102,103,104],"ppma_author":[681],"class_list":["post-2120","post","type-post","status-publish","format-standard","hentry","category-interactive-media","category-mobile-apps","category-student-projects","category-teaching-and-learning","tag-linux","tag-livestreaming","tag-streaming","tag-video"],"aioseo_notices":[],"jetpack_featured_media_url":"","jetpack-related-posts":[{"id":3114,"url":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/2017\/09\/01\/sport-data-stream-processing-on-ibm-bluemix-real-time-stream-processing-basics\/","url_meta":{"origin":2120,"position":0},"title":"Sport data stream processing on IBM Bluemix:  Real Time Stream Processing Basics","author":"nk065@hdm-stuttgart.de","date":"1. September 2017","format":false,"excerpt":"New data is created every second. Just on Google the humans preform 40,000 search queries every second. By 2020 Forbes estimate 1.7 megabytes of new information will be created every second for every human on our planet. 
However, it is about collecting and exchanging data, which then can be used\u2026","rel":"","context":"In &quot;Allgemein&quot;","block_context":{"text":"Allgemein","link":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/category\/allgemein\/"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/blog.mi.hdm-stuttgart.de\/wp-content\/uploads\/2017\/09\/Real-Time-Stream-Processing-Basics_6.png?resize=350%2C200&ssl=1","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/blog.mi.hdm-stuttgart.de\/wp-content\/uploads\/2017\/09\/Real-Time-Stream-Processing-Basics_6.png?resize=350%2C200&ssl=1 1x, https:\/\/i0.wp.com\/blog.mi.hdm-stuttgart.de\/wp-content\/uploads\/2017\/09\/Real-Time-Stream-Processing-Basics_6.png?resize=525%2C300&ssl=1 1.5x"},"classes":[]},{"id":2179,"url":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/2018\/03\/21\/livestreaming-with-libav-tutorial-part-2\/","url_meta":{"origin":2120,"position":1},"title":"Livestreaming with libav* &#8211; Tutorial (Part 2)","author":"Benjamin Binder","date":"21. March 2018","format":false,"excerpt":"If\u00a0you want to create videos\u00a0using FFmpeg\u00a0there is a basic\u00a0pipeline setup to go with. We will first take a short overview over this pipeline and then\u00a0focus on each individual section. The basic pipeline I'm assuming you have already captured your video\/audio data. 
Since this step is highly platform dependent it will\u2026","rel":"","context":"In &quot;Student Projects&quot;","block_context":{"text":"Student Projects","link":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/category\/student-projects\/"},"img":{"alt_text":"Green screen live streaming production at Mediehuset K\u00f8benhavn","src":"https:\/\/upload.wikimedia.org\/wikipedia\/commons\/a\/af\/Green_screen_live_streaming_production_at_Mediehuset_K%C3%B8benhavn.jpg","width":350,"height":200,"srcset":"https:\/\/upload.wikimedia.org\/wikipedia\/commons\/a\/af\/Green_screen_live_streaming_production_at_Mediehuset_K%C3%B8benhavn.jpg 1x, https:\/\/upload.wikimedia.org\/wikipedia\/commons\/a\/af\/Green_screen_live_streaming_production_at_Mediehuset_K%C3%B8benhavn.jpg 1.5x, https:\/\/upload.wikimedia.org\/wikipedia\/commons\/a\/af\/Green_screen_live_streaming_production_at_Mediehuset_K%C3%B8benhavn.jpg 2x, https:\/\/upload.wikimedia.org\/wikipedia\/commons\/a\/af\/Green_screen_live_streaming_production_at_Mediehuset_K%C3%B8benhavn.jpg 3x, https:\/\/upload.wikimedia.org\/wikipedia\/commons\/a\/af\/Green_screen_live_streaming_production_at_Mediehuset_K%C3%B8benhavn.jpg 4x"},"classes":[]},{"id":10289,"url":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/2020\/03\/09\/distributed-stream-processing-frameworks-what-they-are-and-how-they-perform\/","url_meta":{"origin":2120,"position":2},"title":"Distributed stream processing frameworks &#8211; what they are and how they perform","author":"Alexander Merker","date":"9. 
March 2020","format":false,"excerpt":"An overview on stream processing, common frameworks as well as some insights on performance based on benchmarking data","rel":"","context":"In &quot;Allgemein&quot;","block_context":{"text":"Allgemein","link":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/category\/allgemein\/"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/blog.mi.hdm-stuttgart.de\/wp-content\/uploads\/2023\/08\/storm_arch.png?resize=350%2C200&ssl=1","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/blog.mi.hdm-stuttgart.de\/wp-content\/uploads\/2023\/08\/storm_arch.png?resize=350%2C200&ssl=1 1x, https:\/\/i0.wp.com\/blog.mi.hdm-stuttgart.de\/wp-content\/uploads\/2023\/08\/storm_arch.png?resize=525%2C300&ssl=1 1.5x"},"classes":[]},{"id":4164,"url":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/2018\/08\/31\/tweets-by-donnie-building-a-serverless-sentiment-analysis-application-with-the-twitter-streaming-api-lambda-and-kinesis\/","url_meta":{"origin":2120,"position":3},"title":"Tweets by Donnie\u200a-\u200aBuilding a serverless sentiment analysis application with the twitter streaming API,  Lambda and Kinesis","author":"dr053","date":"31. August 2018","format":false,"excerpt":"tweets-by-donnie dashboard \u00a0 Thinking of Trumps tweets it's pretty obvious that they are controversial. Trying to gain insights of how controversial his tweets really are, we created tweets-by-donnie. \u201cIt\u2019s freezing and snowing in New York\u200a\u2014\u200awe need global warming!\u201d Donald J. 
Trump You decide if it\u2019s meant as a joke or\u2026","rel":"","context":"In &quot;Cloud Technologies&quot;","block_context":{"text":"Cloud Technologies","link":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/category\/scalable-systems\/cloud-technologies\/"},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":24588,"url":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/2023\/05\/12\/cloud-gaming-quality-factors\/","url_meta":{"origin":2120,"position":4},"title":"Evaluating Cloud Gaming Services: Uncovering Key Quality Factors with Engaging Examples","author":"Milos Aleksic","date":"12. May 2023","format":false,"excerpt":"Cloud Gaming, Source: Ajjan (2019) Introduction Cloud gaming services have gained significant traction in recent years. They allow users to play high-quality games without needing powerful hardware. This technology revolutionizes the gaming industry by enabling gamers to stream games on-demand, regardless of their device's capabilities. One key benefit of cloud\u2026","rel":"","context":"In &quot;Allgemein&quot;","block_context":{"text":"Allgemein","link":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/category\/allgemein\/"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/blog.mi.hdm-stuttgart.de\/wp-content\/uploads\/2023\/05\/Bildschirmfoto-2023-05-12-um-13.07.19.png?resize=350%2C200&ssl=1","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/blog.mi.hdm-stuttgart.de\/wp-content\/uploads\/2023\/05\/Bildschirmfoto-2023-05-12-um-13.07.19.png?resize=350%2C200&ssl=1 1x, https:\/\/i0.wp.com\/blog.mi.hdm-stuttgart.de\/wp-content\/uploads\/2023\/05\/Bildschirmfoto-2023-05-12-um-13.07.19.png?resize=525%2C300&ssl=1 1.5x, https:\/\/i0.wp.com\/blog.mi.hdm-stuttgart.de\/wp-content\/uploads\/2023\/05\/Bildschirmfoto-2023-05-12-um-13.07.19.png?resize=700%2C400&ssl=1 2x, https:\/\/i0.wp.com\/blog.mi.hdm-stuttgart.de\/wp-content\/uploads\/2023\/05\/Bildschirmfoto-2023-05-12-um-13.07.19.png?resize=1050%2C600&ssl=1 
3x"},"classes":[]},{"id":10318,"url":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/2020\/04\/13\/open-source-batch-and-stream-processing-realtime-analysis-of-big-data\/","url_meta":{"origin":2120,"position":5},"title":"Open Source Batch and Stream Processing: Realtime Analysis of Big Data","author":"Marcel Stolin","date":"13. April 2020","format":false,"excerpt":"Abstract Since the beginning of Big Data, batch processing was the most popular choice for processing large amounts of generated data. These existing processing technologies are not suitable to process the large amount of data we face today. Research works developed a variety of technologies that focus on stream processing.\u2026","rel":"","context":"In &quot;Allgemein&quot;","block_context":{"text":"Allgemein","link":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/category\/allgemein\/"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/blog.mi.hdm-stuttgart.de\/wp-content\/uploads\/2023\/08\/mapreduce.jpg?resize=350%2C200&ssl=1","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/blog.mi.hdm-stuttgart.de\/wp-content\/uploads\/2023\/08\/mapreduce.jpg?resize=350%2C200&ssl=1 1x, https:\/\/i0.wp.com\/blog.mi.hdm-stuttgart.de\/wp-content\/uploads\/2023\/08\/mapreduce.jpg?resize=525%2C300&ssl=1 1.5x"},"classes":[]}],"jetpack_sharing_enabled":true,"authors":[{"term_id":681,"user_id":5,"is_guest":0,"slug":"bb074","display_name":"Benjamin 
Binder","avatar_url":"https:\/\/secure.gravatar.com\/avatar\/b39750be005f19ce71d3af93115f9d5f02d18769c36bfa750ca4de423b20d981?s=96&d=mm&r=g","0":null,"1":"","2":"","3":"","4":"","5":"","6":"","7":"","8":""}],"_links":{"self":[{"href":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/wp-json\/wp\/v2\/posts\/2120","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/wp-json\/wp\/v2\/comments?post=2120"}],"version-history":[{"count":42,"href":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/wp-json\/wp\/v2\/posts\/2120\/revisions"}],"predecessor-version":[{"id":25497,"href":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/wp-json\/wp\/v2\/posts\/2120\/revisions\/25497"}],"wp:attachment":[{"href":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/wp-json\/wp\/v2\/media?parent=2120"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/wp-json\/wp\/v2\/categories?post=2120"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/wp-json\/wp\/v2\/tags?post=2120"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/blog.mi.hdm-stuttgart.de\/index.php\/wp-json\/wp\/v2\/ppma_author?post=2120"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}