#alexnet

kcnickerson<p>AI&#39;s &quot;Back to the Future&quot;: the open-source code of AlexNet, the neural network that started a trillion-dollar geopolitical race =O @geoffreyhinton <a href="https://mastodon.social/tags/ai" class="mention hashtag" rel="tag">#<span>ai</span></a> <a href="https://mastodon.social/tags/alexnet" class="mention hashtag" rel="tag">#<span>alexnet</span></a> <a href="https://mastodon.social/tags/code" class="mention hashtag" rel="tag">#<span>code</span></a> <a href="https://github.com/computerhistory/AlexNet-Source-Code" target="_blank" rel="nofollow noopener" translate="no"><span class="invisible">https://</span><span class="ellipsis">github.com/computerhistory/Ale</span><span class="invisible">xNet-Source-Code</span></a></p>
Habr<p>The code that changed everything: the story of AlexNet and its legacy</p><p>In March 2025, the Computer History Museum, together with Google, published the source code of AlexNet, the neural network that in 2012 drew the world's attention to the capabilities of deep learning. Researchers and enthusiasts now have the full source of a model that became one of the key milestones in the development of computer vision. Why is this network so important to the industry, and why is its contribution considered so significant? Let's take a look.</p><p><a href="https://habr.com/ru/companies/ru_mts/articles/896478/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">habr.com/ru/companies/ru_mts/a</span><span class="invisible">rticles/896478/</span></a></p><p><a href="https://zhub.link/tags/%D0%BD%D0%B5%D0%B9%D1%80%D0%BE%D1%81%D0%B5%D1%82%D0%B8" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>нейросети</span></a> <a href="https://zhub.link/tags/alexnet" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>alexnet</span></a> <a href="https://zhub.link/tags/%D0%B8%D1%81%D0%BA%D1%83%D1%81%D1%81%D1%82%D0%B2%D0%B5%D0%BD%D0%BD%D1%8B%D0%B9_%D0%B8%D0%BD%D1%82%D0%B5%D0%BB%D0%BB%D0%B5%D0%BA%D1%82" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>искусственный_интеллект</span></a> <a href="https://zhub.link/tags/ai" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ai</span></a> <a href="https://zhub.link/tags/%D0%BC%D0%B0%D1%88%D0%B8%D0%BD%D0%BD%D0%BE%D0%B5_%D0%BE%D0%B1%D1%83%D1%87%D0%B5%D0%BD%D0%B8%D0%B5" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>машинное_обучение</span></a> <a href="https://zhub.link/tags/%D0%B2%D1%8B%D1%81%D0%BE%D0%BA%D0%B0%D1%8F_%D0%BF%D1%80%D0%BE%D0%B8%D0%B7%D0%B2%D0%BE%D0%B4%D0%B8%D1%82%D0%B5%D0%BB%D1%8C%D0%BD%D0%BE%D1%81%D1%82%D1%8C" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>высокая_производительность</span></a></p>
ResearchBuzz: Firehose<p>Ars Technica: You can now download the source code that sparked the AI boom. “On Thursday, Google and the Computer History Museum (CHM) jointly released the source code for AlexNet, the convolutional neural network (CNN) that many credit with transforming the AI field in 2012 by proving that ‘deep learning’ could achieve things conventional AI techniques could not.”</p><p><a href="https://rbfirehose.com/2025/03/25/ars-technica-you-can-now-download-the-source-code-that-sparked-the-ai-boom/" class="" rel="nofollow noopener" target="_blank">https://rbfirehose.com/2025/03/25/ars-technica-you-can-now-download-the-source-code-that-sparked-the-ai-boom/</a></p>
WinFuture.de<p>The source code of AlexNet, the spark of the modern AI boom, is now available as open source. A look back at the beginnings of the 2012 AI revolution. <a href="https://mastodon.social/tags/AlexNet" class="mention hashtag" rel="tag">#<span>AlexNet</span></a> <a href="https://mastodon.social/tags/KI" class="mention hashtag" rel="tag">#<span>KI</span></a> <a href="https://winfuture.de/news,149842.html?utm_source=Mastodon&amp;utm_medium=ManualStatus&amp;utm_campaign=SocialMedia" target="_blank" rel="nofollow noopener" translate="no"><span class="invisible">https://</span><span class="ellipsis">winfuture.de/news,149842.html?</span><span class="invisible">utm_source=Mastodon&amp;utm_medium=ManualStatus&amp;utm_campaign=SocialMedia</span></a></p>
Benjamin Carr, Ph.D. 👨🏻‍💻🧬<p>How a stubborn <a href="https://hachyderm.io/tags/computerscientist" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>computerscientist</span></a> accidentally launched the <a href="https://hachyderm.io/tags/deeplearning" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>deeplearning</span></a> boom<br>"You’ve taken this idea way too far," a mentor told Prof. Fei-Fei Li, who was creating a new image <a href="https://hachyderm.io/tags/dataset" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>dataset</span></a> that would be far larger than any that had come before: 14 million images, each labeled with one of nearly 22,000 categories. Then in 2012, a team from the University of Toronto trained a <a href="https://hachyderm.io/tags/neural" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>neural</span></a> network on <a href="https://hachyderm.io/tags/ImageNet" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ImageNet</span></a>, achieving unprecedented performance in image recognition, dubbed <a href="https://hachyderm.io/tags/AlexNet" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AlexNet</span></a>.<br><a href="https://arstechnica.com/ai/2024/11/how-a-stubborn-computer-scientist-accidentally-launched-the-deep-learning-boom/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">arstechnica.com/ai/2024/11/how</span><span class="invisible">-a-stubborn-computer-scientist-accidentally-launched-the-deep-learning-boom/</span></a> <a href="https://hachyderm.io/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a></p>
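For readers wondering what "a neural network trained on ImageNet" means concretely: AlexNet stacks five convolutional stages before its fully connected layers. A minimal sketch (not taken from the released source code; the layer parameters below follow the 2012 AlexNet paper) showing how each stage shrinks the spatial resolution:

```python
# Hypothetical illustration: spatial sizes through AlexNet's conv/pool stages,
# using the standard output-size formula for convolutions and max-pooling.

def out_size(size: int, kernel: int, stride: int = 1, pad: int = 0) -> int:
    """Output spatial dimension of a conv/pool layer."""
    return (size + 2 * pad - kernel) // stride + 1

size = 227                               # effective input resolution in the paper
size = out_size(size, 11, stride=4)      # conv1 (96 filters)   -> 55x55
size = out_size(size, 3, stride=2)       # maxpool1             -> 27x27
size = out_size(size, 5, pad=2)          # conv2 (256 filters)  -> 27x27
size = out_size(size, 3, stride=2)       # maxpool2             -> 13x13
size = out_size(size, 3, pad=1)          # conv3 (384 filters)  -> 13x13
size = out_size(size, 3, pad=1)          # conv4 (384 filters)  -> 13x13
size = out_size(size, 3, pad=1)          # conv5 (256 filters)  -> 13x13
size = out_size(size, 3, stride=2)       # maxpool5             -> 6x6
flattened = 256 * size * size            # features fed to the first FC layer
print(size, flattened)                   # 6 9216
```

The 9216-dimensional flattened output then passes through two 4096-unit fully connected layers and a 1000-way softmax, one unit per ImageNet class.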
SecurityLab<p>Experts have learned to identify individual aircraft by the RF fingerprints of their ADS-B transponder signals <a href="https://phreedom.tk/tags/%D1%80%D0%B0%D0%B4%D0%B8%D0%BE" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>радио</span></a>, <a href="https://phreedom.tk/tags/%D1%81%D0%B0%D0%BC%D0%BE%D0%BB%D0%B5%D1%82" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>самолет</span></a>, <a href="https://phreedom.tk/tags/%D0%90%D0%97%D0%9D" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>АЗН</span></a>-В, <a href="https://phreedom.tk/tags/RTL" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>RTL</span></a>-SDR, <a href="https://phreedom.tk/tags/Alexnet" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Alexnet</span></a>, <a href="https://phreedom.tk/tags/GoogLeNet" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>GoogLeNet</span></a> <a href="https://www.securitylab.ru/news/514039.php" rel="nofollow noopener" target="_blank"><span class="invisible">https://www.</span><span class="">securitylab.ru/news/514039.php</span><span class="invisible"></span></a> <a href="https://twitter.com/SecurityLabnews/status/1326898806168891397/photo/1" rel="nofollow noopener" target="_blank"><span class="invisible">https://</span><span class="ellipsis">twitter.com/SecurityLabnews/st</span><span class="invisible">atus/1326898806168891397/photo/1</span></a></p>