Apple's Intelligence 🍎, Quantum Science Centenary 🔬, 100x Any CPU 💻

Today's TechToks: Apple's groundbreaking WWDC24, parallel processing unit chips for CPUs, OpenAI CTO Mira Murati's response to Elon Musk, standout GitHub repos, and Product Picks. Dive in for more!

Quick Note

Help us reach more people and get your shout-out in our newsletter 🤩!

Copy and share your unique referral link:
https://newsletter.techtok.today/subscribe?ref=PLACEHOLDER

Each week we spend several hours curating and crafting the best of new products, trending repos, tech news, and engineering blog posts. Share TechTok and help others who want to stay up to date with the latest in tech.

Today’s Summaries

See all as stories on techtok.today

🍎 Apple WWDC Highlights: Apple Intelligence, big tech partnerships and OS updates

⚗️ From CERN: International Year of Quantum Science Celebration

🚄 Flow Computing: Proprietary chip could make CPUs 100x faster

📦 From AWS Engineering Blog: Your AI is only as good as your data

🔥 OpenAI fires back at Musk following his criticism of the iOS announcement

Product Picks

🐕 Fetch.ai: An open platform for the new AI economy. Monetise your own AI agents.

🦾 TeamCreate: AI workers for hundreds of roles in sales, finance & more

🎬 Zeacon: The 24/7 video marketer

GitHub Trending Repositories

💫 Transformers.js: State-of-the-art Machine Learning for the web. Run Hugging Face Transformers directly in your browser, with no need for a server!

🤏 nanoGPT: Simple and fast repository for training/finetuning medium-sized GPTs.

☁️ public-image-mirror: A mirror of public Docker images

🗣️ mi-gpt: Connect Xiaomi smart speaker to ChatGPT and Doubao

🤯 Game Changers

The most impactful articles of the day

Apple kicked off its 2024 Worldwide Developers Conference by unveiling Apple Intelligence, a personal intelligence system that combines generative models with user context, and by announcing LLM integration options with big tech partners, most notably an unexpected one with OpenAI (with Google expected to follow). Major updates were announced for iOS 18, iPadOS 18, macOS Sequoia, watchOS 11, and visionOS 2, including redesigns, productivity tools, and new ways to interact with Apple Vision Pro.

"Too long don't README"

GitHub trending repos of today

nanoGPT is a simple and fast repository for training and fine-tuning medium-sized GPT models, including reproducing GPT-2 (124M) on the OpenWebText dataset in around 4 days on a single 8XA100 40GB node. The code is highly readable and easy to hack, allowing users to train new models from scratch or fine-tune pre-trained checkpoints. What you can do with it: train character-level GPT models on small datasets like Shakespeare, fine-tune pre-trained GPT-2 models, or reproduce the GPT-2 (124M) model on larger datasets like OpenWebText.
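
For a concrete feel of the character-level workflow, here is a minimal Python sketch in the spirit of nanoGPT's Shakespeare data prep (the file names and 90/10 split are assumptions for illustration; the repo's own prepare.py scripts are the authoritative version):

```python
import numpy as np

# Build a character-level vocabulary, encode the corpus to integer ids,
# and write train/val splits as uint16 binaries a training loop can memmap.
# "input.txt" and the 90/10 split are illustrative assumptions.
with open("input.txt", "r", encoding="utf-8") as f:
    text = f.read()

chars = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(chars)}  # char -> id
itos = {i: ch for ch, i in stoi.items()}      # id -> char

def encode(s: str) -> list[int]:
    return [stoi[c] for c in s]

def decode(ids: list[int]) -> str:
    return "".join(itos[i] for i in ids)

split = int(len(text) * 0.9)
train_ids = np.array(encode(text[:split]), dtype=np.uint16)
val_ids = np.array(encode(text[split:]), dtype=np.uint16)

train_ids.tofile("train.bin")
val_ids.tofile("val.bin")
print(f"vocab size: {len(chars)}, train tokens: {len(train_ids)}, val tokens: {len(val_ids)}")
```

From there, nanoGPT's training script reads such binaries with a memory-mapped dataloader; fine-tuning simply starts from a pre-trained GPT-2 checkpoint instead of random weights.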

This repository provides a simple and effective way to accelerate downloads of public Docker images hosted on registries such as gcr.io by mirroring them on DaoCloud servers. It offers easy addition of new images, real-time updates, and several usage methods, including prefix replacement and lazy loading, along with best practices for using the mirrored images with kubeadm, kind, and Ingress deployments.
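
The "prefix replacement" usage is essentially a string rewrite of the image reference: prepend the mirror host to the original registry path and pull through that. A minimal Python sketch, assuming a mirror host such as m.daocloud.io (check the repository's README for the actual endpoints and supported registries):

```python
# Sketch of prefix replacement: pull the same image through the mirror by
# prepending the mirror host to the original reference.
# MIRROR_HOST is an assumption for illustration; the repo documents the real endpoints.
MIRROR_HOST = "m.daocloud.io"

def mirrored(image_ref: str) -> str:
    """e.g. 'registry.k8s.io/pause:3.9' -> 'm.daocloud.io/registry.k8s.io/pause:3.9'"""
    return f"{MIRROR_HOST}/{image_ref}"

print(mirrored("registry.k8s.io/pause:3.9"))
# The rewritten reference is what you pass to docker pull, kubeadm, or kind configs.
```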

MiGPT is a project that connects the Xiaomi smart speaker (Xiaoai) to ChatGPT's language understanding, creating an intelligent home assistant that responds to your commands and provides a personalized smart home experience. It supports AI-powered voice interactions, role-playing, customizable text-to-speech, and integration with smart home devices, making your home more intelligent and responsive to your needs.

Transformers.js is a JavaScript library that allows you to run state-of-the-art machine learning models directly in the browser, without the need for a server. It supports various tasks such as natural language processing, computer vision, audio, and multimodal applications, and provides a similar API to the Hugging Face Transformers Python library. The library uses ONNX Runtime to run the models efficiently in the browser, and you can easily convert your pretrained PyTorch, TensorFlow, or JAX models to ONNX format using the Hugging Face Optimum library.
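
For the Python side of that last step, here is a hedged sketch of exporting a checkpoint to ONNX with Hugging Face Optimum and running it through ONNX Runtime, which is roughly the format Transformers.js consumes in the browser (the model id is only an example, and the export API may differ between Optimum versions):

```python
# Sketch: export a Hugging Face model to ONNX via Optimum and run it with ONNX Runtime.
# The model id is an example; the `export=True` argument may vary across Optimum versions.
from optimum.onnxruntime import ORTModelForSequenceClassification
from transformers import AutoTokenizer, pipeline

model_id = "distilbert-base-uncased-finetuned-sst-2-english"
model = ORTModelForSequenceClassification.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("Running Transformers directly in the browser is great!"))

# model.save_pretrained("onnx/") writes the .onnx weights that Transformers.js can load.
```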

📱 Product Picks

Curated products from Product Hunt

Create multi-function AI workers for hundreds of roles in Finance, Sales, Product, Marketing, and more. Deploy from Slack, give them an email and secure access to 200+ apps. Assign tasks from Trello, Jira, and more. $50 FREE credit—no credit card required.

Zeacon hosts, organizes, and analyzes interactive videos for your website. As an integrated team member, the AI Marketer is continuously learning to attract, engage and convert more website visitors so you don't have to.

Build, deploy, and monetize AI services on a decentralized, open-source platform combining blockchain with AI. Transform your legacy systems to be AI-ready without changing your existing APIs, and let your users access multiple services from a single prompt with DeltaV.

🧐 Daily Picks

Curated picks and most shared articles on techtok.today

It's been 100 years since a milestone for quantum science: the insight that matter can behave as both a particle and a wave. That idea underpinned the birth of modern quantum mechanics around 1925 and earned Louis de Broglie a Nobel Prize for his discovery of the wave nature of electrons. In celebration, the United Nations declared 2025 the International Year of Quantum Science and Technology.

OpenAI's Mira Murati defends the company's partnership with Apple, pushing back on Elon Musk's characterization of it as 'creepy spyware'. After Apple announced ChatGPT integration for the iPhone 15 Pro and Pro Max, Musk tweeted that he would ban Apple devices from his companies, among other criticisms. In response, Murati emphasized OpenAI's focus on user privacy and transparency, naming the biggest risk as "that stakeholders misunderstand the technology."

Flow Computing, a Finnish startup, claims to have developed a parallel processing unit (PPU) that can boost CPU performance by up to 100x. It could double the performance of existing computing code overnight, without any optimization by programmers. While the claims are bold, Flow Computing has not yet built a chip and is seeking partnerships with major chipmakers to bring its patented techniques to market.

Engineering Blogs

Articles from engineering blogs of big tech companies

A study found that 93% of Chief Data Officers agree that data strategy is crucial for getting value from generative AI, yet 57% have yet to build the necessary strategy. This AWS article explains how generative AI's effectiveness is deeply rooted in the quality, diversity, and robustness of the training data it leverages. Tom Godden recommends the following principles: Treat Data as a Product; Curate Diverse Datasets; Govern by Enabling, Not Restricting; Documentation that Empowers; Ensure Data Quality; and Respect Privacy, Consent, and Confidentiality.