You are viewing a single comment's thread from:

RE: LeoThread 2024-08-29 05:02

in LeoFinance · 4 months ago

Here is the daily technology #threadcast for 8/29/24. We aim to educate people about this crucial area while providing information about what is taking place.

Drop all questions, comments, and articles relating to #technology and the future. The goal is to make it a technology center.


I recently discovered a website that can be used to create apps without developer support. #technology

In the world of technology, constant innovation is key to keeping pace with the growing demand for processing and efficiency. One of the latest revolutions in the microprocessor industry is the adoption of chiplets #leofinance #pepe

That is true. Things are moving quickly.

The basics of the technology of raising pigs for meat, lard and bacon have not changed over many centuries; only the diet has changed slightly, taking chemistry into account, and the stalls have become more technologically advanced :) !BEER

CNBC -- How The Massive Power Draw Of Generative AI Is Overtaxing Our Grid #technology #ai

The Guardian: Nvidia shares fall after investors spooked by slowing growth

Doubling of quarterly revenues fails to allay concerns about production delays to its next generation of AI chips



https://www.theguardian.com/technology/article/2024/aug/29/nvidia-shares-fall-after-investors-spooked-by-slowing-growth-delays-next-generation-ai-chips

OpenAI’s Converge 2 program has been shrouded in mystery

For months, a tech forum ran wild asking if the Converge 2 accelerator program actually happened.

Last December, OpenAI announced it was opening applications for its second cohort of a six-week accelerator program it had kicked off in 2022. Then things grew very quiet. But OpenAI’s Converge 2 startup accelerator program did indeed take place, according to two sources familiar with the program. TechCrunch also confirmed that the OpenAI Startup Fund has funded startups in the cohort.

#ai #openai #technology #newsonleo

Converge 2, as that second accelerator program was called, has been an odd exception to what is normally a mundane bit of PR in the tech industry. Typically an accelerator program announces the startups accepted into, or graduated from, its programs. After all, once it invests, it has every reason to boost those startups’ chances for success with its public stamp of approval.

But for months, a tech forum ran wild, asking if the Converge 2 accelerator program actually happened. Nobody on that forum, which included people who said they applied, heard anything. No one on the forum posted about getting in or even receiving rejection notices. And others in the Valley weren’t hearing about the program either, despite how high profile OpenAI is, various sources told TechCrunch.

This silence was odd because it was not how the first program worked. OpenAI held Converge I in the typical accelerator fashion in early 2023. It called for applications and then publicly announced its first four investments from the overall OpenAI Startup Fund. In December 2023, the OpenAI Startup Fund announced it would start accepting applications for the second cohort of its accelerator program, Converge 2, expected to begin in March 2024 and conclude by April.

But then, silence. There was no press release from OpenAI about the companies that earned investments. And after multiple requests for comment over months, OpenAI refused to even confirm that the program took place. Founders took to pondering online — and messaging us — about it.

OpenAI reportedly in talks to close a new funding round at $100B+ valuation

The Wall Street Journal reports that OpenAI is close to closing a funding tranche that would value the startup at over $100 billion.

ChatGPT creator OpenAI is reportedly in talks to raise a massive tranche of cash led by previous backer Thrive Capital, at a valuation of more than $100 billion. That’s substantially higher than OpenAI’s previous $86 billion valuation, and by far the highest valuation of any AI startup to date.

#openai #funding #technology #newsonleo

Thrive will reportedly throw in around $1 billion for its part. Microsoft is also expected to participate, according to the WSJ, which has the story. It’s unclear who else would join the round. Khosla Ventures, Infosys and Y Combinator are among OpenAI’s other existing backers.

The financing is poised to serve as OpenAI’s biggest outside infusion of capital since January 2023, when Microsoft invested nearly $10 billion.

OpenAI needs the money. The WSJ says its annualized revenue eclipsed $3.4 billion early this year. The Information says OpenAI is on track to lose nearly $5 billion by year end, however, and that it has already burned through $8.5 billion on AI training and staffing.

Apple's new iOS developer beta lets you remove objects from pictures using AI

Apple has added yet more AI features in its latest developer betas for iOS 18.1, and this time we're getting the ability to remove objects from photos.

Apple has added yet more AI features in its latest developer betas for iOS 18.1, iPadOS 18.1, and macOS 15.1, and this time we’re getting the ability to remove objects from photos.

The feature, called Clean Up, lets users identify and remove objects from pictures without affecting the rest of the photo. Users can select an object using smart detection and remove it with just one tap. You can also draw a circle or brush over any unwanted objects to delete them from the image.

#apple #newsonleo #technology #ai

According to Apple, the feature uses AI to generate the background when you remove an object from an image. Apple said the system even understands shadows or reflections of objects and makes sure to remove them when filling in the background.

Google earlier this year made a similar feature, Magic Eraser, available for free in Google Photos.

Apple has slowly been testing its initial set of Apple Intelligence features with the iOS 18.1 developer betas since July. These features include writing tools; notification summaries for SMS and Mail; natural language search in Photos; transcription for calls and voice recordings in Notes; and a smart replies feature in Mail. Apple Intelligence is only available to users in English who have their region selected as the U.S.

France formally charges Telegram founder, Pavel Durov, over organized crime on messaging app

After spending four days in police custody, the founder and CEO of messaging app Telegram, Pavel Durov, was put under formal investigation in France on Thursday for a wide range of criminal charges.

He was also released from custody on the condition that he does not leave French territory during the investigation. Bail was set at €5 million (around $5.6 million) and he must check in at a police station twice a week.

#newsonleo #france #telegram #technology

On Tuesday, the court in charge of the investigation — the Paris criminal court — shared a list of charges that led to Durov’s arrest on August 24 after he stepped off his private jet at France’s Le Bourget Airport.

The main charges are accusations of running a company that has been complicit in storing and distributing child sexual abuse material (CSAM), facilitating drug trafficking and facilitating organized fraud and other illegal transactions. He’s also facing charges concerning registration of Telegram’s cryptographic features, among other criminal complaints.

In a statement, Paris prosecutor Laure Beccuau confirmed all the charges and said the investigation is moving forward. Being placed under formal investigation means Durov is formally suspected of the offenses he has been accused of.

In the French judicial system, being placed under formal investigation is a necessary procedural step to take a case to trial. However, it does not mean a trial is certain. If authorities decide they don’t have a strong enough case, the investigation could be shelved before it gets to court. But the step means there’s a substantial reason to pursue a probe.

AI dominates Reliance's annual general meeting

Reliance Industries, India's largest company by market capitalization, is not sitting out the AI frenzy that has gripped the tech world.

Reliance Industries, India’s largest company by market capitalization, is not ignoring the AI frenzy that has gripped the tech world.

At the company’s 47th annual general meeting on Thursday, Reliance chairman Mukesh Ambani and other executives mentioned AI nearly three dozen times within two hours, detailing plans to integrate the technology across the company’s businesses in telecommunications, retail and energy.

#newsonleo #reliance #india #technology

Reliance also said it is planning to set up large-scale, AI-ready data centers in Jamnagar, in the western state of Gujarat, that will be powered by the company’s green energy resources. Ambani claimed these facilities would enable lower AI inferencing costs in India, potentially making AI applications more affordable. However, the company was short on details about the feasibility of its plans and the timeline for achieving these goals.

Reliance’s telecom arm, Jio, is developing a “comprehensive” AI suite called “Jio Brain” to accelerate AI adoption across its operations and other Reliance companies. The firm is partnering with Jio Institute to develop an AI program aimed at cultivating AI talent in India.

Reliance plans to offer Jio users up to 100 GB of free cloud storage through its Jio AI-Cloud service, set to launch around Diwali in October, Ambani said. During the presentation, the company also showed off JioPhonecall AI, a call recording and transcription service, and introduced Jio TvOS, an operating system for its set-top boxes that will feature an AI voice assistant.

India has lagged other countries in the AI race, but the last year has seen a flurry of activity from conglomerates like Reliance and Tata Group (both of which have partnered with Nvidia) as well as startups pushing to develop AI apps and infrastructure.

Delivery Hero shares pop 10% as European food delivery giant plans IPO of Middle East unit

Delivery Hero CEO Niklas Ostberg said the IPO of its Middle East business was about bringing in local investors in the region to co-invest and support the unit.

European food delivery firm Delivery Hero says it plans to spin off its Middle East business Talabat and float it on the Dubai stock exchange later this year.

In a brief statement Thursday, Delivery Hero said that it is "preparing a listing of its Talabat business on the Dubai Financial Market" in the fourth quarter.

#technology #deliveryhero #ipo #newsonleo

"A listing may be pursued through a secondary sale of shares by Delivery Hero which would retain the majority interest in the local listing entity after an IPO," Delivery Hero said in a statement.

Shares of Delivery Hero jumped nearly 10% Thursday on the back of the Talabat IPO news as of 7:30 a.m. ET, extending gains from earlier in the session amid a broader uptick in European shares.

The company did not disclose a valuation, share price target, volume of shares that it will list, or a specific timeline beyond the fourth-quarter indication.

The potential Talabat IPO remains subject to market conditions, approval of a prospectus by securities regulators and clearance from Delivery Hero's management and supervisory boards.

Speaking on a call with CNBC following the news, Delivery Hero CEO Niklas Ostberg said that the proposed Talabat market listing was about bringing in local investors in the region to co-invest and support the unit.

"We have been looking into those opportunities to bring in stronger support," Ostberg told CNBC Thursday. "The Middle East is a very large part of our business and ... there is clear value for us to further build a strong base there."

The Accelerating Rate of Consumer Technology Diffusion Reveals the Future

#technology


Nvidia shares dip in premarket trade despite earnings beating estimates

Nvidia shares fell 4.5% in U.S. premarket trade on Thursday despite its fiscal second-quarter results beating estimates.

Nvidia shares dipped in U.S. premarket trade on Thursday, as the company's fiscal second-quarter gross margin dipped slightly, and its revenue beat was eclipsed by a backdrop of increasingly lofty expectations.

The company's stock was down 4.6% in premarket deals but had pared losses to a drop of 3.6% by about 8:15 a.m. ET.

#stock #technology #nvidia

Nvidia reported July quarter revenue on Wednesday of more than $30 billion, up 122% year on year.

It was the fourth straight quarter of triple-digit revenue growth. But as Nvidia continues its rapid expansion, the annual comparisons are getting tougher.

Nvidia issued market-beating revenue guidance for its fiscal third quarter of $32.5 billion. That would imply an 80% year-on-year increase, but a slowdown from the June quarter.
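That implied growth rate can be sanity-checked with a quick calculation. The guided figure is from the article; the prior-year fiscal Q3 revenue of roughly $18.1 billion is an assumption taken from Nvidia's earlier public filings, not from this post:

```python
# Implied year-on-year growth from Nvidia's fiscal Q3 guidance.
guidance_q3 = 32.5e9  # guided fiscal Q3 revenue in USD (from the article)
prior_q3 = 18.1e9     # assumed prior-year fiscal Q3 revenue in USD (~$18.1B)

yoy_growth = guidance_q3 / prior_q3 - 1
print(f"Implied YoY growth: {yoy_growth:.0%}")  # roughly 80%, matching the article
```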

Meanwhile, the company said that gross margins would be in the "mid-70% range" for the full year. Analysts were expecting a full-year margin of 76.4%, according to StreetAccount.

However, analysts said that Nvidia would have had to beat all expectations by a long way in order to see a pop in the stock after the numbers.

The pullback in the stock on Thursday also comes after a meteoric rally, with Nvidia's shares rising more than 150% this year to date. The stock has picked up more than 750% since the start of 2023, as one of the biggest beneficiaries of the artificial intelligence boom. Large technology companies have been ramping up investment and buying Nvidia's graphics processing units to train large AI models.

What is the theory of Computronium?

The theory of Computronium is a speculative concept in the fields of computer science, physics, and futurism. It refers to a hypothetical form of programmable matter that can be used to perform computations in the most efficient way physically possible.

Key aspects of the Computronium theory include:

  1. Maximized computation: Computronium is theorized to use every available atom or particle to perform calculations, maximizing the computational power per unit of mass and volume.

  2. Physical limits: It's thought to operate at the theoretical limits of computation as allowed by the laws of physics, such as those described by Landauer's principle and the Bekenstein bound.

  3. Reconfigurability: In theory, Computronium could be reprogrammed to perform any computational task, making it extremely versatile.

  4. Energy efficiency: It would ideally use minimal energy for its operations, potentially approaching the thermodynamic limits of computation.

  5. Futuristic applications: Some futurists speculate that advanced civilizations might convert large amounts of matter (even entire planets or stars) into Computronium to harness immense computational power.

It's important to note that Computronium remains a purely theoretical concept. We currently have no practical way to create such a substance, and it may not be physically possible given our current understanding of physics. The concept is often discussed in the context of technological singularity, advanced artificial intelligence, and far-future scenarios.
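One of those physical limits can be made concrete with a small sketch. Landauer's principle puts the minimum energy required to erase one bit at k_B · T · ln 2; the constants below are standard physical values, not figures from this thread:

```python
import math

# Landauer's principle: minimum energy to erase one bit is k_B * T * ln(2).
K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin
T_ROOM = 300.0      # approximate room temperature, in kelvin

landauer_joules = K_B * T_ROOM * math.log(2)
print(f"Landauer limit at ~300 K: {landauer_joules:.2e} J per bit erased")
# On the order of 3e-21 J, the figure usually quoted for room temperature.
```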

What would be required to achieve nanotechnology-level computronium?

The theory of computronium is a speculative concept in advanced physics and computer science. It refers to a hypothetical form of programmable matter that can be used to perform computation in its most efficient possible form. Essentially, computronium is proposed as the ultimate computing substrate - a material engineered at the atomic or subatomic level to maximize computational power and efficiency.

To achieve nanotechnology-level computronium, several significant technological breakthroughs would likely be required:

  1. Advanced nanotechnology: The ability to precisely manipulate and control matter at the atomic and molecular scale would be essential. This would involve developing nanoscale assemblers or fabricators capable of arranging atoms and molecules with extreme precision.

  2. Quantum computing advancements: Many theories of computronium involve leveraging quantum effects for computation. Significant progress in quantum computing technology would likely be necessary.

  3. Novel materials science: New materials with properties that can be manipulated at the quantum level for computation would need to be developed.

  4. Energy efficiency breakthroughs: To approach theoretical limits of computational efficiency, new methods of minimizing energy loss in computation would be crucial.

  5. Advanced cooling systems: Dealing with heat dissipation at such dense computational scales would require innovative cooling solutions.

  6. Breakthroughs in information theory and computer science: New computational paradigms and algorithms optimized for atomic-scale computing would need to be developed.

  7. Precision measurement and control systems: Extremely precise instruments for measuring and controlling quantum states would be necessary.

  8. Advances in error correction: At the nanoscale, error rates could be significant, requiring highly advanced error correction techniques.

It's important to note that computronium remains a largely theoretical concept, and achieving it would require overcoming numerous scientific and engineering challenges that are currently beyond our technological capabilities. The development of such technology, if possible, would likely be many decades or even centuries in the future.

Expand on how theories of computronium leverage quantum effects for computation. What effects have to be leveraged to achieve computronium?

To achieve nanotechnology-level computronium, several significant technological breakthroughs would be required:

  1. Advanced nanofabrication: We would need the ability to precisely manipulate and arrange individual atoms or molecules to create computational structures at the nanoscale.

  2. Quantum coherence control: Maintaining quantum states at room temperature for extended periods would be crucial for leveraging quantum effects in computation.

  3. Error correction: Robust quantum error correction methods would be necessary to maintain computational integrity at the nanoscale.

  4. Energy efficiency: Developing methods to minimize energy dissipation at the atomic level to approach theoretical limits of computation.

  5. Scalability: Techniques to scale up nanoscale components into macroscale systems while preserving quantum properties.

Regarding quantum effects for computronium, several key phenomena could potentially be leveraged:

  1. Superposition: Allowing multiple computational states to exist simultaneously, potentially enabling massive parallelism.

  2. Entanglement: Utilizing quantum correlations between particles for enhanced information processing and communication.

  3. Quantum tunneling: Exploiting the ability of particles to traverse energy barriers for novel computational mechanisms.

  4. Quantum phase transitions: Harnessing abrupt changes in quantum systems for information processing or memory storage.

  5. Spin states: Using electron or nuclear spins as qubits for computation.

  6. Coherent quantum dynamics: Exploiting the wave-like nature of quantum particles for computation.

  7. Quantum interference: Utilizing the interference patterns of quantum waves for certain algorithms.

To achieve true computronium, these quantum effects would need to be harnessed at the highest possible density and efficiency allowed by physics. This would likely involve:

  1. Maximizing information density: Storing and processing information at the smallest possible scale, potentially at the Planck scale.

  2. Minimizing energy use: Operating at or near the Landauer limit for energy consumption per bit operation.

  3. Optimizing for speed: Leveraging quantum effects to perform operations at the fastest possible rate allowed by physical laws.

  4. Reversible computing: Implementing logical reversibility to approach theoretical limits of computational efficiency.

  5. Holographic principle: Potentially utilizing principles from quantum gravity theories to maximize information processing in a given volume.

It's important to note that while these concepts are based on our current understanding of physics and information theory, practical implementation of computronium remains highly speculative. Many of these ideas push against the boundaries of known physics and may require fundamental breakthroughs in our understanding of quantum mechanics, thermodynamics, and information theory.

How much compute would be required for Computronium?

To answer your questions about computronium and quantum effects, let's dive deeper into the theoretical aspects and computational requirements:

Quantum effects in computronium:

Computronium theories often incorporate quantum effects to maximize computational efficiency. Some key quantum phenomena that could potentially be leveraged include:

  1. Superposition: This allows quantum bits (qubits) to exist in multiple states simultaneously, potentially enabling parallel processing at a fundamental level.

  2. Entanglement: Quantum entanglement could allow for instantaneous information transfer between particles, potentially enabling faster or more efficient computations.

  3. Quantum tunneling: This effect could be used for ultra-fast switching or information transfer between computational elements.

  4. Quantum coherence: Maintaining quantum coherence over larger scales could allow for more complex quantum computations.

  5. Zero-point energy: Some speculative theories suggest harnessing zero-point energy for computation, though this remains highly theoretical.

  6. Quantum error correction: Advanced quantum error correction techniques would be crucial for maintaining computational integrity at quantum scales.

Computational requirements for Computronium:

The computational capacity of computronium is theoretically immense, approaching physical limits of computation. To quantify this:

  1. Bremermann's limit: This theoretical limit suggests a maximum computational speed of about 1.36 × 10^50 bits per second per kilogram of mass.

  2. Landauer's principle: This principle sets a lower bound on the energy required to erase one bit of information at about 3 × 10^-21 joules at room temperature.

  3. Bekenstein bound: This limit relates to the maximum amount of information that can be contained in a given finite region of space with finite energy.

Given these theoretical limits, the compute power of computronium would be staggering:

  • For a 1 kg mass of computronium:

    • It could potentially process up to 10^50 operations per second.
    • Store around 10^31 bits of information.
  • For larger scales:

    • A computronium planet could theoretically have computational power in the range of 10^42 to 10^50 floating-point operations per second (FLOPS).
    • A computronium star might reach 10^60 to 10^70 FLOPS.

These numbers are so large they're difficult to contextualize. For comparison, the most powerful supercomputers today operate in the range of 10^18 FLOPS.
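To put those figures side by side, here is a minimal sketch comparing Bremermann's limit for 1 kg of matter against a present-day exascale supercomputer; the exponents are the ones quoted above, and the comparison treats bits per second and FLOPS as the same order of magnitude, which is a rough simplification:

```python
import math

# Bremermann's limit vs. today's fastest supercomputers (order-of-magnitude only).
BREMERMANN_PER_KG = 1.36e50  # theoretical max, bits per second per kg of mass
EXASCALE_FLOPS = 1e18        # rough scale of current top supercomputers

ratio = BREMERMANN_PER_KG / EXASCALE_FLOPS
print(f"1 kg of ideal computronium vs. an exascale machine: ~10^{round(math.log10(ratio))}x")
# A gap of roughly 32 orders of magnitude.
```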

It's crucial to emphasize that these are theoretical upper limits based on our current understanding of physics. Practical implementation, if ever possible, would likely fall short of these absolute limits due to engineering constraints, error correction needs, and other factors we may not yet understand.

The concept of computronium pushes our understanding of computation to its theoretical limits, intertwining advanced physics, information theory, and computer science in ways we're only beginning to explore.



Worth a watch. So much going on. #technology #ai

The video transcript discusses a series of significant advancements and announcements in the tech industry, focusing on AI and its applications. Here’s a summary:

  1. Neural Net-Generated Doom Game: Google Research has developed a method to create the game Doom in real-time using a neural network, rather than traditional coding. This showcases the potential future of video games, where content is generated dynamically for individual users. This breakthrough could lead to personalized game experiences and even world simulators, with Elon Musk hinting at Tesla's ability to do something similar with real-world video.

  2. Cursor AI: A code editor named Cursor AI has been gaining attention. It allows users to describe what they want to build in natural language, and the AI generates the code. This tool could democratize app development, enabling anyone to create software without deep programming knowledge. VC firm a16z has made a significant investment in Cursor, highlighting its potential impact.

  3. Boston Dynamics' New Robot: Boston Dynamics released a short video showcasing their new Atlas 2 robot, which is more polished and capable than previous versions. The robot's abilities, such as push-ups and jumping, signal ongoing improvements in humanoid robotics.

  4. Reflect Orbital: A new company called Reflect Orbital proposes using satellites with reflectors to direct sunlight to any location on Earth, potentially powering solar arrays even at night. Though still in the concept phase, this technology could revolutionize energy generation.

  5. Meta's Sapiens Vision Models: Meta has introduced Sapiens, a model family for human-centric vision tasks like 2D pose estimation and depth estimation, which could enhance applications in video games, autonomous vehicles, and content creation.

  6. AI Whistleblower Protections: Former OpenAI employees are advocating for whistleblower protections related to AI development. They criticize OpenAI for deploying AI models prematurely and inconsistently supporting AI regulation, raising concerns about transparency and safety.

  7. Microsoft's AI Recall Feature: Microsoft is set to release its controversial AI-powered "Recall" feature, which allows users to search through all activities on their PC using natural language. Despite initial delays due to privacy concerns, the feature is expected to launch for testers soon.

  8. Salesforce AI Agents: Salesforce has introduced autonomous AI agents, Einstein SDR and Einstein Sales Coach, to automate sales tasks. While useful for scaling sales teams, there are concerns about the increased noise in consumer communication.

  9. Google’s Experimental AI Models: Google has rolled out new experimental AI models, including smaller variants of their Gemini series. While they show promise, some users have found their performance inconsistent.

  10. Tesla's AI Supercluster: Elon Musk shared a video of Tesla’s new AI supercluster, named Cortex, which is being built to solve real-world AI problems, likely related to autonomous driving and possibly other applications like Grok.

These developments highlight the rapid progress in AI and robotics, with significant implications for various industries, from gaming to sales and energy.

Space

Understand the differences between the missions of Blue Origin, by Jeff Bezos, and SpaceX, by Elon Musk

Blue Origin, the private spaceflight company owned by Amazon founder Jeff Bezos, completed a crewed commercial flight for the eighth time this Thursday, the 29th. The NS-26 mission departed Texas at around 10:07 a.m. Brasília time with six civilians on board.

According to Blue Origin, the NS-26 crew reached an apogee of 104 km in altitude, surpassing the Kármán Line, which marks the boundary between Earth's atmosphere and space. This week, SpaceX, the rocket manufacturer owned by businessman Elon Musk, also intended to carry out a commercial flight, the Polaris Dawn mission, which was postponed twice.

#newsonleo #space #technology

Although both missions are commercial, there are differences between the projects. The NS-26 flight lasted just ten minutes and was intended to cross the Kármán Line with civilians on board.

SpaceX's Polaris Dawn, on the other hand, is financed by billionaire Jared Isaacman, who is among the crew members of the Dragon spacecraft, and aims to exceed an altitude of 1,400 km, something that has not been achieved since the Apollo missions, in order to carry out the first “walk” by civilians in space.