The key to next-gen 3D vision for autonomous cars is … praying mantis goggles?
Researchers at the U.K.’s Newcastle University may have discovered a new way to more efficiently model computer vision systems — and it’s all thanks to a multi-year project that involves putting tiny 3D goggles on praying mantises.
“The 3D glasses we use are similar to the old-school 3D glasses we used to use in the cinema,” Dr. Vivek Nityananda, part of Newcastle’s Institute of Neuroscience, told Digital Trends. “The idea behind these is that having different color filters on each eye allows each eye to see a different set of images that the other eye can’t see. By manipulating the geometry of the images the eyes see we could create the 3D illusions, exactly like you see in the cinema. For our glasses, we cut out teardrop shapes from the color filters and fitted them onto the mantis using beeswax. They were then allowed to recover overnight, and we could try them in experiments the next day. Since we used beeswax, we could melt the wax and remove the glasses once the experiment was over.”
The glasses allowed the researchers to demonstrate that mantises have a way of computing stereoscopic distance to objects that differs from that of any other animal, including humans. Instead of comparing the stationary luminance patterns across the two eyes, as other vision systems do, mantises rely on matching motion or other kinds of change in each eye’s view of the world.
This could be exciting because detecting change simultaneously in both eyes is a simpler computational problem than figuring out which details of each eye’s view match those of the other. It suggests that mantis stereo vision could be easier to model in computer vision applications and robotics, especially in situations where less computational power is available.
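To make the distinction concrete, here is a toy sketch (our own illustration, not the researchers’ actual model) of change-based stereo matching: each “eye” sees a one-dimensional scene at two moments in time, and instead of correlating raw luminance patterns, we mark where luminance changed in each eye and find the horizontal shift (disparity) that best aligns the two change maps. All names and values here are hypothetical.

```python
# Toy sketch of change-based stereo matching (illustrative only).
# Each "eye" sees a 1D scene at two moments in time; a target has moved
# between the frames, and the right eye's view is horizontally offset
# from the left eye's by some disparity.

def change_map(frame_t0, frame_t1, threshold=0.1):
    """Mark positions where luminance changed between the two frames."""
    return [1 if abs(a - b) > threshold else 0 for a, b in zip(frame_t0, frame_t1)]

def estimate_disparity(left_change, right_change, max_disparity=5):
    """Find the horizontal shift that best aligns the two change maps."""
    best_d, best_score = 0, -1
    for d in range(max_disparity + 1):
        # Overlap between the left change map, shifted by d, and the right map
        score = sum(l & r for l, r in zip(left_change[d:], right_change))
        if score > best_score:
            best_d, best_score = d, score
    return best_d

# A bright target moves one step to the right between frames; the right
# eye sees the scene at a disparity of 2 relative to the left eye.
left_t0  = [0, 0, 1, 0, 0, 0, 0, 0, 0, 0]
left_t1  = [0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
right_t0 = [1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
right_t1 = [0, 1, 0, 0, 0, 0, 0, 0, 0, 0]

d = estimate_disparity(change_map(left_t0, left_t1), change_map(right_t0, right_t1))
print(d)  # → 2
```

Notice that only the moving target produces change, so the matching problem collapses to aligning a handful of active positions rather than correlating full images, which hints at why a mantis-style strategy could be cheap enough for low-power robotics.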
“So far we’ve been designing these systems to see and react to the world in the same way we do, but our brains are immensely complex and power hungry machines that may not be the ideal biological model to inspire efficient design,” Dr. Ghaith Tarawneh, another researcher on the project, told Digital Trends. “Mantis 3D vision uses the exact sort of computational trickery that challenges our way of seeing things: A view of the world radically different from ours, but evidently more fit for purpose. Adapting autonomous cars and drones after insect vision can give them the same capabilities: A superior ability to see the details that matter with shorter reaction times and longer battery life.”
A paper describing the work was recently published in the journal Current Biology.
Amazon is reportedly designing its own A.I. chips to make Alexa respond faster
Do you love Alexa but think the voice assistant from Amazon responds too slowly sometimes? Amazon is working on a fix for that, according to an exclusive report from The Information. The online marketplace giant is designing its own artificial intelligence (A.I.) chips that would add speech recognition directly to Alexa-powered devices and allow the voice assistant to respond more quickly. The goal is for the hardware to be built into any device that features Alexa, including the Amazon Echo, Echo Dot, and Echo Plus.
As users of Alexa-powered devices may have noticed, there is a small delay between making an inquiry or request and getting a response from Alexa. That’s because the voice assistant must contact the cloud and interpret the command before it is able to formulate a response. By processing more data on the devices rather than in the cloud, the devices should theoretically be able to formulate a reply more speedily and operate more efficiently overall. While the device will still have to contact the cloud for more complex requests, the addition of speech recognition would eliminate the delay in simple tasks, such as reporting the time.
In 2015, Amazon acquired chip designer Annapurna Labs. The company is now producing A.I. chips that specifically suit Amazon’s own hardware needs. Amazon is also currently in the process of hiring chip engineers for Amazon Web Services, which suggests that the company may be introducing its own proprietary chips for that department as well.
The news of yet another tech giant embarking on its own A.I. chip creation poses a threat to companies like Nvidia and Intel. Both companies specialize in manufacturing chips and have shifted much of their expertise to the creation of A.I. chips. Now, many of their former customers are becoming their competition.
Amazon is not the first major company to produce its own A.I. chips in hopes of elevating its products above those of the competition. Mountain View, California-based Google currently uses its own chips to support features like Street View, Translate, and Search. Silicon Valley tech giant Apple has also designed and deployed custom A.I. chips in its own products.
Government websites fall prey to a plugin injected with a digital coin miner
Thousands of websites relying on the Browsealoud plugin developed by U.K.-based Texthelp recently fell prey to a hack that secretly ran a cryptocurrency mining script in the background of visiting PCs. Websites use this specific plugin for visually impaired visitors so they can hear content, but on Sunday, February 11, someone managed to alter the plugin’s code to run Coinhive’s controversial JavaScript-based Monero digital currency miner.
Because it’s based on JavaScript, administrators can easily insert Coinhive’s miner into a webpage. It runs in the background while visitors browse the website, silently mining digital coins using their PC’s processor. The elevated CPU use is apparent if you know what’s going on; otherwise, the average web surfer may simply shrug off the slow performance as typical Windows or web-based processes bogging down the machine. The mining stops once web surfers leave the offending page.
The altered Browsealoud plugin began mining Monero Sunday morning on more than 4,200 websites spanning the globe, including governments, organizations, and schools. Among them were the State of Indiana, the U.S. court information portal, the City University of New York, the U.K.’s National Health Service, the U.K.’s Student Loans Company, and many more.
Many websites rely on plugins to pull content and tools from third-party developers. These can include translators, shopping baskets and e-commerce tools, menus, and so on. But the discovery of Coinhive’s miner in Browsealoud shows that if a hacker gains access to a single widely used plugin, thousands of websites can suffer at once.
Plugin content typically resides on a remote server and is sent to the target web page over a secure connection. The problem is that there is no built-in mechanism to authenticate the content itself. Someone with access to that remote content could inject malicious code, and every website using the plugin would serve up the malicious payload even though the connection still registers as secure.
One defense against this problem is called Subresource Integrity. A website adds an “integrity” attribute, containing a cryptographic hash of the expected file, to the HTML element that loads the resource. If the hash of the delivered content doesn’t match the value in the attribute, the browser blocks the resource before it can run. Unfortunately, this isn’t yet a widely used technique, but the recent issue with Browsealoud may convince more websites to adopt it.
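As a concrete sketch of the idea (our own illustration; the file contents and URL below are hypothetical), this snippet shows how an SRI hash value is produced and why tampered content fails the check:

```python
# Minimal sketch of Subresource Integrity (SRI) hashing. A site serving a
# third-party script pins the exact bytes it expects by publishing a
# base64-encoded SHA-384 digest in the tag's "integrity" attribute. If the
# fetched file's digest differs (say, because a miner was injected), the
# browser refuses to run it.

import base64
import hashlib

def sri_hash(content: bytes) -> str:
    """Compute an SRI integrity value for the given file contents."""
    digest = hashlib.sha384(content).digest()
    return "sha384-" + base64.b64encode(digest).decode("ascii")

original = b'console.log("speech-to-text helper");'   # hypothetical plugin file
tampered = original + b"startCoinMiner();"             # attacker appends a miner

expected = sri_hash(original)
# The page would pin the hash roughly like this (hypothetical URL):
#   <script src="https://cdn.example.com/plugin.js"
#           integrity="sha384-..." crossorigin="anonymous"></script>

# The browser recomputes the digest on the delivered bytes and compares:
print(sri_hash(original) == expected)  # unmodified file passes
print(sri_hash(tampered) == expected)  # injected code fails the check
```

The key design point is that the hash travels with the page that embeds the resource, not with the resource itself, so compromising the plugin’s server is no longer enough to change what visitors execute.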
Coinhive’s miner was reportedly active in the Browsealoud plugin for only a few hours before Texthelp pulled the plug. And although the goal was apparently only to generate digital coins, the company still considers the hack a criminal act.
“Texthelp has in place continuous automated security tests for Browsealoud — these tests detected the modified file and as a result, the product was taken offline,” Texthelp Chief Technical Officer Martin McKay said in a statement. “This removed Browsealoud from all our customer sites immediately, addressing the security risk without our customers having to take any action.”
Texthelp is currently working with the National Crime Agency and the National Cyber Security Centre to hunt down the hacker(s).
What is an APU? Should you buy one for your PC?
Although Intel has offered most of its central processors with onboard graphics for some time, AMD has done the same only in distinct product lines. Where its mainstream CPUs, even the latest Ryzen offerings, don’t offer any kind of built-in graphics, its accelerated processing units (APUs) do.
But what is an APU, and is it really any different from what Intel offers with its processors? The core concept of an APU is blessedly simple: An APU combines the functionality of a CPU and a GPU, two pieces of hardware found in basically any computer.
Two halves make a whole
The central processing unit, or CPU, carries out all the instructions for computer programs. Every action you take on your computer, whether running a game or simply typing a letter, must go through the CPU.
The graphics processing unit, or GPU, is a piece of hardware that allows a computer to render images quickly. Creating 3D images often involves complex processes like rendering polygons, mapping textures, and using complicated equations involved in animation. Offloading these to dedicated hardware makes 3D images much quicker to produce.
By integrating the CPU and GPU into a single unit, an APU enables faster data transfer between the two, allowing them to share the burden of processing tasks. It also lets the chip complete tasks while using less power than a standard CPU and graphics card setup, and it guarantees a certain base level of graphical capability, which improves the overall user experience. Most importantly, it means you don’t have to purchase a separate graphics card, which drastically lowers the overall price of your PC.
None of this is really any different from what Intel does with its CPUs, though, even if AMD likes to give its combined chips a distinct name. Most of Intel’s recent architectures, including its seventh and eighth generations, combine CPU and GPU functionality on a single die. Essentially, any modern processor you purchase these days is effectively an APU, even if it doesn’t bear the name. That doesn’t, however, mean they’re all made equal.
But are they worth it?
Although Intel chip buyers almost always get onboard graphics whether they want it or not, AMD buyers have a choice. Do they buy a dedicated AMD CPU — preferably from the latest generation of Ryzen processors — or opt for an APU, like the new Ryzen with Vega chips? If you’re a gamer, even on a modest budget, you’re much better off opting for one of the best CPUs paired with a dedicated graphics card. Although that option is more expensive, the performance offered by an independent processor and stand-alone video card is much better.
APUs have their place; non-gamers and those on extremely tight budgets benefit from not needing a dedicated graphics card. However, we’ve yet to see an APU that can truly satisfy gamers.
Boston Dynamics’ SpotMini robot now features a terrifying appendage
If you found yourself trembling with fear at the sight of Boston Dynamics’ updated SpotMini dog-like robot when it bounded into view last November, then you’re going to suffer a full-on meltdown when you see the latest version.
The SpotMini now comes with a terrifying appendage, one so incredibly versatile that the feared robot apocalypse now seems a matter of “when,” not “if.”
In a video that’s as entertaining as it is unnerving, we first see November’s SpotMini approach a closed door. With no way to open it, the robot dog just stands there, waiting. The video’s title — Hey buddy, can you give me a hand? — hints heavily at what’s coming, with the latest SpotMini appearing in shot with that extendable appendage.
The four-legged robot extends its mechanical arm, turns the door handle and opens it, letting both robots through. But it’s not just the arm that impresses, it’s also the incredibly fluid and lifelike movement of the SpotMini.
Boston Dynamics posted the video on YouTube on February 12, and added no explanatory notes, which makes the whole thing that little bit more unsettling.
But this isn’t the first time we’ve seen the SpotMini with a mechanical arm. The original design, unveiled by the SoftBank-owned company in June 2016, was also very impressive and featured an arm similar to the one attached to today’s SpotMini.
In the first SpotMini video from 2016, we’re shown the original robot performing various chores around the house, including loading up the dishwasher and throwing away trash. It also takes a tumble on a banana skin, suggesting at least one way we humans can defend ourselves against any upcoming robot rampage.
The most recent SpotMini design appears to incorporate sensors on the front and sides of its main body that help it understand its environment. Truth is, Boston Dynamics has so far revealed little about the SpotMini, choosing instead to modify it before filming skits and posting them on YouTube.
And no, we don’t know what it has planned for SpotMini, or if it’s just a robot to showcase the team’s remarkable work.
Boston Dynamics has been developing a range of robots, each with its own skills. Atlas, for example, has the remarkable ability to perform a perfect backflip, though it’s not always so steady on its feet.
We wait, a little nervously, to see what the Boston Dynamics crew comes up with next …
Twitter bans Congressional candidate after racist image
Twitter is continuing to act on its promise to fight hate speech, however imperfectly. The site has banned Wisconsin Congressional candidate Paul Nehlen after he posted a racist image that placed the face of Cheddar Man (a dark-skinned early Briton) over that of actress and soon-to-be-royal Meghan Markle, who is mixed-race. The company said it doesn’t normally comment on individual accounts, but noted that the permanent suspension was due to “repeated violations” of its terms of service.
Nehlen, who’s hoping to unseat Paul Ryan in the 2018 mid-term elections, has a long history of overtly expressing his racist views. Twitter suspended him for a week in January over anti-Semitic comments, and he has regularly promoted white supremacist ideology. In private, he used direct message groups to coordinate harassment campaigns. Breitbart supported Nehlen’s ultimately unsuccessful run against Ryan in 2016, but distanced itself from him in December 2017.
The candidate hadn’t formally commented on the ban as of this writing, but a Facebook post of his tried to portray the suspension as censorship.
This doesn’t mean that Twitter has become more proactive or consistent in its enforcement. The ban came when Nehlen’s racism was garnering a lot of publicity, and long after it was clear this behavior was par for the course. Still, it shows that the company is willing to take action against notable political figures when they violate its policies… to a degree.
Source: Newsweek
LG will unveil the new V30 with an AI camera at MWC 2018
LG will unveil a new version of its flagship phone, the V30, at the annual Mobile World Congress this year — and as the rumors said, it’ll have an AI-powered camera. The Korean company has developed “Vision AI” to make the phone’s camera smarter and easier to use. Vision AI can automatically analyze what you’re taking a photo of and can recommend the best shooting mode among the eight (portrait, food, pet, landscape, city, flower, sunrise and sunset) available. It can take the object’s angle and color, any reflection in the surroundings, as well as the lighting and saturation levels into account to conjure up the best image possible.
The AI also has a handful of shopping capabilities. It can do an image search or scan QR codes to show you where you can purchase the item you’re taking a picture of (along with similar products) for the lowest price. In addition, Vision AI will measure the brightness of the actual image you’re capturing with your phone in low lighting conditions to brighten parts of the image that matter.
In addition to the new camera AI, LG will also introduce a new Voice AI to work alongside Google Assistant. It will add new voice commands, most of which are connected to Vision AI, on top of the 23 LG-exclusive Assistant commands. You’ll be able to ask Voice AI to take a photo in panoramic mode, in food mode, or in low-light mode. It can take slow-motion videos if you ask it to, and even do an image search or scan a QR code to conjure up Vision AI’s shopping experience.
LG says these are just the first in the “suite of AI technologies” it’s planning for the V30. It didn’t say when it plans to reveal the other AI tech it’s been working on, but we’ll know more about these two by the end of February.

Source: LG
Facebook will add a Watch section for breaking news videos
Facebook may be downplaying news in your actual News Feed, but its Watch team is headed in the opposite direction. The social network has announced that it’s creating a section in Watch that will feature breaking news stories. News partnership lead Campbell Brown didn’t say when it would launch or which outlets would show up, but Facebook recently said the News Feed would prioritize local stories. The Watch tab could easily follow suit.
There are a few reasons why Facebook might push news videos even as it reduces its emphasis on shared articles. First and foremost, of course, is money. While you may not see many original news pieces on Watch (which would give Facebook a 45 percent cut of ad revenue), the section could keep many people on Facebook and thus viewing ads. With Snapchat and YouTube serving as big news sources for some users, Facebook might not want to risk ceding any ground. There’s also a chance that this could filter out some fake news, although bogus videos certainly aren’t unheard of.
Whatever the reasons, this will expand Watch’s role. Most of Facebook’s recent energy has been spent on securing original entertainment programming — this pushes it closer to a balance between fun and informative material. You may have a reason to come back even if you don’t care for dramas or sports.
Source: TechCrunch
AMD Ryzen CPUs With Vega Graphics Review
Can you play PC games without a dedicated graphics card? AMD wants to convince you that you can. It’s true that thousands of games on Steam or GOG have requirements low enough for even the most mild-mannered integrated graphics setup, but once even basic 3D graphics are involved, performance can take a nosedive. AMD’s latest processors aim to bridge the gap between expensive discrete GPU setups and inexpensive PCs running underpowered integrated graphics, and to do it on a budget.
The new $169 Ryzen 5 2400G processor and its slimmed-down sibling, the $99 Ryzen 3 2200G, both feature onboard Vega graphics. AMD claims they’re powerful enough to run “esports games” and other less graphically demanding titles. That’s a tall order for a couple of budget processors, but if AMD is right, it’d give gamers an alternative to discrete cards, which have recently skyrocketed in price. Let’s see if AMD can deliver.
What’s an APU?
These chips are a little different from your usual CPU. AMD calls the Ryzen 5 2400G and Ryzen 3 2200G “APUs,” short for accelerated processing units, the company’s term for a processor with an integrated graphics chip. These new APUs feature graphics based on AMD’s proprietary Vega architecture which, of course, is also found in the Radeon Vega 56 and 64 video cards.
It’s scaled back, of course. The Vega 56 and 64 derive their names from the number of “compute units” they have onboard – 56 and 64, respectively. The Ryzen 3 2200G has only eight compute units, and the Ryzen 5 2400G has 11. That’s a big cut from AMD’s high-end video cards, and it of course has serious consequences for performance.
Both processors work with the AM4 platform that debuted with the first Ryzen processors. Given that these new chips are budget models, we doubt anyone will “upgrade” to them. Still, the fact that they use the existing socket means there are already many compatible motherboards available. Do remember that some might require a BIOS update, so be sure to double-check compatibility, and your motherboard’s current BIOS version, before buying.
Intel is still in the fight
While Vega graphics are the headline feature of these APUs, they are still processors, and they handle all the tasks you’d expect of a CPU. We’ve praised Ryzen in the past, noting its excellent multi-core performance for the price. Yet we’ve also seen that Ryzen can’t keep up with Intel’s best in per-core performance. Put another way, an Intel Core chip will generally outperform AMD’s Ryzen if both have the same core count and run at a similar clock speed.
The Ryzen 5 2400G and Ryzen 3 2200G don’t change the Ryzen architecture, so that rule stands. They’re simply outclassed in raw compute performance by less expensive processors. On Geekbench, the Ryzen 5 2400G kept pace with the Ryzen 3 1300X, but AMD’s own Ryzen 5 1600 and Intel’s Core i3-8100 both eclipsed it. It’s an important point because the Ryzen 5 2400G costs more than the Core i3-8100 and nearly as much as the Ryzen 5 1600, yet it’s not as quick as either.
We saw similar results from the Ryzen 3 2200G, with single- and multi-core performance falling below its closest competitors by a slim margin. Given the Ryzen 3 2200G’s affordable price tag, just $100, that’s almost impressive. But the price gap between it and a much more capable processor like the Intel Core i3-8100 is too narrow to ignore. You’re only saving about $30, and that’s not a lot even if you’re building a budget gaming rig.
These results were reinforced in our 4K video encode test, where the Ryzen 5 2400G took about 11 and a half minutes to finish an encode that the Core i3-8100 completed in 11 minutes. The Ryzen 5 1600 finished in just six minutes, thanks to its superior multi-core performance. The Ryzen 3 2200G finished the encode in about 12 minutes, again coming in behind its nearest competitors.
These results are mixed. Clearly, the Ryzen 5 2400G and Ryzen 3 2200G APUs are affordable processors with adequate performance for everyday use. Yet they fall behind similarly priced alternatives like AMD’s own Ryzen 5 1600 and the brand-new Intel Core i3-8100, which retail for $189 and $130, respectively.
AMD’s APU tries to game, but falters
Okay, so the compute performance isn’t great. That might be all right if the Vega hardware delivers. Graphical performance is supposed to be what sets the Ryzen APUs apart from typical CPUs. It’s what makes them a compelling alternative to other, more powerful — and less expensive — processors on the market. So let’s look at how they do when they’re pushed to their limit.
Starting off with 3DMark, it’s clear that the APUs are not going to come anywhere close to the performance you’d get out of a discrete graphics card. Looking at the Nvidia GeForce GTX 1050, and the Radeon RX 560 — two solid entry-level options that are usually about $160 to $180 — the performance gap is clear. Even budget discrete GPUs like the RX 550 outperform the onboard graphics packed into the Ryzen 5 2400G and the Ryzen 3 2200G.
Moving on to real-world in-game testing, the results are a bit mixed, but there’s still a discernible performance gap between a budget GPU and the built-in Vega graphics in the Ryzen 5 2400G and Ryzen 3 2200G.
Just paging through these graphs, you can see very clearly that the Ryzen 5 2400G and Ryzen 3 2200G are capable, but nowhere near as capable as the GTX 1050 or the slightly-higher-end Radeon RX 570. There’s just no contest. A discrete graphics card will beat out on-board graphics ten times out of ten, even if these new APUs come close in a few instances.
Looking at Rocket League, the Ryzen 5 2400G puts on a strong showing, hitting an average of 34 FPS with all the settings cranked up. That’s impressive for an onboard GPU. The Ryzen 3 2200G comes close, hitting a playable average of 29 FPS. They both outperform Intel’s UHD 620 onboard graphics by a fair margin, but let’s move on to a more demanding set of games.
Battlefield 1 really lays it all out for you. At medium settings, at 1080p, you’ll get about 22 FPS out of the Ryzen 5 2400G. It’s playable, if barely. But even a low-end graphics card like the GTX 1050 quadruples that score with ease.
We see that pattern over and over in Civilization VI at Medium and Ultra settings, and even in Deus Ex: Mankind Divided, which neither APU can run at anything approaching a playable framerate.
Just to see if they could handle it — as it’s not part of our usual test suite — we ran both APUs through Overwatch at low settings at 1080p. Surprisingly, both processors managed playable framerates without any significant dips in performance. The Ryzen 5 2400G hit an average of 41 FPS, and the Ryzen 3 2200G an average of 26 FPS. Still, while those numbers are playable, they barely scrape by.
These new Ryzen chips are the first APUs with Vega graphics, but they’re far from AMD’s first APUs. The company has tried for years to position its APUs as a bare-minimum gaming solution. It hasn’t quite managed it, because they don’t truly deliver even that bare minimum. Most gamers will need more power than these APUs can offer. Yes, budget-strapped gamers who absolutely can’t afford anything else will buy them, but we doubt they’ll be happy about it.
Warranty
Both the Ryzen 5 2400G and Ryzen 3 2200G feature 3-year limited warranties covering manufacturer defects.
Our Take
Neither of AMD’s Ryzen chips completely loses the plot, but neither excels in any way, either. If you’re building a gaming PC from scratch, you’re only saving about $100 by picking an APU over a CPU and graphics card — and that doesn’t justify the dip in performance. Graphics cards have indeed become expensive thanks to cryptocurrency mining, but budget cards haven’t been impacted as severely or, in some cases, at all.
Is there a better alternative?
Yes, there are better alternatives. The tricky part is deciding which one is the best choice. You could go with an Intel Core i3-8100 or a Ryzen 5 1600 and get better overall computing performance, but without a graphics card, you wouldn’t get much gaming done. That said, a GTX 1050 goes for about $160, roughly the price of two and a half games, and as our results show, it offers a massive performance bump over either the Ryzen 5 2400G or Ryzen 3 2200G.
How long will it last?
Unfortunately, not very long. Both the Ryzen 5 2400G and Ryzen 3 2200G are already outclassed by processors in their price range. Even their gaming performance won’t get you far; with either of these processors, you’re one game generation away from having to roll your resolution back to 720p or lower to get playable framerates. They just don’t have the longevity of even a budget setup built around a Ryzen 5 1600 or Intel Core i3-8100.
Should you buy it?
In most cases, you shouldn’t bother with either the Ryzen 5 2400G or the Ryzen 3 2200G — just buy the Ryzen 5 1600 or Intel Core i3-8100 and pair them with a discrete graphics card. The Ryzen APUs are the option of last resort for budget gaming and, should you find yourself putting together a rig for $400, we’re sure you’ll appreciate them. If you game enough to identify yourself as a gamer, though, you likely won’t be pleased with the performance the Ryzen 5 2400G and 2200G muster.
Soon, you won’t have to be a Windows Insider to test Microsoft’s newest apps
Microsoft has one of the industry’s broadest and most aggressive beta programs, Windows Insider, which allows anyone to get access to the next major Windows 10 updates months early. There are some risks, naturally, but if you’re someone who can benefit from early access to new operating system features, then Microsoft has you covered. But what if you just want to check out the latest Windows 10 first-party apps?
It looks like Microsoft will soon have you covered there as well, Thurrott reports. Specifically, Microsoft is preparing a new way for users to preview early versions of apps like Alarms & Clock, Camera, and Photos. Brandon LeBlanc, who serves as a senior program manager for Microsoft’s Windows Insider team, confirmed the upcoming change on Twitter:
This won't "radically change WIP as we know it". This is just us trying to make it easier for those who want to test app updates on both Insider builds or retail. More to come. https://t.co/f4gvixos6h
— Brandon LeBlanc (@brandonleblanc) February 11, 2018
Once the “Windows App Previews” program rolls out, a notification will pop up in each of the included apps that asks users to “Join the preview programme.” The text reads, “You’ll be one of the first to try out new features in the preview version of (app title) and your feedback will help shape the future of the app. Before you join, please review the preview programme details.” Check the box confirming that you’ve reviewed those details, and you’ll be able to join in.
Microsoft benefits from such a program by getting feedback from an even wider group of users. Currently, Windows Insiders are the first to see new app features and to have the opportunity to provide feedback. Once the new program rolls out, the company could add millions of users to its pool of beta testers. If users actively participate, that could have a meaningful impact on Microsoft’s ability to squash bugs and continuously improve its apps.
This wouldn’t be the first time Microsoft has made its apps available for beta testing. You can sign up as a beta tester for a variety of Microsoft’s Android apps, including its Office 365 suite and Cortana. Microsoft also has its Office Insiders program that provides early access to the desktop productivity suite on both Windows and MacOS. Adding Windows 10 apps to the mix is just a natural progression.
There is no word yet on when the new program will roll out to all Windows users. If you’re looking forward to giving the program a shot, then keep your eyes open for the pop-up invitation. Just remember that beta testing comes with some inherent risks, such as data loss and instability both within the affected app and potentially system-wide. There are costs associated with living on the bleeding edge.