Simplest Way to Download YouTube on MacBook Air


Does anyone know how to download Minecraft Java edition on a MacBook Air? It won’t let me download and there are no YouTube videos.

submitted by jnic11 to gaming [link] [comments]

The fallacy of ‘synthetic benchmarks’

Preface

Apple's M1 has caused a lot of people to start talking about and questioning the value of synthetic benchmarks, as well as other (often indirect or badly controlled) information we have about the chip and its predecessors.
I recently got in a Twitter argument with Hardware Unboxed about this very topic, and given it was Twitter you can imagine why I feel I didn't do a great job explaining my point. This is a genuinely interesting topic with quite a lot of nuance, and the answer is neither ‘Geekbench bad’ nor ‘Geekbench good’.
Note that people have M1s in hand now, so this isn't a post about the M1 per se (you'll have whatever metric you want soon enough), it's just using this announcement to talk about the relative qualities of benchmarks, in the context of that discussion.

What makes a benchmark good?

A benchmark is a measure of a system, the purpose of which is to correlate reliably with actual or perceived performance. That's it. Any benchmark which correlates well is Good. Any benchmark that doesn't is Bad.
There is a common conception that ‘real world’ benchmarks are Good and ‘synthetic’ benchmarks are Bad. While there is certainly a grain of truth to this, as a general rule it is wrong. In many respects, as we'll discuss, the dividing line between ‘real world’ and ‘synthetic’ is entirely illusory, and good synthetic benchmarks are specifically designed to tease out precisely those factors that correlate with general performance, whereas naïve benchmarking can produce misleading or unrepresentative results even if you are only benchmarking real programs. Most synthetic benchmarks even include what are traditionally considered real-world workloads; SPEC 2017, for example, includes the time it takes for Blender to render a scene.
As an extreme example, large file copies are a real-world test, but a ‘real world’ benchmark that consists only of file copies would tell you almost nothing general about CPU performance. Alternatively, a company might know that 90% of their cycles are in a specific 100-line software routine; testing that routine in isolation would be a synthetic test, but it would correlate almost perfectly for them with actual performance.
On the other hand, it is absolutely true there are well-known and less-well-known issues with many major synthetic benchmarks.

Boost vs. sustained performance

Lots of people seem to harbour misunderstandings about instantaneous versus sustained performance.
Short workloads capture instantaneous performance, where the CPU has the opportunity to boost up to frequencies higher than the cooling can sustain. This is a measure of peak performance or burst performance, and is affected by boost clocks. In this regime you are measuring the CPU at the absolute fastest it is able to run.
Peak performance is important for making computers feel ‘snappy’. When you click an element or open a web page, the workload takes place over a few seconds or less, and the higher the peak performance, the faster the response.
Long workloads capture sustained performance, where the CPU is limited by the ability of the cooling to extract and remove the heat that it is generating. Almost all the power a CPU uses ends up as heat, so the cooling determines an almost completely fixed power limit. Given a sustained load, and two CPUs using the same cooling, where both of which are hitting the power limit defined by the quality of the cooling, you are measuring performance per watt at that wattage.
Sustained performance is important for demanding tasks like video games, rendering, or compilation, where the computer is busy over long periods of time.
Consider two imaginary CPUs; call them Biggun and Littlun. Biggun might be faster than Littlun in short workloads, because Biggun has a higher peak performance, but Littlun might then be faster in sustained performance, because Littlun has better performance per watt. Remember, though, that performance per watt is a curve, and peak power draw also varies by CPU. Maybe Littlun uses only 1 watt and Biggun uses 100 watts, so Biggun still wins at 10 watts of sustained power draw; or maybe Littlun can boost all the way up to 10 watts, but is especially inefficient when doing so.
In general, architectures designed for lower base power draw (eg. most Arm CPUs) do better under power-limited scenarios, and therefore do relatively better on sustained performance than they do on short workloads.
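The Biggun/Littlun trade-off can be sketched with a toy model. Both performance curves below are invented purely to illustrate the crossover between the burst and sustained regimes; they are not real CPU data:

```python
# Toy model: performance as a function of power draw for two invented
# CPUs. The curves are made up to illustrate the crossover; real
# perf/power curves are measured, not analytic.
def perf_biggun(watts):
    # Wide, power-hungry core: keeps scaling with power up to a 100 W peak.
    return 50 * min(watts, 100) ** 0.6

def perf_littlun(watts):
    # Efficient core: much better perf/watt, but caps out at 10 W.
    return 120 * min(watts, 10) ** 0.6

# Unconstrained burst: Biggun's higher peak wins.
print(perf_biggun(100) > perf_littlun(100))  # True

# Passively cooled chassis that can only dissipate 5 W sustained:
# Littlun's better performance per watt wins.
print(perf_littlun(5) > perf_biggun(5))  # True
```

With these particular made-up curves, Biggun only retakes the lead once the cooling can dissipate roughly 43 W; below that, the efficient core wins every sustained contest despite losing every burst one.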

On the Good and Bad of SPEC

SPEC is an ‘industry standard’ benchmark. If you're anything like me, you'll notice pretty quickly that this term fits both the ‘good’ and the ‘bad’. On the good side, SPEC is an attempt to satisfy a number of major stakeholders, who have a vested interest in a benchmark that is something they, and researchers generally, can optimize towards. The selection of benchmarks was not arbitrary, and the variety captures a lot of interesting and relevant facets of program execution. Industry still uses the benchmark (and not just for marketing!), as does a lot of unaffiliated research. As such, SPEC has also been well studied.
SPEC includes many real programs, run over extended periods of time. For example, 400.perlbench runs multiple real Perl programs, 401.bzip2 runs a very popular compression and decompression program, 403.gcc tests compilation speed with a very popular compiler, and 464.h264ref tests a video encoder. Despite SPEC2006 being somewhat aged and a bit light, its performance characteristics are roughly consistent with the updated SPEC2017, so it is not generally valid to dismiss the results on account of age, which is a common criticism.
One major catch from SPEC is that official benchmarks often play shenanigans, as compilers have found ways, often very much targeted towards gaming the benchmark, to compile the programs in a way that makes execution significantly easier, at times even because of improperly written programs. 462.libquantum is a particularly broken benchmark. Fortunately, this behaviour can be controlled for, and it does not particularly endanger results from AnandTech, though one should be on the lookout for anomalous jumps in single benchmarks.
A more concerning catch, in this circumstance, is that some benchmarks are very specific, with most of their runtime spent in very small loops. The paper Performance Characterization of SPEC CPU2006 Integer Benchmarks on x86-64 Architecture (as one of many) goes over some of these in section IV. For example, most of the time in 456.hmmer is spent in one function, and 464.h264ref's hottest loop contains many repetitions of the same line. While, certainly, a lot of code contains hot loops, the performance characteristics of those loops are rarely precisely the same as those in some of the SPEC 2006 benchmarks. A good benchmark should aim for general validity, not specific hotspots, which are liable to be overtuned.
SPEC2006 includes a lot of workloads that make more sense for supercomputers than personal computers, such as including lots of Fortran code and many simulation programs. Because of this, I largely ignore the SPEC floating point; there are users for whom it may be relevant, but not me, and probably not you. As another example, SPECfp2006 includes the old rendering program POV-Ray, which is no longer particularly relevant. The integer benchmarks are not immune to this overspecificity; 473.astar is a fairly dated program, IMO. Particularly unfortunate is that many of these workloads are now unrealistically small, and so can almost fit in some of the larger caches.
SPEC2017 makes the great decision to add Blender, as well as updating several other programs to more relevant modern variants. Again, the two benchmarks still roughly coincide with each other, so SPEC2006 should not be altogether dismissed, but SPEC2017 is certainly better.
Because SPEC benchmarks include disaggregated scores (as in, scores for individual sub-benchmarks), it is easy to check which scores are favourable. For SPEC2006, I am particularly favourable to 403.gcc, with some appreciation also for 400.perlbench. The M1 results are largely consistent across the board; 456.hmmer is the exception, but the commentary discusses that quirk.

(and the multicore metric)

SPEC has a ‘multicore’ variant, which literally just runs many copies of the single-core test in parallel. How workloads scale to multiple cores is highly test-dependent, and depends a lot on locks, context switching, and cross-core communication, so SPEC's multi-core score should only be taken as a test of how much the chip throttles down in multicore workloads, rather than a true test of multicore performance. However, a test like this can still be useful for some datacentres, where every core is in fact running independently.
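For intuition, the ‘many copies’ methodology looks roughly like this. This is a hypothetical stand-in workload and harness, not the real SPEC rate tooling:

```python
import time
from concurrent.futures import ProcessPoolExecutor
from os import cpu_count

def workload(_=None):
    # Stand-in CPU-bound job; SPEC rate would run a full benchmark here.
    return sum(i * i for i in range(2_000_000))

def run_copies(n):
    # Run n independent copies of the same single-core job in parallel,
    # which is what SPEC's rate metric does.
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=n) as pool:
        list(pool.map(workload, range(n)))
    return time.perf_counter() - start

if __name__ == "__main__":
    t1 = run_copies(1)
    tn = run_copies(cpu_count())
    # With no throttling and no contention, tn would be close to t1; the
    # ratio tn / t1 mostly measures clock and power throttling, not
    # cross-core coordination.
    print(f"1 copy: {t1:.2f}s; {cpu_count()} copies: {tn:.2f}s")
```

Because the copies share nothing, locks and cross-core communication never enter the picture, which is exactly why this style of test is a weak proxy for real multicore software.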
I don't recall AnandTech ever using multicore SPEC for anything, so it's not particularly relevant. whups

On the Good and Bad of Geekbench

Geekbench does some things debatably, some things fairly well, and some things awfully. Let's start with the bad.
To produce the aggregate scores (the final score at the end), Geekbench does a geometric mean of each of the two benchmark groups, integer and FP, and then does a weighted arithmetic mean of the crypto score with the integer and FP geometric means, with weights 0.05, 0.65, and 0.30. This is mathematical nonsense, and has some really bad ramifications, like hugely exaggerating the weight of the crypto benchmark.
Secondly, the crypto benchmark is garbage. I don't always agree with his rants, but Linus Torvalds' rant is spot on here: https://www.realworldtech.com/forum/?threadid=196293&curpostid=196506. It matters that CPUs offer AES acceleration, but not whether it's X% faster than someone else's, and this benchmark ignores that Apple has dedicated hardware for IO, which handles crypto anyway. This benchmark is mostly useless, but can be weighted extremely high due to the score aggregation issue.
Consider the effect this has on the comparison of these two chips (which are not carefully chosen to be perfectly representative of their classes):
M1 vs 5900X: single core score 1742 vs 1752
Note that the M1 has crypto/int/fp subscores of 2777/1591/1895, and the 5900X has subscores of 4219/1493/1903. That's a different picture! The M1 actually looks ahead in general integer workloads, and about par in floating point! If you use a mathematically valid geometric mean (a harmonic mean would also be appropriate for crypto), you get scores of 1724 and 1691; now the M1 is better. If you remove crypto altogether, you get scores of 1681 and 1612, a solid 4% lead for the M1.
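This re-aggregation is easy to reproduce from the published subscores. A quick sketch (the function names are mine; the weights and subscores are the ones quoted above):

```python
from math import prod

WEIGHTS = {"crypto": 0.05, "int": 0.65, "fp": 0.30}
m1    = {"crypto": 2777, "int": 1591, "fp": 1895}
r5900 = {"crypto": 4219, "int": 1493, "fp": 1903}

def geekbench_style(s):
    # Weighted *arithmetic* mean across the group scores -- Geekbench's
    # approach, which lets the nominally tiny crypto weight swing the
    # total far more than 5% would suggest.
    return sum(WEIGHTS[k] * s[k] for k in WEIGHTS)

def weighted_geomean(s, keys=("crypto", "int", "fp")):
    # Weighted geometric mean over the given groups (weights
    # renormalized), the mathematically consistent way to combine
    # ratio-scale scores.
    total = sum(WEIGHTS[k] for k in keys)
    return prod(s[k] ** (WEIGHTS[k] / total) for k in keys)

print(round(geekbench_style(m1)), round(geekbench_style(r5900)))    # ≈1742 vs ≈1752
print(round(weighted_geomean(m1)), round(weighted_geomean(r5900)))  # ≈1724 vs ≈1691
print(round(weighted_geomean(m1, ("int", "fp"))),
      round(weighted_geomean(r5900, ("int", "fp"))))                # ≈1681 vs ≈1612
```

One formula change flips which chip ‘wins’, which is the whole problem with the aggregate score.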
Unfortunately, many of the workloads beyond just AES are pretty questionable, as many are unnaturally simple. It's also hard to characterize what they do well; the SQLite benchmark could be really good, if it was following realistic usage patterns, but I don't think it is. Lots of workloads, like the ray tracing one, are good ideas, but the execution doesn't match what you'd expect of real programs that do that work.
Note that this is not a criticism of benchmark intensity or length. Geekbench makes a reasonable choice to only benchmark peak performance, by only running quick workloads, with gaps between each bench. This makes sense if you're interested in the performance of the chip, independent of cooling. This is likely why the fanless MacBook Air performs about the same as the 13" MacBook Pro with a fan. Peak performance is just a different measure, not more or less ‘correct’ than sustained.
On the good side, Geekbench contains some very sensible workloads, like LZMA compression, JPEG compression, HTML5 parsing, PDF rendering, and compilation with Clang. Because it's a benchmark over a good breadth of programs, many of which are realistic workloads, it tends to capture many of the underlying facets of performance in spite of its flaws. This means it correlates well with, eg., SPEC 2017, even though SPEC 2017 is a sustained benchmark including big ‘real world’ programs like Blender.
To make things even better, Geekbench is disaggregated, so you can get past the bad score aggregation and questionable benchmarks just by looking at the disaggregated scores. In the comparison before, if you scroll down you can see individual scores. M1 wins the majority, including Clang and Ray Tracing, but loses some others like LZMA and JPEG compression. This is what you'd expect given the M1 has the advantage of better speculation (eg. larger ROB) whereas the 5900X has a faster clock.

(and under Rosetta)

We also have Geekbench scores under Rosetta. There, one needs to take a little more caution, because translation can sometimes behave worse on larger programs, due to certain inefficiencies, or better when certain APIs are used, or worse if the benchmark includes certain routines (like machine learning) that are hard to translate well. However, I imagine the impact is relatively small overall, given Rosetta uses ahead-of-time translation.

(and the multicore metric)

Geekbench doesn't clarify this much, so I can't say much about this. I don't give it much attention.

(and the GPU compute tests)

GPU benchmarks are hugely dependent on APIs and OSs, to a degree much larger than for CPUs. Geekbench's GPU scores don't have the mathematical error that the CPU benchmarks do, but that doesn't mean it's easy to compare them. This is especially true given there is only a very limited selection of GPUs with first-party support on iOS.
None of the GPU benchmarks strike me as particularly good, in the way that benchmarking Clang is easily considered good. Generally, I don't think you should have much stock in Geekbench GPU.

On the Good and Bad of microarchitectural measures

AnandTech's article includes some of Andrei's traditional microarchitectural measures, as well as some new ones I helped introduce. Microarchitecture is a bit of an odd point here, in that if you understand how CPUs work well enough, then they can tell you quite a lot about how the CPU will perform, and in what circumstances it will do well. For example, Apple's large ROB but lower clock speed is good for programs with a lot of latent but hard to reach parallelism, but would fare less well on loops with a single critical path of back-to-back instructions. Andrei has also provided branch prediction numbers for the A12, and again this is useful and interesting for a rough idea.
However, this naturally cannot tell you performance specifics, and many things can prevent an architecture from living up to its theoretical specifications. It is also difficult for non-experts to make good use of this information. The most clear-cut thing you can do with the information is to use it as a means of explanation and sanity-checking. It would be concerning if the M1 were performing well on benchmarks with a microarchitecture that did not suggest that level of general performance. However, at every turn the M1 does, so the performance numbers are more believable for knowing the workings of the core.

On the Good and Bad of Cinebench

Cinebench is a real-world workload, in that it's just the time it takes for a program in active use to render a realistic scene. In many ways, this makes the benchmark fairly strong. Cinebench is also sustained, and optimized well for using a huge number of cores.
However, recall what makes a benchmark good: to correlate reliably with actual or perceived performance. Offline CPU ray tracing (which is very different to the realtime GPU-based ray tracing you see in games) is an extremely important workload for many people doing 3D rendering on the CPU, but is otherwise a very unusual workload in many regards. It has a tight rendering loop with very particular memory requirements, and it is almost perfectly parallel, to a degree that many workloads are not.
This would still be fine, if not for one major downside: it's only one workload. SPEC2017 contains a Blender run, which is conceptually very similar to Cinebench, but it is not just a Blender run. Unless the work you do is actually offline, CPU-based rendering, which for the M1 it probably isn't, Cinebench is not a great general-purpose benchmark.
(Note that at the time of the Twitter argument, we only had Cinebench results for the A12X.)

On the Good and Bad of GFXBench

GFXBench, as far as I can tell, makes very little sense as a benchmark nowadays. As I said for Geekbench's GPU compute benchmarks, these sorts of tests are hugely dependent on APIs and OSs, to a degree much larger than for CPUs. Again, none of the GPU benchmarks strike me as particularly good, and most tests look... not great. This is bad for a benchmark, because it is trying to represent the performance you will see in games, which are optimized to a very different degree.
This is doubly true when Apple GPUs use a significantly different GPU architecture, Tile Based Deferred Rendering, which must be optimized for separately. EDIT: It has been pointed out that as a mobile-first benchmark, GFXBench is already properly optimized for tiled architectures.

On the Good and Bad of browser benchmarks

If you look at older phone reviews, you can see runs of the A13 with browser benchmarks.
Browser benchmark performance is hugely dependent on the browser, and to an extent even the OS. Browser benchmarks in general suck pretty bad, in that they don't capture the main slowness of browser activity. The only thing you can realistically conclude from these browser benchmarks is that browser performance on the M1, when using Safari, will probably be fine. They tell you very little about whether the chip itself is good.

On the Good and Bad of random application benchmarks

The Affinity Photo beta comes with a new benchmark, which the M1 does exceptionally well in. We also have a particularly cryptic comment from Blackmagicdesign, about DaVinci Resolve, that the “combination of M1, Metal processing and DaVinci Resolve 17.1 offers up to 5 times better performance”.
Generally speaking, you should be very wary of these sorts of benchmarks. To an extent, these benchmarks are built for the M1, and the generalizability is almost impossible to verify. There's almost no guarantee that Affinity Photo is testing more than a small microbenchmark.
This is the same for, eg., Intel's ‘real-world’ application benchmarks. Although it is correct that people care a lot about the responsiveness of Microsoft Word and such, a benchmark that runs a specific subroutine in Word (such as conversion to PDF) can easily be cherry-picked, and is not actually a relevant measure of the slowness felt when using Word!
This is a case of what are seemingly ‘real world’ benchmarks being much less reliable than synthetic ones!

On the Good and Bad of first-party benchmarks

Of course, then there are Apple's first-party benchmarks. This includes real applications (Final Cut Pro, Adobe Lightroom, Pixelmator Pro and Logic Pro) and various undisclosed benchmark suites (select industry-standard benchmarks, commercial applications, and open source applications).
I also measured Baldur's Gate 3 running at ~23-24 FPS at 1080p Ultra in one of Apple's talks, in the segment starting at 7:05: https://developer.apple.com/videos/play/tech-talks/10859
Generally speaking, companies don't just lie in benchmarks. I remember a similar response to NVIDIA's 30 series benchmarks. It turned out they didn't lie. They did, however, cherry-pick, specifically including benchmarks that most favoured the new cards. That's very likely the same here. Apple's numbers are very likely true and real, and what I measured from Baldur's Gate 3 will be too, but that's not to say other, relevant things won't be worse.
Again, recall what makes a benchmark good: to correlate reliably with actual or perceived performance. A benchmark might be both real-world and honest, but if it's also likely biased, it isn't a good benchmark.

On the Good and Bad of the Hardware Unboxed benchmark suite

This isn't about Hardware Unboxed per se, but it did arise from a disagreement I had, so I don't feel it's unfair to illustrate with the issues in Hardware Unboxed's benchmarking. Consider their 3600 review.
Here are the benchmarks they gave for the 3600, excluding the gaming benchmarks, which I take no issue with:
- 3D rendering
- Compression and decompression
- Other
(NB: Initially I was going to talk about the 5900X review, which has a few more Adobe apps, as well as a crypto benchmark for whatever reason, but I was worried that people would get distracted with the idea that “of course he's running four rendering workloads, it's a 5900X”, rather than seeing that this is what happens every time.)
To have a lineup like this and then complain about the synthetic benchmarks for the M1 and the A14 betrays a total misunderstanding of what benchmarking is. There are a total of three real workloads here, one of which is single-threaded. Further, that one single-threaded workload is one you'll never realistically run single-threaded. As discussed, offline CPU rendering is an atypical and hard-to-generalize workload. Compression and decompression are also very specific sorts of benchmarks, though more readily generalizable. Video encoding is nice, but this still makes for very thin pickings.
Thus, this lineup does not characterize any realistic single-threaded workloads, nor does it characterize multi-core workloads that aren't massively parallel.
Contrast this to SPEC2017, which is a ‘synthetic benchmark’ of the sort Hardware Unboxed was criticizing. SPEC2017 contains a rendering benchmark (526.blender) and a compression benchmark (557.xz), and a video encode benchmark (525.x264), but it also contains a suite of other benchmarks, chosen specifically so that all the benchmarks measure different aspects of the architecture. It includes workloads like Perl, GCC, workloads that stress different aspects of memory, plus extremely branchy searches (eg. a chess engine), image manipulation routines, etc. Geekbench is worse, but as mentioned before, it still correlates with SPEC2017, by virtue of being a general benchmark that captures most aspects of the microarchitecture.
So then, when SPEC2017 contains your workloads, but also more, and with more balance, how can one realistically dismiss it so easily? And if Geekbench correlates with SPEC2017, then how can you dismiss that, at least given disaggregated metrics?

In conclusion

The bias against ‘synthetic benchmarks’ is understandable, but misplaced. Any benchmark is synthetic, by nature of abstracting speed to a number, and any benchmark is real world, by being a workload you might actually run. What really matters is knowing how well each workload represents your use-case (I care a lot more about compilation, for example), and knowing the issues with each benchmark (eg. Geekbench's bad score aggregation).
Skepticism is healthy, but skepticism is not about rejecting evidence, it is about finding out the truth. The goal is not to have the benchmarks which get labelled the most Real World™, but about genuinely understanding the performance characteristics of these devices—especially if you're a CPU reviewer. If you're a reviewer who dismisses Geekbench, but you haven't read the Geekbench PDF characterizing the workload, or your explanation stops at ‘it's short’, or ‘it's synthetic’, you can do better. The topics I've discussed here are things I would consider foundational, if you want to characterize a CPU's performance. Stretch goals would be to actually read the literature on SPEC, for example, or doing performance counter-aided analysis of the benchmarks you run.
Normally I do a reread before publishing something like this to clean it up, but I can't be bothered right now, so I hope this is good enough. If I've made glaring mistakes (I might've, I haven't done a second pass), please do point them out.
submitted by Veedrac to hardware [link] [comments]

iPad Pro - The future is now

https://preview.redd.it/c7lrdu8mk0961.jpg?width=4032&format=pjpg&auto=webp&s=2c02f62507c016770c08e9f3bcbc9554021304f2
I’ve given this a little bit of thought and wanted to write a post with a bit of depth to how I use my iPad Pro, ideas for a better setup and what I would like in a newer version of iPad Pro.
When the iPad Pro redesign was released in 2018 I had a difficult time choosing between the iPad or the new MacBook Air. I had my sights on the MacBook Air because I thought I would get more utility out of it. I was wrong..
-THE SETUP-
https://preview.redd.it/afxxyf15l0961.jpg?width=3564&format=pjpg&auto=webp&s=73230b88bfe948c54c4e0558b1e7b15933c24bed
I had the 2018 iPad Pro until I sold it to my girlfriend and upgraded to the new 2020, 12.9” iPad Pro. This is also the cellular model which I have AT&T cellular service attached to. Now, as of writing I do have the normal accessories such as the Magic Keyboard case and Apple Pencil. Another accessory I do use is my LG ultra-wide monitor that I plug my iPad into. Right now it’s in the box in the closet since my work from home iMac takes up a majority of my desk space. You may be wondering why an ultra wide? Well.. I have an app for that!
-FULL DESKTOP/SHADOW-
https://preview.redd.it/gcpvmdcsk0961.jpg?width=2990&format=pjpg&auto=webp&s=7a6155863aa78c9f9c3e62d9d11bec42f99fb8b1
As I said before, I had trouble choosing between the iPad Pro and the MacBook Air at the time. I wanted a full desktop with “Pro” apps. I now use a service called “Shadow”, which is a Windows PC streaming service. It works beautifully on the iPad Pro: it takes advantage of the iPad Pro's native screen resolution and has full support for the Magic Keyboard case. So I have a desktop Windows computer where I can play my games (Destiny 2) or do photo/video editing almost anywhere there is an internet connection.
Not only does Shadow work with the iPad Pro screen resolution but you can plug your iPad into a monitor or tv and have the iPad as the device receiving the PC stream at full resolution even on a massive 65” TV. This has been a game changer for when I want to play games like Destiny 2 on my living room TV with my DualShock 4 controller with maxed out graphics settings. Then when I’m done I just grab my iPad and go!
Here are some tiny details that I’ve found when experimenting throughout the years with Shadow.
  1. You can do PiP on the iPad while you are using Shadow. You can’t do this with services like GeForce Now or Stadia.
  2. Shadow works fairly well with 2 bars of AT&T LTE service in the congested part of the city I live in without too many issues. There can be choppiness every few minutes for a second or two but overall it has been pretty great.
  3. The Shadow Beta client in TestFlight has support for Apple Pencil. With this you could use Photoshop on your Shadow client and it could essentially act like a Wacom tablet.
  4. You do have full trackpad and keyboard support in Shadow. So you can play games like WoW if you wanted!
-OTHER INTERESTING BITS-
This is going to sound like a sales pitch but.. DO YOU DISLIKE ADS? Especially when watching YouTube? Delete the YouTube app from your iPad, download an ad blocker and save the YouTube website as a home screen bookmark. No ads, PiP support. NEAT.
https://preview.redd.it/pp3bz18xk0961.jpg?width=3024&format=pjpg&auto=webp&s=bd684c87288b593c8b9a4b943905de864918abba
I do a lot of game streaming on my iPad. I do tend to use Stadia and GeForce Now on iPad to hurry into those high end games when I don’t have a lot of time. Stadia and GeForce Now also works with the DualShock 4 and Xbox One controllers. The only thing is that Shadow is the only app/service that uses the rumble feature in these controllers at this time.
https://preview.redd.it/vtzpxqy0l0961.jpg?width=2105&format=pjpg&auto=webp&s=d3b9ce6f6287809f541e16a8c61f561631aeda72
There are apps such as Shortcuts and Scriptable that let you automate tasks such as changing your wallpaper to align with the time of day or current weather conditions. You can use the Scriptable app to make your own weather widget, calendar widget or almost anything else you can think of. Yes you may have to do a bit of research but once you break that barrier these apps are a ton of fun to use and tinker with to customize your iPad even more.
-WHAT I WANT-
With the new iPad Pro refresh rumored to be right around the corner I am hoping for a few upgrades.
  1. 5G. For gaming streaming and Shadow PC to be more fluid.
  2. Mini-LED or micro-LED. Who doesn’t want deeper blacks and potential battery savings from this tech?
  3. Bigger or more efficient battery. The new M1 MacBooks have stellar battery life. Let’s get close to that efficiency on iPad somehow?
  4. BIGGER display. This one is just out there. A 16” iPad Pro... That would be wild.
-SUGGESTIONS?-
I’ve never made a post like this so sorry for the crazy formatting. I just wanted to get this out there and out of my head. There is a lot more I can say/would like to share but I’d like to hear from everyone. Do you have any cool use cases for your iPad? What’re you hoping for in the new iPad Pro? What would you like in the new iPadOS software?
submitted by TyCox to ipad [link] [comments]

Game benchmarks for M1 Mac mini with 8gb ram inside!

I picked this up and am now busy running various games. I will be posting my thoughts, results, screenshots, and settings used below. When I test games, I strive to reach a balance of reasonable FPS for the genre with decent graphics settings. I don't strive for a 60 FPS lock unless it's an FPS, MOBA, or action game. One more thing: in most cases I recommend turning on v-sync, especially if your FPS is north of your monitor's refresh rate (typically 60 Hz). It makes things a lot smoother in my experience. Finally, the built-in screenshot tool accounted for 1-5 FPS depending on the game, so your actual FPS will be slightly higher.
*UPDATE*
<11/23> I have some bad news. I decided to pull the plug on this project. Being a detail oriented person, I noticed I spend very high amounts of time in my benchmark and optimization process. And that's not even taking editing and narrating into account for the YouTube thing. And being a perfectionist, I won't settle for subpar content. To keep this up, along with other multiple real life activities I partake in, is simply unrealistic. In fact, I'm already quite burnt out, which is starting to affect my real life responsibilities. Apologies for the disappointment - I hope you eventually find what you're looking for! And thank you for all your support and kind words. They really helped me with this CVS receipt of a post.
------------------------------------------------------------------------------------------------------------------------------------------------
*PERFORMANCE TESTS*

iOS Apps: unable to resize or fullscreen windows. Default size is tiny on my desktop (3840 x 1600). For some apps, this means practically unusable. I can't imagine running iOS Civ 6 or XCOM 2 like this. Screenshot on Roll for the Galaxy attached for reference.
Steam: Runs...laggy via Rosetta. If you have a lot of games, scrolling your library has significant lagginess involved and also jumps around. Not sure if Valve will address this. Edit: tried “Disable GPU acceleration and smooth scrolling” and it's fixed!
Civ 6 (Steam): This mini is a silent beast! Fan didn't even kick on wow... Late game using built in benchmark gives me 30 fps minimum 54 fps maximum. Detailed settings used attached. Onto next game!
This War of Mine (Steam): Starts loading then crashes to desktop...
Offworld Trading Company (Steam): This game is a niche favorite of mine. Notorious for being poorly optimized and hard to run. Averaged 23-24 FPS using decent settings - which I consider really good for the game. Fan dead silent. Is my fan broken?? I'm very impressed.
The Long Dark (Steam): Minimum 46 FPS, maximum 76 FPS indoors. Settings and resolution in attachment. Fan still silent. I touched my Mac and some parts of it are cold??? Post FX settings and resolution had the biggest effect on performance. V-Sync doesn't seem to cap FPS...
League of Legends: Runs extremely well. Although v-sync is turned on, it doesn't seem to cap the FPS (also noticed this in The Long Dark) - I was getting FPS in the 60's. Settings posted below. The built-in screenshot tool wasn't working, so the shot is via phone. The lobby client doesn't seem to detect that you finished a game when you exit; you may have to restart the client between games.
Dota 2: This one was actually tricky to find optimal settings for. I was aiming for 60+ FPS with some eye candy during moderate action. Although it can play at higher resolutions with more sparkle, the FPS suffers. In the end I resorted to the settings attached in the screenshot. Also, using the Vulkan API still seems to make a positive difference over OpenGL in the game settings.
Cities: Skylines: I didn't have an already-built city to test, and it was taking forever to grow my village to a decent size - so only early game testing was done. Also, only one 21:9 resolution was available, which means I couldn't tune down resolutions within the 21:9 ratio. I did three separate tests: one at a 16:9 resolution and two at 21:9. The game ran decently well at the fastest game speed, but I assume late game will bog it down somewhat. Detailed stats below.
Stellaris: Starts loading but hard freezes at 5%. Music still plays and the mouse is movable, but Command-Q and Command-Tab both stop working. I had to force restart the mini to exit. Edit: apparently this is a known issue. I will retest next week on the M1 MacBook / 16 GB (already uninstalled - this machine is temporary)
World of Warcraft: Ran natively. Performance was insane! Screenshot tool had some impact on this game, but I made note of it in the comments. It also hard froze once while changing settings. I'm level 5 so I didn't get to bench massive fights. One of the few games I can max pretty much everything at 3840 x 1600 and still get 70+ FPS. I actually turned that down in benchmark images to account for bigger fights. Best part? Dead silent. Cold to the touch at some parts. :)
CS:GO: Does not make it to the main screen. Entire screen is black.
FTL: Runs! No screenshots or FPS testing - was strictly for compatibility.
Into the Breach: Runs! No screenshots or FPS testing - was strictly for compatibility.
Starcraft 2: The performance difference between medium and low shaders was dramatic. I tried to keep medium because it was notably better (realistic shadows, lighting, etc). In the end, given the game's competitive nature and propensity for bigger engagements, I lowered down the settings. In very early stages of 2v2 I averaged 90-110 FPS. Medium stage was in the 60's. Brief dips to 40-50's occur sometimes. I imagine late game will run mostly near high 40s to 50s. I also noticed M1 has some sensitivity to post fx more than most graphics cards. During the time I used medium shaders, looking at Protoss spawning graphics took away 10-20 FPS. I also read that online play against real people will force 16:9 ratio for 'fairness.' I feel Blizzard's logic is a bit flawed, but I can see how they arrived at this. (reminder: screens are now on Imgur due to Reddit image limit)
Diablo 3: Low 60's to high 80's FPS depending on activity. Again, I recommend turning on V-Sync; it'll make your gameplay smoother. Images posted on Imgur. Settings weren't maxed, but they were still decent - I turned them down because I was aiming for 60+ FPS regardless of activity level.
CK2: Getting the error message, 'You do not have permission to open the application' despite being an admin user. The googled solution of manually changing executable read/write setting did not resolve the issue. Edit: Information is flying around that a beta Mac update fixes the issue. I will reinvestigate!
Hollow Knight: 3840 x 1600, maxed everything and runs at 100 FPS without v-sync. I recommend leaving v-sync on when playing. The excessive FPS (my monitor is 60 hz) was causing choppy gameplay if I left it off.
Kerbal Space Program: I don't have much experience with KSP, but after seeing how playable this was, I'm going to address that. First, I wanted to see how it would run with default settings and from orbit. The resulting graphics were gorgeous with acceptable FPS, but I knew it wouldn't cut it for all scenarios. I still took a picture because it was amazing. Second scenario I loaded confirmed my suspicions, so I worked on fine tuning the settings. After much tinkering, I found a setting I was happy with that worked with above decent frames for most (all?) scenarios. Screens were taken of these too. Even if you don't play KSP, please check the first picture - it's amazing.
Minecraft: Ran extremely well at full resolution (3840 x 1600) with high settings. I could not set the graphics to 'fabulous' because the game would crash. I increased the render distance to 14 chunks because it was performing so well. Low's were in the 60's and high's were in the 80's. This was from flying high above and zooming around to see if I could tax the system. It didn't turn on the fan. In fact, I still don't know what my fan sounds like. I expected the Mac to download java, but it seems like it was already installed. I haven't tried to fancify it further via 3rd party graphics plugins yet. Screens on Imgur, as noted below.
Minecraft with fancy shaders:
Factorio: Game was benched using a 'megabase' (extremely large factory) with no mods. Default settings ran decently considering the huge amount of things going on. Lowering the sprite resolution from high to normal was necessary to hit average FPS of high 30's. Resolution was set to 3840 x 1600 and view was zoomed all the way out. I tried to set the view on a busy part of the map. Images are on Imgur (find Waldo!) and save file is here (Reminder: you have to right click image on Imgur and open in new tab to view the full resolution): https://www.reddit.com/factorio/comments/gely3v/20000_science_per_minute_hybrid_modular_megabase/
------In an effort to reduce redundancy, games below this line will only receive previews before its video counterpart-----
Dolphin: Dolphin is a Wii emulator and the performance depends on a lot of things, including the game being run. I decided to choose a relatively demanding game, Super Mario Galaxy 2. While it's possible to run this well with minimal quality, I wanted the most optimal setting possible for the mini, even on the more demanding scenes. (60 FPS target, with as much sparkle I can enable) In an effort to reduce redundancy and confusion, I won't cover all the variable changes, notes, and warnings here. Some settings were not as they seemed, and I don't want to confuse users new to advanced emulation. But gameplay screens have been posted to show you what you can expect running with an optimal, performance-oriented setting.
XCOM 2: The 16 GB M1 MacBook arrived sooner than anticipated, so I benchmarked this game on that system. I was anxious to test one of my favorite games, and I may have spent a little too long trying to optimize this. XCOM 2 is infamous for being poorly optimized, so I expected it to run poorly - especially the 'Lost' levels where hordes of zombies surround you. Sadly, my expectations proved correct. This has been one of the worst performing games so far. Still playable, but I managed around 25-30 FPS on that level. Screenshots show low 30's, but that's due to the camera being held static. During explosions it would drop even further, sometimes to 15-19. I also heard the M1's fan spin up and felt it get warmer for the first time - so it's reasonable to infer that the fanless MacBook Air would have a harder time running this for extended durations. Resolution was 1440 x 900. Although performance could be improved a bit by lowering the resolution even further, I consider 900p the absolute minimum for pleasant gaming. FPS, though important, is not everything that makes a game 'feel good' - even for a turn-based game. It's the balance of screen resolution, FPS, and the different quality settings that forges the feeling we all desire. Screens have been posted on Imgur, and I intend to go more in-depth in the follow-up video.

------------------------------------------------------------------------------------------------------------------------------------------------
Crossover Performance: I made a separate section for Crossover because it requires additional tinkering with the Wine environment - sometimes on a per-game basis. AFAIK, this is currently the only way to run x86 Windows games. I will try to explain what settings I used on all fronts to guide new users.

------------------------------------------------------------------------------------------------------------------------------------------------


iOS app Roll for the Galaxy. Unable to resize or fullscreen
Civ 6 - minimum 30 FPS
Civ 6 Tried to catch Max. Saw 54 FPS
Detailed settings used via fullscreen
Offworld Trading Company - early/Mid game, averaged 23 FPS
Detailed settings. Changing them didn't matter much for this game. (Game known for being unoptimized and hard to run)
The Long Dark - indoors FPS 76 FPS
Outdoors FPS 46 FPS
Settings part 1
Settings part 2
League of Legends - Screenshot tool unresponsive. FPS in the 60's. V-Sync doesn't seem to cap FPS.
Detailed settings. V-Sync was turned on. If you get a black screen while changing settings, tabbing out and back in seems to fix it.
Dota 2 - Had to lower quality to achieve near 60 FPS stable during significant action.
Detailed settings. Note the Vulkan API.
Cities: Skylines - early game. Don't mind my village lol
Detailed 21:9 settings. Only one 21:9 resolution available so I had to lower most other settings down to keep the resolution.
2nd try with even lower settings. Not much difference - at least on the early game stage
Setting used for 2nd try
3rd try, this time using 1920 x 1080. It runs in windowed mode because fullscreen would stretch to fill my entire screen. Windowed mode may have a slight negative effect on performance.
Detailed 16:9 settings
Ran into the Reddit image limit. The rest of the images are posted on Imgur. Also, using mobile to access an Imgur user's post page does not work... You need a computer browser, unfortunately.
https://imgur.com/useKiPhish/posts
If you want the full experience of every pixel (I run a high resolution), you'll have to right click the image on Imgur and open it in a new tab.
Edit: Also, thank you for the gold and all the awards! And a plat??? Wow, what do I do with all these?
submitted by KiPhish to macgaming

YSK: If turning it on and off again doesn't solve your tech problem, a PROPER google search will almost always get you the answer on the first page.

Why YSK: Knowing how to search for an answer can save you hundreds of dollars paying someone else to solve your problem and reduce downtime waiting for it to be fixed. A majority of tech problems can be solved using a little bit of googling.
I used to sell and repair electronics (computers, laptops, tablets, phones, printers, etc.) for a couple of years and was baffled by how much people would spend because they refused to learn how to search for the answer themselves. It wasn't rare for a customer to have a problem that I could solve with 5 minutes of google-fu. Especially now, as money is tight for a lot of people and working from home is becoming a big thing, this is probably more relevant than ever.
Since how you should google depends on the problem this is more going to be a list of general guidelines so some may not apply to every type of problem.
  1. ALWAYS begin or end your question with the model of device the issue is occurring on. Examples: "HP Spectre x360 13-aw0090ca no audio input", "How to download pictures Samsung S8", "Epson WF-7820 black streaks when printing", etc.
  2. If there is an error code, for the love of everything holy do not click away from it without reading it. I have seen many people bring machines to me, show me what's wrong by replicating the problem, and then immediately close the popup with the error code that says exactly what the issue is. Closing pop-up windows is a reflex for most people, so try to resist it.
  3. If you have an error code type the code along with the device model (a la point 1). You'll almost always find an answer that way. Example: "Epson WF-2320 how to fix Error code 0x97"
  4. If you built or worked on your device yourself and it is malfunctioning utilize resources like Tom's Hardware No Boot Guide or forums like Linus tech tips for more specific advice.
  5. If you have to repair or replace a physical part of your device IFIXIT has guides on almost every popular device that include step by step instructions, pictures and links to buy repair kits if need be. Search "DeviceName DesiredRepair ifixit" in google to find what you are looking for. Examples: "Macbook Pro 2013 Battery Replacement ifixit" "samsung s5 screen repair ifixit" "Nintendo Switch Joycon drift fix ifixit"
  6. If google doesn't yield results, try YouTube. My butt has been saved many times by finding a fix in a YouTube video. This is also better for visual learners.
  7. If you have a software issue always begin or end your search with your operating system. Windows: Windows 10, Windows 7. Mac: MacOS or MacOS VersionName | Examples: "100% cpu usage Windows 10", "Change monitor resolution MacOS", "iMessage not syncing MacOS Mojave"
  8. If you are having an issue with a certain program, always include the program name at the beginning or end of your search. Examples: "Microsoft Outlook emails won't load", "Microsoft Word can't edit document", "No microphone input Macbook Air Zoom"
  9. If you don't understand technical terms (like cpu, gpu, post, ram, etc) in the answers you find spend a couple minutes googling the terms to learn what they mean and how they work.
  10. If you are getting lots of results that include stores or etailers, add "-store -buy -new -amazon -ebay" to the end of your google search. That will remove most product listings from your results. Just make sure not to include the quotation marks.
  11. If you're having issue with a device like a printer, scanner, smart speakers, or a lot of other non-computer devices, looking up the manual (also called a User's Guide) can solve a lot of problems. Most devices only come with a paper "Quick Start Guide" not a full manual. Most devices, especially printers, have EXTREMELY detailed manuals which address almost every imaginable problem and error code you could encounter and how to fix them, along with how to set up and tweak every part of the device. The best way to search for this is to search "DeviceName Manual .pdf" Examples: "Epson WF-100 manual .pdf", "HP 3830 user guide .pdf", "Bose home speaker 300 manual .pdf"
  12. If you aren't sure what a port or cable is called these two guides Here and Here show just about every type of cable or port you'll run into. Knowing the name of the port or cable can change your search from an ineffective "blue rectangle port doesn't work ModelName" to "USB 3.0 type A port doesn't work ModelName". The latter is much more effective.
  13. Adding "guide", "walkthrough", or "fix" to your search can help narrow down the results if you aren't getting relevant ones.
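The tips above boil down to a repeatable recipe: device model first, then the problem, then optional store exclusions. Here's a toy sketch of that recipe in Python (the function name and structure are mine, purely for illustration):

```python
def build_search_query(model, problem, exclude_shopping=False):
    """Compose a search query per tips 1 and 10: lead with the device
    model, then the problem, optionally excluding store listings."""
    query = f"{model} {problem}"
    if exclude_shopping:
        # Tip 10: strip product listings from the results
        query += " -store -buy -new -amazon -ebay"
    return query

# Tip 3's example, with the shopping exclusions from tip 10 added:
print(build_search_query("Epson WF-2320", "how to fix Error code 0x97", True))
# -> Epson WF-2320 how to fix Error code 0x97 -store -buy -new -amazon -ebay
```

The same pattern works for tip 11's manual lookups: just pass `"manual .pdf"` as the problem string.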
If I am missing anything else let me know and I'll add it to the list
Edit: Adding more suggestions from comments below
Zip is a common file format. Zip is often used in X, Y, Z. Opening zip may result in X. To open zip use our FREE super-fileconverter-not-a-virus.exe. Download here!
or
MOV is a common video format used by Macintosh systems. To convert MOV to x, y or z, download our FREE NotAVirusConverter at link
Off topic pro tip: If you really want a program to open .zip or .rar files that isn't built into Windows or MacOS, there are only three good options I've found. Windows: 7Zip, WinRAR | MacOS: Keka
submitted by Canuck0987 to YouShouldKnow

Day 2 of M1 MacBook Air (Basemodel)

Today I was mainly tracking battery life while doing some video editing, watching YouTube, downloading files, and updating macOS.
These tests were all run at full brightness!
First thing I did was watch YouTube from 8:19-9:28. The videos were mostly 1440p at 60fps with a few at 1080p 60fps; playback was smooth and didn't stutter at all. The MacBook lost only 7% of battery over a 49 minute period. That's 0.14% a minute! The MacBook was very cool during this time.
Moving on to a more strenuous task: video editing in Premiere Pro running under Rosetta 2! I ran this from 9:28-11:24. I was editing 1080p 60fps footage with colour correction and effects applied. I also had 1-7 Safari tabs open at a time. The MacBook lost 31% of battery over a 1 hour and 56 minute period! That's 0.27% a minute. The MacBook was warm-ish and was able to scrub and play back perfectly.
From 12:58-13:16 I was updating my MacBook to macOS 11.2 beta 1. It failed at first, but a restart fixed the issue. After trying the update again I went out; the update took (an estimated) 35 minutes including the download. When I arrived back home, the MacBook had lost 11% of battery.
The MacBook was idle between 13:57 and 16:23 in this time it lost 4% of battery.
In the last few tests I did, Epic Games was running in the background installing Fortnite (bear this in mind as Epic Games is quite battery consuming)
I continued some more 1080p 60fps video editing in Premiere Pro from 16:23-17:32. At first, video scrubbing and playback were perfect with effects and colour corrections, but after around 25 minutes it started to stutter and occasionally needed rendering for playback. Creative Cloud was running in the background installing After Effects and Media Encoder. Safari was also open with 4 tabs. The MacBook lost 29% of power while remaining moderately warm. It was losing around 0.57% of battery a minute at this point.
And then finally, I was once again watching YouTube from 17:33-18:12. Perfect playback, and the MacBook cooled within 5 minutes of stopping video editing. I kept Epic Games open during this time too. The MacBook lost 13% of battery while remaining cool to the touch - around 0.32% of battery every minute at this point!
I stopped the test at 10% battery as I didn't want the Fortnite install to stop (look out for the benchmarks on that tomorrow!)
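For anyone wanting to reproduce the per-minute drain figures above, the arithmetic is just percent lost divided by session length. A minimal sketch (using the two sessions whose reported durations and rates line up):

```python
def drain_rate(pct_lost, minutes):
    """Battery percent lost per minute over one session."""
    return pct_lost / minutes

# YouTube session: 7% over the reported 49 minutes
print(round(drain_rate(7, 49), 2))    # 0.14 %/min
# First Premiere Pro session: 31% over 116 minutes (1 h 56 min)
print(round(drain_rate(31, 116), 2))  # 0.27 %/min
```

At 0.14%/min, a full charge of light YouTube playback would extrapolate to roughly 11-12 hours.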
If you made it to the end, I hope you enjoyed reading and found it useful :)
submitted by ShadowStonk to macbookair

Apple M1 MacBook Air review from a dedicated PC user

Updated 11/30/2020 for 8GB versus 16GB questions. See end of review for the updates.

Before I delve into the actual M1 MacBook Air review, some background. I've worked in the PC industry since the early 80's. I remember the 1985 Apple vs Microsoft lawsuit born of Apple's significant anger over Windows. I could go into great detail on all the changes that have occurred since then, but outside of supporting company Apple users on network environments over the years, my primary desktop focus largely remained the PC. Besides a formal job as a network engineer for several worldwide organizations, I had a strong hobbyist passion for photography and eventually videography.
Fortunately there's a very good non-linear video editing solution from Blackmagic called DaVinci Resolve. As Blackmagic is a hardware vendor of mid-range to high-range videography gear, DaVinci Resolve was released in both a free version and the paid Studio version; purchase of their cameras comes with a Studio activation key. With a significant number of feature-length production films created in DaVinci Resolve, along with sales of dedicated control surfaces for the environment, it was beneficial to provide a free version to whet the appetite of the newly introduced video enthusiast. This led me to put together a fairly powerful PC at the time to edit what was then an astounding 1080p workflow. Regrettably, I'd sized the VRAM in my graphics card below what I'd need for a 4K workflow and beyond. After purchasing a Panasonic Lumix GH5, a V-Log L license, and related gear, I'd frequently need to send my footage to friends with newer systems to process. I finally had enough and started looking at putting together a new workstation and the resulting costs. With kids still in high school and learning to drive, I didn't have the budget I'd prefer for a PC workstation that could extend to 6K if I used anamorphic lenses on the GH5 (couldn't afford both an expensive anamorphic lens and a new workstation).
Fast forward to November 2020 and the release of the Apple M1 silicon, which arrived just as I had started my planning. While researching DaVinci Resolve hardware requirements, I began to see a number of new M1 videos showing DaVinci Resolve running on a base model M1 Mac mini with 8GB of RAM. They were working timelines - including ones with RED 8K RAW exports and other workflows that would produce out-of-GPU-memory errors if I'd even attempted them on my own workstation. And for $699 USD? What magic is that? My skepticism was extremely high, and I also had the hurdle of not being an Apple user.
Time to actually try it out, then, before committing to the eventual budget damage of a higher-end PC workstation for uncompromised 4K and 6K workflows. The model of Apple M1 MacBook Air tested is the 8-core CPU / 8-core GPU with 8GB of RAM and 512GB of SSD storage. I figured that to ensure the best performance without committing to the more expensive 16GB model, the one with the 8-core GPU was a good selection.
Packaging experience:
PC packaging may be pretty, but it also carries a large amount of specifications and other details to impart to the prospective purchaser. Apple, expecting you to already know those details from their presentations and the press releases of the new product, delivers a much more simplified packaging experience. It also seems the iPhone packaging informed the design decisions, as there is a simple elegance here.

https://preview.redd.it/q19lj0szyt161.jpg?width=5184&format=pjpg&auto=webp&s=b7b3baf400b5475c7fe8ffd58d488070caa1c917
Opening the packaging reveals the beautiful (Gold in this case) MacBook Air and associated accessories included, charging cable and charger.

https://preview.redd.it/np2dm2u5zt161.jpg?width=5184&format=pjpg&auto=webp&s=83cc0faea7398ead4d0b66c6a2397b48f20881ac
This minimalist packaging design struck me, as I'm used to an almost overwhelming amount of detail and hype on PC laptop packaging. PC internals are also usually cushioned in less eco-friendly styrofoam that looks very utilitarian. The 100% recyclable materials, elegant in their design, make for a refreshing unboxing experience.
Start up:
Oh my, simply lifting the lid started the setup experience and I was welcomed with the Macintosh sound I remember from 1985. Too cool! But more importantly, the process from start to completion was extremely easy and also extremely fast. On new Windows 10 workstations, there's this very long start up process for the first time that always seems to be longer than necessary, although future starts of the Windows 10 workstations are pretty quick with the use of SSD drives these days.
There's rice paper or some other type of paper protection on the display; once you peel it off, the True Tone display is like "BAM".
I've used a number of Lenovo and Dell laptops over the years. The keyboard on the new M1 MacBook Air is extremely comfortable to use and has surprisingly crisp and responsive keys. I was worried about the mushy experience you'd encounter on non-Apple products, but after using the keyboard for several days, I can clearly state that fear was unfounded with the M1 Air.
Now on to installing my applications. I have cloud subscriptions to a number of products, from Microsoft Office 365 to Adobe Creative Cloud and more. Most of these applications are not yet native or "Universal" applications, meaning they are coded for the Intel-based Macs. While I knew that Rosetta 2 translation was available for the new ARM-based architecture (https://en.wikipedia.org/wiki/ARM_architecture for those that like to dive into the details of the ARM world), I was not expecting the performance I encountered. After all, while not a Mac user previously, I remember the frustration of those who moved from PowerPC-based Macs to Intel Macs and the translation performance hit. Rosetta 2 apparently does the translation before you start using the application - this is why the icon bounces longer on first startup - but once running, an application will frequently run faster than the same application on an equivalent Intel-based MacBook. That's pretty astounding; I'd never experienced a translation layer that was anything but slower than the prior environment.
On to some native application testing. Blackmagic released the beta of DaVinci Resolve 17.1; if you are also trying it, be sure it's the build specifically for the M1. The normal download page doesn't link to the M1 Apple Silicon version - you need to find the latest beta release for the M1 from the support page instead.
I use the Studio edition, so here are the details from the install. What's really incredible to me is how self-contained the actual application is. On my PC it would take significantly longer to install than on the Mac, and a massive number of .dll and other file types would be installed. I tested installing benchmark apps and removing them: uninstalling is a single file dragged to the trash. That's quite a difference for a PC user.

DaVinci Resolve 17.1 beta 2 for Apple Silicon
Performance is the primary reason I was testing the M1 MacBook Air: a full 4K timeline with titles, music, and color grading. On my older workstation I'd have to work in a 1080p timeline, even though the media was 4K 10-bit 4:2:2 HLG from my Panasonic GH5. With the 8GB of RAM, I was able to run the full 4K timeline with no reduction in quality. Not only that - no proxies (my older system needed 1/2-resolution proxies), no optimized media, and I was able to work with the 4K timeline with no stuttering. Unlike other videos, I want to be clear that I was not editing in ProRes. I then color graded with the well-known Leeming LUT for the GH5 and HLG. Still totally smooth.
Next I wanted to process some images and put them into the video as well. Just to see if the memory efficiency was hype, I loaded Adobe Lightroom Classic at the same time! This application is still Intel x86, and Rosetta 2 installed itself before the application ran. While the 4K timeline played without stutter, I was able to apply a series of edits to a photo and even export it. Check it out!

DaVinci Resolve 17.1 M1 edition playing timeline (see red play) with multiple 4K 10-bit 4-2-2 HLG clips along with me editing a high resolution photo in Lightroom Classic
Oh my word!!! I cannot do this without stuttering - even running each application individually - on my PC workstation with an i7-6700, 32GB of RAM, and an Nvidia 1050 with 4GB of VRAM. Let that resonate for a minute or two... A MacBook Air with only 8GB of memory does it with no hesitation, an amount that wouldn't even let both applications load, let alone work well, on my PC if I didn't have 32GB of RAM.
In my testing, I completely lost my mind! For decades I'd been a staunch PC or nothing enthusiast and if this is the state of the Apple universe with the new Apple Silicon I'm now a strong convert.
One concern was left. A number of tech reviewers said that the MacBook Pro would be the better choice because of its fan, and there were a lot of concerns about the Air thermal throttling, especially given the history of the Air series. As a non-Mac user I hadn't been aware of that concern and jumped into the completely silent Air. Apparently if you run Cinebench R23 in a very long loop (a throttle test), you can see that it does throttle somewhat. But interestingly, the throttling is less than you'd expect, and the Air seems able to dampen the thermals quickly enough that performance doesn't drop much further. In fact, testing that was primarily GPU- rather than CPU-bound didn't appear to trigger throttling in my experience. I certainly didn't experience it while working with DaVinci Resolve for several hours as I tried a number of the different functions and explored Mac and PC version differences (not many - it's primarily Windows versus macOS differences).
In summary, once DaVinci Resolve 17.1 moves out of beta, I'll be using this new Apple M1 MacBook Air as my primary workstation for 4K and higher editing workflows. I'm also excited to see the eventual "Universal" versions of my other applications and what additional performance I'll possibly see once those applications are native to the M1 rather than Rosetta 2 translated.
If you also happened to be a former PC user that was encouraged by the new M1 series of computers, I'd love to hear about your own experiences in the thread below.
As for those that have a problem deciding on the memory capacity and have the extra budget available, by all means purchase the 16GB versions of the M1 architecture. I'm sure that will help ensure even greater longevity for the platform. As for myself, I figure when the 8GB no longer meets my workflow needs, the next couple of M series processor revisions will have been released by then and I can get the next inexpensive model to upgrade to.
8GB ram testing and thoughts on capacity regarding DaVinci Resolve 17.1 beta for Apple Silicon.
Several have asked me since I originally posted this review for my thoughts on the 8GB vs. 16GB versions of the Apple M1 MacBook Air. Here's some testing I performed to answer that question - and the answer does carry an "it depends" viewpoint.
First off, I'd been using 4K 10-bit 4:2:2 HLG clips that I color grade with the Leeming LUT for Panasonic GH5 HLG (it seems to have less noise to me than V-Log L, but that's a conversation for a different thread). In addition, I have titles and Fusion features throughout. What's interesting is that during my testing I didn't have any slowdowns in timeline playback or when moving around the Cut page. I opened Activity Monitor to see what was happening, and as you can see, DaVinci Resolve is using more memory than the laptop has physically installed. The M1 system handles this quickly and deftly enough that I don't notice any performance degradation, and memory pressure stays in the "green", so to speak. However, this may be an area where those doing much more professional projects with longer timelines may want to look at the 16GB model. I personally believe the 8GB can handle it fine, but the question is: with frequent swap writes to the built-in SSD, how soon could one reach the drive's TBW (terabytes written) rating? Once you reach that, the drive will eventually fail, as TLC (likely Western Digital SSD chips, as shown in the iFixit teardowns of both an M1 MacBook Air and an M1 MacBook Pro) and other NAND-based SSD technologies can only write to a cell a set number of times before it fails (although current tech is 3000 times or more per cell). SSD makers allow for additional "reserve" cells that are then mapped in to replace failed cells. The TBW rating also goes up as the storage capacity increases, specifically because a greater number of reserve cells is available. This could be the argument for purchasing a 16GB model if you will frequently be doing video editing and have concerns about longevity. That said, swap to SSDs has been happening on Mac and PC systems for some time, and not much has been said about related failures.
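To put the TBW worry in perspective, here's a rough back-of-the-envelope sketch. Apple doesn't publish a TBW rating for these SSDs, so both numbers below are purely hypothetical, chosen just to show the shape of the calculation:

```python
def years_until_tbw(tbw_rating_tb, daily_writes_gb):
    """Rough years of life before hitting a drive's TBW endurance rating,
    assuming a constant daily write volume (swap included)."""
    days = tbw_rating_tb * 1000 / daily_writes_gb  # 1 TB ~ 1000 GB
    return days / 365

# Hypothetical: a 300 TBW drive absorbing 50 GB/day of swap + project writes
print(round(years_until_tbw(300, 50), 1))  # ~16.4 years
```

Even with aggressive assumed swap traffic, the endurance math tends to land well past the useful life of the machine - which matches the "not too much has been said about related failures" observation above.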
A 16GB model would certainly page less, as the ~10GB utilization of DaVinci Resolve with this much going on would fit entirely in physical RAM.
Activity Monitor with DaVinci Resolve using color grading and fusion
Besides my own testing, there's a brand-new YouTube video from "Learn Color Grading" at Filmsimplified.com that tests DR 17.1 with content all the way up to 12K in a 4K timeline, using Blackmagic's highest-resolution camera. Spoiler: it works on both. https://youtu.be/TrlpuvHg_Ig
So, in summary, on the 8GB versus 16GB question: I believe that for many content creators, a MacBook Air with 8GB of RAM will be enough for most normal vlog activities, at an amazingly low price. However, if you want to ensure the longevity of your purchase, 16GB will mean less swap activity on the SSD. 16GB should also be selected if you use the DaVinci Resolve 17.1 beta for very large projects. The conclusion: if you can afford the additional $200 and are willing to wait for the product to ship, get the 16GB edition of either model. For everyone else, the 8GB will save you $200 that you can put toward must-have accessories or AppleCare+.
Finally, I had mentioned some benchmarks. A number of reviews already show these, but I ran them multiple times and captured images of the results. The ones I post below are not technically averages but the results I saw most frequently across runs (if they're lower than others', that's a measure of the thermal impact in my environment).
Both scores and the version of Geekbench

Single-Core comparison

Multi-Core Comparison
Overall, the 8GB M1 MacBook Air is doing very well for me. As an Adobe Creative Cloud (Photography plan) and DaVinci Resolve 17.1 beta user with a standard 4K timeline who isn't worried about SSD longevity, I feel comfortable recommending the 8GB edition. For anyone who does worry about SSD longevity, or who must squeeze every bit of performance out of a native M1 NLE video editor, get the 16GB.
submitted by SeaRefractor to mac [link] [comments]

About to pull the trigger on Pro Tools but first need advice/opinions from strangers

Dear council of Pro Tools...
Fairly long-time Logic Pro user. I've used it since Logic Pro 8, on and off for fun with friends growing up, but I bought LPX in 2017 and have used it regularly, semi-professionally, ever since.
I see friends, and the engineers and mixers I look up to, using Pro Tools. When I watch YouTube videos or follow mixing courses, it's Pro Tools, and I spend a lot of time trying to translate the workflow I see into Logic Pro language.
I downloaded the free trial today, and I'm going to spend every one of these 30 days trying to learn something new and see how my workflow adapts. My thinking is: shall I continue to figure things out in Logic, or shall I cut out the middleman and go straight to Pro Tools, which still seems to be the industry standard in studios... although I'm sure that's a big debate in itself these days.
So my questions are:
Thank you in advance. I'm sure this sub gets this sort of question all the time, but I did do a search and didn't find answers to all my questions, hence this post. Cheers!
submitted by Plexi1820 to protools [link] [comments]

Matebook 14" 16GB + 512 GB SSD 2 weeks usage my experiences.

Hello, I recently bought the Matebook 14 and thought my experience would be helpful for those looking to buy this device. My usage scenario: I'm the guy who uses dark mode for almost everything, and I keep low blue light on all the time, except for TV shows, movies, etc. I also use the display at around 0-30% brightness. 95% of the time I use my laptop at my study desk: reading PDFs, working with programming languages, toying with the Linux terminal, producing errors and trying to fix them, etc. The usual stuff.
The keyboard offers a nice typing experience. I tried an Apple keyboard at a local store to compare and found them nearly the same, though each has its own character, so I can't directly rank it against the MacBook Air; both typing experiences are good. It has a satisfying two-level (white) backlight, with nothing to complain about. The light turns off after 15 seconds, but this can be changed in the PC Manager app that comes pre-installed with the laptop (5/10/15 seconds, 1.5 minutes, or never). The keyboard also gives priority to the media functions, meaning you can only use the F keys when the Fn key is turned on (a light comes on when it's active, just like Caps Lock). Then again, this too can be changed in the PC Manager app, so you can do it vice versa.
The touchpad is wide and comfortable to use. Sometimes I switch to a Bluetooth mouse (Huawei's), but I quickly go back to the touchpad; in my experience it's even better than a separate mouse.
The display has a very nice contrast ratio and satisfies me well for dark mode. Screen flickering and stuttering are not a problem for me. The colors and contrast remind me of my Samsung Tab S5e tablet (an OLED display without official HDR support). It comes with a Monitor Management app that reduces blue light; it's a bit better than the native Windows night mode (more reddish). I should mention, though, that the screen resolution is 2160x1440 and Windows scales it at 150%. That may seem a bit unnecessary at this resolution, but text and images are sharper compared to my 1080p 24" Asus VG249Q monitor, so I think it's better this way. I also had concerns about backlight bleeding, but after watching 1080p YouTube videos with thick black bars, I barely noticed it, or didn't notice it at all. The bleeding increases with brightness: on full blacks it bleeds horribly at max brightness, but even watching movies with black bars, you won't notice it below 50% brightness. I barely noticed it at full brightness, and it didn't bother me at all.
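For the curious, the sharpness difference versus a 24" 1080p monitor comes down to pixel density. A quick sketch (the diagonals used are nominal panel sizes, not measurements of these actual devices):

```python
# Pixel-density comparison: why the 2160x1440 14" panel at 150% scaling
# still looks sharper than a 24"-class 1080p desktop monitor.
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch along the diagonal."""
    return hypot(width_px, height_px) / diagonal_in

def effective_resolution(width_px: int, height_px: int, scale: float) -> tuple:
    """Logical desktop size Windows renders at a given scaling factor."""
    return round(width_px / scale), round(height_px / scale)

print(round(ppi(2160, 1440, 14.0), 1))        # Matebook 14 panel: ~185.4 PPI
print(round(ppi(1920, 1080, 24.0), 1))        # 24" 1080p monitor: ~91.8 PPI
print(effective_resolution(2160, 1440, 1.5))  # desktop works like 1440x960
```

Roughly twice the pixel density means text edges are drawn with twice as many pixels per inch, even though the logical desktop area is a modest 1440x960.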
The sound is very good but not great. I am very sensitive about music quality. After a Windows update, Nahimic installed itself; it's okay to have, but I don't think you would miss it. Compared with the Samsung Tab S5e, I would rate them: Samsung Tab S5e 9.7/10, Matebook 14 7.5/10. The Matebook sounds very loud with rich sound, but I don't find it better than, or even similar to, my tablet.
Gaming: I tested with only two games, with an active cooler beneath the laptop (mine doesn't blow that strongly, so I used it). Be careful, though: if the cooler blows intensely, it may interfere with the laptop's own fans, which are positioned on the bottom. I used MSI Afterburner and RivaTuner Statistics Server to monitor temperature and performance. StarCraft (the non-remastered version), representing games that run on a wooden PC: the laptop only gets a bit warm, 40-42°C. Diablo III: the heat goes up with video quality. At 1080p with max settings, fps was around 27-32 and temps around 68-70°C. Above 1080p with the same settings: 22 fps, temps around 78-80°C. At 720p with all-low settings: 38-45 fps, temps around 52-53°C, which held for the 3-5 hours I played, though fps dropped to 34-36 and stayed there. The numbers may be slightly off because of the corrupted sectors in my RAM (my brain).
The battery is very good; for a 2160x1440 resolution it has plenty of juice to get through the day. Today I used my laptop for more than 6 hours, during which I watched YouTube videos (25%), browsed the net (65%), printed some papers (5%), and downloaded some programs (5%). Downloading in particular makes the laptop about 4°C hotter, and while downloading at 3.3 MB/s the battery estimate usually drops to about half the usual runtime; in my scenario, downloading drains more battery than watching YouTube, which only raises temps 1-2°C. Also, when I attach the laptop to my 1080p monitor (Asus VG249Q), the battery estimate says it will last longer than usual (about 20% longer), probably a resolution factor. I've realized that keeping the charge between about 20-70% seems better for battery life; I charge between 30-70%. Every 6 weeks or so I will completely drain the battery and charge it to 100%. There's an option in the PC Manager app for that.
Overall temps are okay. Today I used the laptop for 6-6.5 hours (it will probably last 1.5 hours more until fully drained). In the morning its temperature was 32-34°C; right now it's 36-37°C. That's not uncomfortable, but not comfortable either. (For context, my room is at 16-19°C.) To improve this I opened Radeon Software, chose Custom in the GPU submenu, enabled Radeon Chill, and set both min and max to 60 fps (I know it's not relevant for daily use, but I have my superstitions). The heat dropped 4-5°C, at the cost of some battery runtime. So it's your choice.
The materials are top notch. The matte, aluminium-like material (I'm not sure of its exact composition) used for the exterior helps a lot in distributing heat, and the exterior also cools quickly. The fingerprint reader responds well enough, and the keyboard light is good to have.
Another edit: an audio app called Nahimic seemed to be causing battery drain and a high-CPU-usage problem. I uninstalled it, and now the temps are okay and the battery is better. Temps are around 55°C for Dota 2 and 50°C for Fallout 3, which seems decent. (It's winter here, and room temps are about 18°C.)
A couple of edits: PC Manager app, keyboard, gaming.
submitted by Tutimucizeguyem to AMDLaptops [link] [comments]

Can you guide me to the laptop I need?

Hello Thank you in advance.
I’ve got a 10-year-old Apple MacBook Air that is starting to crap the bed a bit.
I don’t play video games. At all.
My needs are simple.
  1. I can easily download pictures and videos from my iPhone.
  2. I can use iMovie to make/edit small videos for YouTube.
  3. Must be able to use Microsoft word and publisher.
  4. Watch movies from Netflix, Amazon prime, YouTube, etc.
  5. iTunes.
I realize these might be old and there might be better systems and programs. These are just the things I’ve used for a decade and am most comfortable with.
I don’t need a $10,000 laptop that Tarantino is using to make movies. I just need something to make silly ten-minute YouTube videos.
Any suggestions? My buddy suggested the new MacBook Air with M1.
submitted by FoxBeach to laptops [link] [comments]

Asus ROG Zephyrus G14 Complete Buying & Maintenance Guide | All You Need To Know

Asus ROG Zephyrus G14 Complete Buying & Maintenance Guide | All You Need To Know

Note: this post is not yet finished; some parts are still under construction. I’m new and still learning reddit text editing/formatting in markup mode, and some of the write-ups are taking a long time to finish. So in the meantime, please peruse the current contents of this guide, in the hopes that it may be of help to you. We’ll finish and keep this guide updated together as we go along. Enjoy!

Asus ROG Zephyrus G14 (GA401)

PROS

CONS

(please check if some of these are deal-breakers for you)

Asus ROG Zephyrus G14 VERSUS other laptops

Compared to other options, this is quite a step up from the TUF A15, for example. You get a better build quality and a rigid form factor. You definitely won't get the power of a thick and heavy, 15-inch, 144Hz laptop (Nitro 5, Dell G5/G7, Helios 300, etc.) of around the same price, but you will get better battery and portability. You also need to have an external camera as there is none built in.
G14 VERSUS Asus G15, Asus A15, Dell M15, Razer Stealth, etc.

VARIANTS AND CONFIGURATIONS

The G14 variants are subject to the availability that varies per region. Please do check with your local dealers and suppliers beforehand for stocks and availability.
Refer to the Asus Zephyrus G14 website for the complete tech specifications sheet of each configuration.
External
Internal
UNDER-CONSTRUCTION;CONVERTING-TO-COMPREHENSIVE-TABLE
| Variant | CPU | GPU | RAM | SSD | Display | Additional Notes |
|---------|-----|-----|-----|-----|---------|------------------|
| GA401IH | Ryzen 5 4600HS | GTX 1650 | 8GB | 512GB | 1080p/60Hz, 1080p/120Hz | Here are some of my thoughts on the G14 base model. |
| GA401II | Ryzen 7 4800HS | GTX 1650 | 8GB | 512GB | 1080p/120Hz, 1440p/60Hz | ? |
| GA401IU | Ryzen 7 4800HS | GTX 1660 Ti | 16GB | 1TB | 1080p/120Hz, 1440p/60Hz | ? |
| GA401IV | Ryzen 7 4800HS | RTX 2060 | 16GB | 1TB | 1080p/120Hz, 1440p/60Hz | ? |
| GA401IU | Ryzen 9 4900HS | GTX 1660 Ti | 16GB | 1TB | 1080p/120Hz, 1440p/60Hz | ? |
| GA401IV | Ryzen 9 4900HS | RTX 2060 | 16GB | 1TB | 1080p/120Hz, 1440p/60Hz | Tweaktown Review, Tom's Guide Review, Deccan Herald Review, CNET Review |
| GA401IVC Acronym | Ryzen 9 4900HS | RTX 2060 | 32GB | 1TB | 1440p/60Hz | ACRNM |
| GA401?? | Ryzen X 4X00HS | GTX ? | ?GB | template | template | template |

INITIAL CHECK-UP/INSPECTION

GENERAL
HARDWARE
SOFTWARE

INITIAL SETUP | WHAT TO DO AFTER BUYING

  1. Join ZephyrusG14, check pinned posts and keep yourself updated
  2. Update software/drivers
  3. Uninstall junk/bloatware
  4. Install the following applications/programs

WARNING

UPGRADES

Along with my EDC that looks great with my Eclipse Grey G14, here are some of the peripherals/accessories I personally use for both travel and workstation scenarios:

- On-The-Go | Light Travel Setup
  - Mark Ryden SQUERO 15" Waterproof Laptop Backpack
  - Baseus 65W 3-Port Mini Quick Travel Charger
  - Logitech Pebble M350 Bluetooth Mouse
  - Anker Soundcore Liberty Air 2 True-Wireless Earbuds
  - For long-distance travels: PINENG 30,000mAh PD Power Bank, TRN Balanced Armature 5 Hi-Fi In-Ear Monitors, OneOdio Studio Pro DJ Monitor Headphones
- At-Home | Workstation/Gaming Setup
  - Logitech G502 Proteus Spectrum
  - Mechanical gaming keyboard
  - Logitech G35 7.1 Surround Gaming Headset
  - DualShock 4 Controller
  - Cooling Pad

ADDITIONAL RESOURCES

Good Reads
- Asus ROG Zephyrus G14 GA401 Official Website
- "G14 REVIEW: AMD HAS REWRITTEN THE RULES; Asus and AMD have put Intel on notice" by The Verge
- "All About AMD: Welcome 'Renoir' (Ryzen 4000 Zen 2 Mobile Processors)" by PCMAG
- "Ryzen 9 4900HS debuts on the G14" by notebookcheck.net
- "Asus Zephyrus G14 revisited" by ultrabookreview.com
YouTube Videos
- Official G14 Reveal Video
- Reviews by Jarrod's Tech, Dave Lee, Matthew Moniz

About me: I'm u/Darvelus, owner of a G14 base model (Ryzen 5/GTX 1650) since its release. The above guide is based on my experience with the machine, and of course, scouring the subreddit and the internet to know everything I can about my first big tech purchase. I’m very satisfied with the G14 as my daily driver. I sincerely hope this guide has helped you in any way.

Note: Please help me keep this guide up-to-date by sending your comments/suggestions/corrections down below! I'd love to keep this updated so that others can have an easier experience with their G14 journey. Thank you!

submitted by Darvelus to ZephyrusG14 [link] [comments]

[Launch] SCHENKER VISION 15 with Intel for 2021

[Launch] SCHENKER VISION 15 with Intel for 2021
Dear Community,
today we announced SCHENKER VISION 15, a brand-new laptop that is the second project in our collaboration with Intel's SPG division.
A new vision

Press Links:


Reviews:

Notebookcheck
Golem
to be continued...

I already took some teaser pictures out here in Taipei, Taiwan
Mainboard + Thermals with Core i7-1165G7
All real-life pictures:

Product Highlights

  • Based on Intel NUC M15: the new industry-standard reference implementation of the Tiger Lake and Project Athena platform
  • Intel Core i7-1165G7 with 4 Cores and 8 Threads in 28W and a top-notch cooling system
  • Over 90% sRGB Coverage, over 400 nits Brightness + Multi-Touch
  • 180° opening angle and one-hand opening
  • High-quality membrane keyboard
  • Full metal unibody with anodised aluminium, i.e. no plastics, no paint, no magnesium
  • Extremely quiet even under full load
  • Extremely cool in the keyboard area under full load
  • Performance Profiles without 3rd party app, fully realised with the Windows battery slider
  • Super-efficient LPDDR4X 4266MHz RAM, PCIe Gen4 x4 SSD Support with Samsung 980 Pro
  • Stable DPC Latency
  • 2x Thunderbolt 4 and Intel Xe Graphics thanks to Tiger Lake
  • Thunderbolt 4 and USB-C Charging on both sides of the chassis
  • Available in 25 keyboard layouts on Bestware.com
  • Excellent laptop speakers out-of-the-box without 3rd party apps
  • Long cable USB-C Charger already included in the package
  • Long-term support thanks to Intel partnership
  • Expected to be fully compatible with Thunderbolt docking stations and eGPUs
  • Smart features such as Voice Assistant even when closed, Wake on Approach, Walk Away Lock, No Lock on Presence

Benchmarks

A full review embargo will be lifted on December 4, 2020. But we can already share some of our internal testing with the i7-1165G7 in our samples:
  • Cinebench R20
    • Single: 590
    • Multi: 2433
  • Cinebench R23
    • Single: 1537
    • Multi: 5990
  • 3dMark TimeSpy: 1617 Graphics Score
  • Prime95 Small FFT at 'Better Performance' profile:
    • 77°C CPU Temperature, sustained
    • CPU Clock: 3.1GHz, sustained
    • Power Consumption (wall socket): ~40W
    • 36.5 dB(A) fan noise in low frequency ranges
    • Palm Rest: 28°C
    • Keyboard: 31°C
    • Bottom Case: 44°C
  • Battery Life:
    • Charge from 20% to 90% in 90 Minutes
    • Up to 14 hours Wi-Fi web-browsing at 150 nits with a full charge
If you have questions for other specific benchmarks, please let us know in the comments below.

Our role in the 'WITH INTEL' partnership

We have already demonstrated with XMG FUSION 15 that we can be an equal partner to a giant such as Intel. Intel's reference design for the QC71 gaming laptop was sold worldwide through various brands, but no other partner brought as much support and after-sales firmware fine-tuning to the table as XMG.
Over the lifetime of XMG FUSION 15, we were always first to release countless BIOS updates, including the major feature upgrade in June 2020, which delivered highly sought-after premium updates to all existing customers based on the feedback we received from our community over the months. XMG FUSION 15 remains one of our gaming and content creation bestsellers, and we plan to continue supporting it well into 2021 because its combination of form factor, battery life and performance is still absolutely unparalleled in this industry.
With SCHENKER VISION 15, we plan to repeat this success. We have had countless meetings with Intel since the launch of FUSION 15 to help shape their roadmap, deliver customer feedback and fine-tune the hardware and software solutions of the next generation. We started testing samples of SCHENKER VISION 15 months ago and have been instrumental in helping Intel test their product outside the lab in a real-life environment, with diverse usage scenarios and a keen eye on fine-tuning and the quality of the user experience. This product has already gone through numerous firmware updates in our hands, and we look forward to fine-tuning it further ahead of the first consumer-ready shipment in January 2021.

Current List of Feature Requests

We already have a list of talking points with Intel to make sure that this product will go the last mile and reach 100% perfection. For the sake of tracking our progress with Intel, we will share some of the items here.
[Last Update: November 19, 2020]
| Item | Status |
|------|--------|
| Include Fn+Ctrl = 'Context Menu' Key | Confirmed, will be implemented |
| Tool to create custom user BIOS boot logo | Mostly confirmed |
| Improvements on Fan Control at low temperatures | tba |
| Modify or Disable Keyboard Backlight Sleep Timer | tba |
| Ability to disable Fn Numpad in BIOS | tba |
| Ability to disable Touchpad Toggle (top-left) | tba |
Some of these requests mirror the things we have done on XMG FUSION 15 over time. For VISION 15, we hope to be able to deliver them faster.
If you have a feature request, no matter how unique or obvious it is, please let us know in the comments below.

Pre-Order, SKUs and Shipping Schedule

We will open pre-orders on December 4, 2020. The first shipments should become available in the middle of January 2021 according to Intel's current build-plan. We will go for i7-1165G7 and 16GB only - there won't be any Core i5 or 8GB configuration from us.
We committed to all Silver, and all with a touch display. A 'Midnight Black' edition might be planned for later in Q1 2021.

FAQ - Frequently Asked Questions


Q: How does Intel Tiger Lake keep up with AMD's and Apple's latest CPU offerings?
A: Not too bad, I would say. The i7-1165G7 in VISION 15 beats AMD's 4000U series in Single Core performance and reaches ~75% of the result of the AMD Ryzen 7 4800U in Cinebench R20 Multi, despite having only half as many cores.
In Cinebench R23 Multi 10min Loop, the i7-1165G7 reaches ~77% of the result of Apple's M1 in the actively cooled MacBook Pro M1 - although it remains to be seen how these two competitors fare in the skin temperature vs. fan noise department. With a 15.6" housing and the very efficient cooling solution, we expect VISION 15 to be ahead of the 13" MacBook Pro in that regard.
But a CPU is not only about raw multi-core performance - it is also about the platform it is enabling. There is much to be said about the list of little-known features that Intel supports in Tiger Lake, but the key difference makers are PCI-Express 4.0 for outstanding SSD speed and Thunderbolt 4 for unparalleled connectivity. AMD still has no answer for Thunderbolt (and no USB 4.0 implementation either) and Apple's USB 4.0 implementation on the M1 silicon is reported to lack support for eGPUs.
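The percentage claims above can be reproduced from the Cinebench scores listed earlier in this post. The competitor scores below are ballpark figures assumed for illustration (typical published results), not official numbers:

```python
# Sanity-check the "~75% of a Ryzen 7 4800U in R20 Multi" and "~77% of an
# M1 in R23 Multi" claims. The i7-1165G7 scores come from the benchmark
# list in this post; the reference scores are assumed ballpark values.

def relative(score: float, reference: float) -> float:
    """Score as a percentage of a reference score."""
    return 100.0 * score / reference

R20_MULTI_1165G7 = 2433   # from this post
R23_MULTI_1165G7 = 5990   # from this post
R20_MULTI_4800U = 3240    # assumed typical Ryzen 7 4800U result
R23_MULTI_M1 = 7700       # assumed typical MacBook Pro M1 result

print(f"vs 4800U (R20 Multi): {relative(R20_MULTI_1165G7, R20_MULTI_4800U):.0f}%")
print(f"vs M1 (R23 Multi):    {relative(R23_MULTI_1165G7, R23_MULTI_M1):.0f}%")
```

With those assumed references, the ratios land at roughly 75% and 78%, consistent with the claims above; shift the reference scores a few percent and the conclusion doesn't change.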
Other notable features in Tiger Lake compared to Intel's predecessors include:
  • Support for 4 displays in total (Intel HD Graphics only supported 3)
  • Native HDMI 2.0 and DisplayPort 1.4a
  • QuickSync with AV1 hardware-decode up to 8K HDR
  • Intel Gaussian & Neural Accelerator (Intel GNA) for AI acceleration
By the way, if you think Thunderbolt 4 is just a re-brand of Thunderbolt 3 - check out this video of TechQuickie by LinusTechTips.

Q: How does the cooling and audio system of VISION 15 work exactly?
A: It's pretty simple. As you can see in the picture above, it has a dual-fan cooling solution with one heatpipe for each fan. The base unit of the laptop has a mesh above the keyboard, which serves a double function as a kind of speaker mesh and air inlet. Despite the speakers being on the outer bottom edges of the chassis, the unibody is built in a way that transports the sound very well to that mesh, so the sound feels like it's coming from the screen. The result is very crisp audio, without having to fiddle with any 3rd-party audio enhancement apps.

Q: How does the system behave while on-battery?
A: It is part of Intel's Project Evo design guidelines that a system must not throttle too much when running on battery. In VISION 15 this is implemented perfectly: neither CPU nor iGPU performs below 95% of the 'plugged-in' performance on a full battery. This number might go down slightly as your battery approaches lower levels, but you'll be able to work on VISION 15 at full power for a pretty long time overall. Your system will be fully responsive, no matter if plugged in or on battery.

Q: Does VISION 15 support display output with VESA Adaptive Sync?
A: After already having pioneered NVIDIA G-SYNC in XMG PRO and ULTRA series in recent years, supporting the more open Adaptive Sync standard in as many models as possible is one of our design goals for 2021.
Intel Tiger Lake is the first mainstream CPU platform in which Intel’s iGPU is supposed to support Adaptive Sync according to Intel’s marketing claims.
We tested it on our upcoming XMG CORE 14 with i7-1165G7 and GTX 1650. VRR output worked fine* via Thunderbolt 4 (aka USB-C/DP) with external monitors, as long as those are marketed either with “AMD FreeSync” or as “G-SYNC Compatible”. Essentially, those are monitors that implement the VESA Adaptive Sync industry standard. However, monitors that implement the proprietary method of NVIDIA G-SYNC (with the built-in G-SYNC module) were not able to support VRR from the iGPU in our tests.
* Curiously, with current Windows and driver, VRR only worked for content that was rendered on the NVIDIA dGPU even though it is indeed displayed through the iGPU framebuffer.

This chart is just FYI, based on XMG CORE 14. Please note that VISION 15 does not have a dGPU and is not planned to be upgraded with a dGPU either.
Now, VISION 15 does not have an NVIDIA dGPU so we need to get it to work on iGPU instead.
According to our contacts at Intel, the lack of VRR on the iGPU is due to a missing Windows update. According to Intel, everything already works fine on the “Windows 10 Iron” Build 21H1 19624.200504 (or later), which is currently only available to members of the Windows 10 Insider program. We will try to find out when this patch is supposed to hit the mainstream Windows Update cycles.
However, we don’t assume that proprietary NVIDIA G-SYNC panels will be able to be supported – same like AMD desktop graphics cards can’t support AMD FreeSync on those proprietary NVIDIA G-SYNC panels.
Meanwhile, VRR is called ‘Adaptive Sync’ in the Intel Graphics Command Center and is enabled by default.

Q: Will it be possible to apply Undervolting (Voltage Offset) on the CPU or iGPU?
A: We currently see no avenue to apply undervolting on any of the Tiger Lake systems we are launching. We have discussed the matter with the author of a popular free tool for Undervolting and tried a few unreleased Beta-versions and we are pretty confident in saying that Voltage Offset was still possible in Ice Lake but is currently hard-disabled in hardware for Tiger Lake. We have discussed this with our contacts in Intel and we'll make sure to send any feedback and concerns upstream to Intel.
Apart from Plundervolt, which is discussed in this thread, another reason to disable Voltage Offset might be the relocation of the voltage regulators (CPU FIVR, PCH FIVR) in the new architecture. We have reason to believe that this has strongly reduced the gains you can make with a voltage offset. In other words: the system might already be working at peak efficiency by default. This might warrant further discussion in the future, perhaps when Intel launches their Tiger Lake "H" series designs (still under NDA). If you have any unique insight on this topic, feel free to share your information in the comments below or via PM.
What we can confirm is that the lowest Idle power consumption of TGL-UP3 has been greatly reduced compared to previous generations. It's now between 3 and 6W in all of the Tiger Lake systems we're currently launching where their Comet Lake (CML) predecessors were usually more like in the 10W ballpark.

Q: Does the Touch Screen support pen/stylus input?
A: It's a capacitive touch screen, so any capacitive pen should work. Pressure detection should be compatible with Bluetooth-connected pens that support Windows Ink, but we haven't tested this yet. Coming soon.

Q: Does the screen rotate when opened by 180°? Does it support tablet mode?
A: The image on the screen does not automatically rotate. However, you can enable convenient keyboards shortcuts (Ctrl+Alt+Arrow Keys) in Intel Graphics Command Center (screenshot).
Tablet Mode can be manually activated in Windows settings. This modifies task-switching behavior and adds an icon for the touch keyboard to the systray. However, even in Tablet Mode the screen will not automatically rotate. Due to the fact that this laptop only has a 180° hinge (not 360°), it is not meant to be used as a hand-held tablet or convertible.

Q: Any plans for a 4K resolution screen?
A: No plans right now. SCHENKER VISION 15 is fixed to one very specific Full-HD touch display with very high brightness, low weight and low power consumption. There will be only that one panel - so there isn't going to be a panel lottery like with other brands. We do understand the demand for higher resolutions, and we are seeking to be part of the push for high-DPI panels in appropriate form factors in the future. When we picked the new family name for this product, we intended VISION to be not only our most forward-looking series in the SCHENKER portfolio but also to include products with stunning visuals. I can already say that 2021 is going to be an exciting year on that front.

Q: How repair-friendly is VISION 15?
A: VISION 15 has seven (7) Torx T6 screws in the bottom cover. The choice for T6 is final as it provides more torque, leading to stronger build-quality without risking wear and tear (i.e. stripped screwheads). We are currently debating whether we should include a free Torx T6 screwdriver with every sold VISION 15. But you can also get them in any PC-centric screwdriver set.
Once you remove the 7 screws, the bottom lid opens very easily. You have access to the Battery, Wi-Fi module and the PCI-Express SSD. At that point you can also clean the fans and heatsinks with canned air.
Removing the mainboard will be very easy as well, requiring you to remove the SSD, Wi-Fi module, 5 ribbon cables and 7 additional screws. The mainboard comes out with the fans and heatsinks still attached as one solid unit.
The LPDDR4x memory is famously soldered on-board. That's why we opt to not offer any configurations below 16GB at launch. LPDDR4x is highly efficient and only available in BGA form. The efficiency helps with thermals and battery life, so it's actually a worthwhile trade-off.
Our cooperation with Intel on XMG FUSION 15 has convinced us that Intel is extremely efficient in their internal RMA handling. We will be able to buy spare parts directly from Intel's factory, but every barebone (minus SSD) will be able to be replaced in Intel's logistics hubs in Europe with zero turnover (Advanced Warranty Replacement, AWR) and very little communication overhead. Having such a strong partner in the back makes it even easier for us to offer our 48h warranty service to end-customers.

Q: How is the keyboard layout going to look like? What function keys do you have?
A: An overview of the German keyboard can be found here. We will offer 25 different keyboard layouts at launch - a list can be found here.
VISION 15 continues our recent tradition of offering FnLock - so you can choose between having Fn functions or F keys as the primary input for the top row. Intel also implemented PageUp/Down/Home/End functions on the cursor keys, which is something we have done on many of our recent models as well.
Full list:
| Key | Fn Function |
|-----|-------------|
| Esc | FnLock |
| F1 | Standby |
| F2 | Audio Mute |
| F3 | Audio Down |
| F4 | Audio Up |
| F5 | Microphone Mute |
| F6 | - |
| F7 | Touchpad on/off |
| F8 | Keyboard Backlight |
| F9 | LCD Brightness down |
| F10 | LCD Brightness up |
| F11 | External Screen |
| F12 | Flight Mode |
| Arrow ↑ | Page Up |
| Arrow ↓ | Page Down |
| Arrow ← | Home |
| Arrow → | End |
One item is still missing on the pictures: Fn+Ctrl will be 'Context Menu' key. This should come in one of the next firmware updates and the icon will be printed on the keyboard if we get the firmware early enough.

We are looking forward to your feedback!
// Tom
submitted by XMG_gg to XMG_gg [link] [comments]

MSI Prestige 15 A10 review after 1 year of use

MSI Prestige 15 A10 review after 1 year of use
Thumbnail
Thumbnail

Intro

This is my experience with the laptop after 1 year; in this thread I will share all of my thoughts and tell you how the laptop has held up in almost every aspect. A year ago I wrote a "1 week later review," and I constantly edited it over the past year whenever I found something new, better or worse. In this re-review I will refer to last year's post for a lot of things; here I'll mostly explain how the laptop held up and what I did to work around some (stupid) design decisions MSI made. To avoid any confusion, I have the base $1,399 config: i7-10710U, 1080p display, 16GB (2x8) RAM in dual channel, GTX 1650 Max-Q 35W, 512GB NVMe PCIe Gen3 x2 SSD.

General findings, sum-up

Kind of a tl;dr
  • The laptop has actually held up very well. There isn't any major drawback, just the wear you'd expect after a year of near-daily use ranging from ~1 to ~10 hours.
  • There isn't any exterior scuff, mark, or scratch, except on the parts of the hinges that hit the table whenever I set the laptop down.
  • Surprisingly, there are minimal oil marks on the keyboard or touchpad; it surprised me how well they held up, considering that I don't use a mouse or an external keyboard. There are also no oil marks from my palms on the palm-rest area around the trackpad; it's like day one.
  • There's 11.5% battery wear reported by HWiNFO; more on that in the longer explanation below.
  • Performance is still the same as on day one, and CPU and GPU temps are still great (there's something odd I noticed recently; more on that below).
  • There's easily visible ghosting on the screen that I initially didn't notice.
  • I also have the notorious trackpad issue, but it never affects me, since I always use the laptop on a flat surface or on my lap supported by both legs.
  • Do I still recommend it? Yes, but only if you find it on sale; otherwise it is very hard to recommend now.

Longer, more detailed review

Exterior concerns

In terms of how I take care of my laptop, I am not so strict as to think my whole life depends on a piece of tech, but I generally like to take care of my devices, especially this laptop because I use it almost every day. I try to clean it at least once every 2 weeks: I clean the dust off the display (without pushing it to the sides where it gets into the frame), and I use standard 70% isopropyl alcohol to clean all the aluminum (and plastic) parts of the laptop; so far I haven't had any issues with it. When I don't have alcohol, I use those display cleaning sprays for TVs that you can find in your local market (they come with a microfiber cloth, which I use to avoid scratching the laptop). Perhaps this info might seem obvious, but if you hesitate to use 70% alcohol on your aluminum laptop, don't - it won't cause any problems.
I also take a hair dryer, set it to the cold option and blast it at full speed under the laptop over the air-intake vents, specifically over the fans, to try to blow out any stuck debris. I still haven't taken the laptop apart to clean the cooling fins (the ones connected to the heatpipes, above the fans) in the exhaust area, though I suspect some dust has started to build up in there. While we're on the topic of debris, there are some small dust particles stuck under the keyboard. I can't remove them, but since this isn't an Apple "butterfly" keyboard, there is nothing to worry about. The keyboard still works fine.
Still on the exterior: these parts on both hinges have been scratched a bit, and as a result they look like the top paint has worn off. This happened because when I set the laptop on a table, I drop the back part first, then the front. You will never actually notice the marks unless you're looking for them. By the way, the hinges themselves have actually held up great. To be honest, I am still worried the also-notorious hinge issue will appear on my unit, but so far so good. The laptop still opens easily (and kind of satisfyingly, a bit weird to say) with one hand without holding the base, and the screen can be closed to a bit less than 30 degrees before the display closes itself from its own weight. That's pretty good, actually.
As for the construction of the laptop, just like in my original review: it is better than a plastic laptop but nothing special. In fact, the build quality is just OK. When I hold the laptop with one hand around the back, over the exhaust area, with my fingers over the intake vents (exactly like this), I can easily feel the thin metal bending and touching the top RAM stick (fortunately, it doesn't push on the stick). That is very uncomfortable and concerning. Also, if you're experiencing keyboard deck flex (when pushing), keep in mind that the 3 rubber feet in the middle of the bottom panel might be too short (that's my case) or the 3 back rubber feet might be too tall. I tried shaving down the 3 back rubber feet (near the air exhaust), but the other 3 middle feet still don't touch the table, specifically the 2 rounded side ones, which are too short. I could keep going by slicing a couple of layers off those 3 taller feet, but I don't really mind them now. Also, I haven't seen this "issue" reported by others, but my unit creaks a bit when I lift it up or open the lid. The creaks come from the improperly seated bottom metal panel and the plastic air-exhaust area. The reason the display flexes more than usual (compared to, e.g., an XPS 15 with a glass display) is that the front of the Prestige 15's display is plastic, its frame is plastic, and so the display as a whole is not very rigid. Not to mention that the aluminum display cover is just a sheet, not CNC-milled.
While typing (a bit more intensely), and sometimes when tapping on the trackpad, I hear and feel the bottom panel rattle a bit. While this is probably an issue only with my unit, and it can probably be fixed by taking the bottom panel off and re-seating it, this issue is related to the infamous MSI trackpad problem. Since MSI uses the same battery units across a lot of their laptops (this 82Wh battery specifically), those batteries have extra screw holes so that they can fit different MSI laptops with different bodies. If you search this sub for "trackpad" and sort by "new", you'll find a lot of posts regarding this issue, specifically for the P15. As you can see here, there's an extra plastic part interfering with that shiny circular part. As a result, when holding the laptop on only one side, because the body isn't very rigid, it will start to slowly bend while the battery tries to stay in place. This results in the battery literally pushing the trackpad up, so that you can't physically click the trackpad. The issue can also be reproduced when you use the laptop on your lap (in most cases). And because that plastic part interferes, there is another consequence: the bottom panel is not fully seated. I addressed this in my original review, but I'll include some pictures here again. You can see that the panel at the front is not fully clipped in by ~1mm. As you can see in this thread, if you cut that plastic part and make some room for the shiny silver circular part, the problem goes away. Since I still haven't taken the laptop apart, I haven't tried this, and I actually won't, since for me the problem rarely appears - only when the laptop sits on uneven tables. On my lap, it rarely blocks the physical click.
Something also important to mention: because the USB-C ports aren't seated completely into the body (a design choice), all USB-C cables will stick out a bit and sag, potentially damaging the ports. To avoid this, specifically for the charger, I keep one of the charger's cable ties positioned like this. That way, sagging is avoided. I have been using it like that since day 1, with no issues so far.
Edit: my unit has a tiny bit of coil whine, which is audible only when you get close to the keyboard (~10cm) and both fans are off. With either fan spinning, I can't hear it.

Trackpad / Touchpad, Keyboard performance

Since we're on the touchpad: as you can see, after 1 year of use there aren't any very noticeable oil marks, if any at all. On my unit, as the picture shows, the trackpad isn't properly seated in its place. The left part sits ever so slightly deeper in the body than the right part. When clicking on the left side, the click feels and sounds "heavier" and a bit noisier, compared to the click on the right side, which is more reassuring, a bit quieter and more "stable". Don't worry, this is just my unit, and it really doesn't annoy me. Anyway, the trackpad performs great: it is very smooth, very accurate/precise, it registers even the slightest finger movements, and it works very well overall. I compared it side by side with a colleague's MacBook Pro 13 2016, and the trackpads felt almost identical - both very smooth and a bit hard to tell apart. Accuracy is also VERY similar. You might think the MBP's trackpad is more accurate, but consider this: on my 1080p 15.6" display, the pixels are bigger (and there are fewer of them), so when moving the cursor by 1 pixel it looks like you need to move your finger a lot to get to the next pixel, compared to the MBP 13's 13.3" 2560x1600 display, where the higher PPI makes cursor movement look smoother and more accurate - but in reality it isn't. If someone with the UHD-display Prestige 15 compared trackpad smoothness side by side even with a MacBook Pro 16, the Prestige 15 would look smoother because of its higher-PPI display. By the way, the fingerprint scanner sometimes gets in the way. It unlocks very fast, but I guess I didn't enroll my finger correctly initially, because I have to place my finger on it at a very specific angle, otherwise it won't register. It registers maybe 5/10 times.
The keyboard is actually nice to type on, at least for me. There are almost no oil marks at all (sorry, my phone's camera is not good and the lights keep reflecting off the keys, but take my word for it). The feedback is nothing special but enough to feel. The key size and spacing are ideal for me. The keys aren't very stable: if you push the very edge of a key, it can travel quite far, almost to the bottom, without clicking, but fortunately at that point it registers. The backlight is great. It isn't very even, but keep in mind that the keys are intentionally transparent on the sides so that they illuminate the rest of the keyboard; as a result, I rarely use level 2 or 3 brightness. What I still dislike is some of the layout. I am used to it now, but some keys are still badly placed. The Delete key's position is fine for me; I got used to it very quickly and use it very often. In any case, keyboards are entirely a matter of personal preference. Maybe you won't like it, but I do.

Display

As I said in the tl;dr above, there's ghosting. I initially didn't notice it since I come from another display which also happens to have a lot of ghosting, but I've watched a lot of monitor/display reviews and seen how panels with low, almost nonexistent ghosting behave - and this 1080p display really does ghost. When moving the cursor fast, or scrolling a bit faster than usual, you can notice it. Its response times are slow (one of the reasons for the ghosting), but that's not an issue in day-to-day use unless you're gaming. Something very important (and annoying) is color banding, which is very noticeable in almost every image with deep gradients. Black-to-gray images (and other colors) are displayed with a lot of banding. For other info about the display, check my original review.

Battery Life, battery unit's degradation

As stated above, the battery accumulated 11.5% wear over the course of 1 year. For the past ~10 months, I've used the 80% charge limiter pretty much all the time, with a few (maybe 2-4) exceptions where I charged to 100%. Now, I am fairly sure the wear didn't start until I began using the 80% limiter. I suspect I should have used the 60% limiter all along, but because I sometimes needed the laptop on battery for a couple of hours, I used the 80% one instead. For those ~10 months, I haven't really taken the laptop out with me except for class, the library, etc. Even during quarantine, I used the laptop on the charger, as expected. I am totally fine with this 11.5% wear, since all li-ion batteries degrade over time - some sooner, some later, some quicker, some slower. I am a bit surprised because just ~3-4 months ago I had only ~7% wear, so I guess from now on I have to use the 60% limiter whenever I can.
Anyway, even with that 11.5% wear, the laptop still charges to 100% (but only to around 70Wh instead of 82Wh), and I am still getting awesome battery life. A couple of days ago I ran another battery test, from 100% to 0% until it turned off, and got around 11 hours. I could have gotten close to 12 hours, but that annoying sppsvc.exe service kept running in the background every couple of minutes, making the CPU draw ~15W for 1 second at a time, which shortened the runtime a bit. Before continuing with how I ran the battery test: if you have that problem with high CPU usage from the sppsvc.exe service, just download Process Hacker, search for the process and suspend it. Make the app open on startup and you'll never have to worry about that unexpected CPU usage again.
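As a sanity check on those numbers: battery wear is just the gap between design capacity and measured full-charge capacity (the two values HWiNFO reports). A minimal sketch, assuming the 82Wh design capacity from this review and a hypothetical ~72.5Wh full-charge reading:

```python
def wear_percent(design_wh: float, full_charge_wh: float) -> float:
    """Battery wear as the percentage of design capacity that has been lost."""
    return (design_wh - full_charge_wh) / design_wh * 100

# 82Wh design capacity; ~72.5Wh full-charge capacity gives the reported ~11.5% wear
print(round(wear_percent(82.0, 72.5), 1))  # → 11.6
```

The same formula run against the ~70Wh figure above would report roughly 14-15% wear, so HWiNFO's 11.5% suggests the real full-charge capacity sits a bit above 72Wh.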
How I tested the battery life this time: it was very light usage. I set the brightness to 150 nits (50%, assuming my unit reaches 300 nits at 100%) because I am in my apartment and the room isn't super bright during the day; the matte display does a great job with glare and reflections, so I don't need anything over 150 nits in this case. Keyboard backlight off, Wi-Fi on, no battery savers of any kind - the Windows battery slider on the taskbar set to best performance, and in Creator Center I always use High Performance. Keep in mind, though, that on battery MSI automatically lowers the CPU's PL1 and PL2 (I guess) to 15W, which is essentially "Balanced Mode" - a kind of battery saver. I addressed this in my original review. I have also been using a -110mV undervolt (on core and cache, as it should be) for the past year. Apps minimized in the background (which don't really need CPU resources - around 1% in total, only RAM) are: TaskbarX (formerly FalconX), ThrottleStop, Mailspring, MSI Creator Center, MSI True Color, and the Nvidia control panel. By the way, from last year I remember that undervolting didn't help much with battery life, at least in this light-usage test.
What apps I was using: Brave browser with 5 active tabs, 3 of which were just static landing pages I wasn't using; the other 2 tabs were 2 books, one of which I was reading (the ~700-page Head First JavaScript PDF). I also (rarely) browsed other sites whenever I was stuck on a coding problem. I also had WebStorm open, which doesn't really use any resources - it is very light overall. During the test, the total power draw of the laptop sat at around 5-6W; the CPU alone hovered between 0.5W and 2W, mostly at 0.9W, which is amazing. This is without battery savers, only the power limits lowered by MSI to 15W (you can't override this without causing issues). Whenever I tested some code, a new page loaded (or the same page refreshed) in under a second, which makes the CPU's power draw spike a bit. Besides coding and reading a PDF, I watched some YouTube videos at 720p and 1080p for ~30 minutes in total. Now, with 40% and then ~20% left, I noticed some longer stretches of higher CPU usage that drained the battery faster. It was that wretched sppsvc.exe process, plus Windows doing, well, typical Windows stuff - partially updating in the background, or I guess just checking for updates, I forget. Without those 2 issues, I would easily have gotten 12 hours of use. Check that picture above again: I hibernated 3 times, one of which was a ~11-hour hibernation that drained 2% in the process. Overall, battery life is still great. Besides this manual calibration, I also ran the built-in MSI calibration tool for the first time, and as you can see from the same picture (near the bottom), it didn't help much. Keep in mind this is moderately light usage, but without any battery saver.
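Those runtime numbers line up with simple arithmetic: runtime is just usable capacity divided by average power draw. A rough sketch, using the ~70Wh worn capacity and the 5-6W average draw measured above (a real battery's effective capacity also shrinks slightly with discharge rate, so treat this as an upper bound):

```python
def runtime_hours(capacity_wh: float, avg_draw_w: float) -> float:
    """Estimated battery runtime at a constant average power draw."""
    return capacity_wh / avg_draw_w

# ~70Wh usable capacity at 6W and 5W average system draw
print(round(runtime_hours(70.0, 6.0), 1))  # → 11.7
print(round(runtime_hours(70.0, 5.0), 1))  # → 14.0
```

The measured ~11 hours sits right in that 11.7-14 hour band once the sppsvc.exe spikes are accounted for.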
While we're on the battery section: I still use hibernation instead of sleep. The laptop uses S3 sleep, but it drains power like the Modern Standby sleep in XPSes. Also, even when the charger is connected, hibernation, sleep, and even shutdown drain power from the battery instead of the charger. Hibernation drains around 3% per 24h. Check my original review for other info.
One more VERY interesting thing I found out. I compared the battery units of the GS66 (the 99.99Wh battery) and the Prestige 15 (82Wh battery). I am 99% sure that the 99.99Wh battery would not only fit but even work properly inside the Prestige 15. Perhaps I'm wrong, but check that picture: the specs are the same apart from the capacity (which is, of course, the point). I can only imagine what the Prestige 15's battery life would be with a ~22% larger battery like that 99.99Wh one. I asked MSI Support and they told me that they provide users only with the batteries their laptops originally came with. In my case, if I ever request a new battery, they would give me only a new 82Wh one; they said they haven't tested the 99.99Wh battery in the Prestige 15 yet. I am thinking of a workaround - perhaps I can get the original (not 3rd-party) 99.99Wh battery directly from MSI somehow.

Performance of the CPU and SSD, fan noise, fan behavior improvements.

First, I have never used the GTX 1650 Max-Q for gaming or work, only for stress/stability tests. If there were a config with only the iGPU, I would have gotten that instead, since I have no use for the dedicated GPU at the moment. I can confirm that the GPU never throttles, on charger or on battery - it always draws 35W when used at 100%. Also, to keep this review short: there still hasn't been any BIOS or EC update for this Prestige 15 A10 in almost a year, so no improvements whatsoever. Not even an option to prioritize the CPU's power draw on battery (switching to 65W/45W) over the GPU's (35W, or perhaps lowering it to 15W or so) - not even one option. Nothing at all. The battery can output around 65W (or even 60W) in total, which is why one of the components has to get less power; in this case MSI chose the CPU. MSI's customer support is awful - it straight up sucks. Sorry to anyone who has had good experiences, but their online chat is not good. I have to wait 1 to 3 days for a single reply.
Anyway, I addressed the next issues in my original review too, so I'll keep it short. All you have to do is use the May update of Creator Center (the newer Microsoft Store version doesn't re-apply the fan-curve settings when the laptop wakes from sleep/hibernation; perhaps that has been fixed by now), switch to High Performance, switch to Advanced Fans, and apply these fan settings for the CPU fan and the GPU fan. Don't worry about the very low values - MSI will override them when a particular CPU/GPU temperature is reached. What you should concentrate on is the first dot(s) from the left in each tab. All you have to do is edit the default fan curve of the GPU fan and set the first dot to 0%. This means the GPU fan (which makes more noise) stays off until the CPU reaches 65°C (or 66, not sure). Now, even though the CPU fan spins a lot (2500rpm at 30°C, ~3200rpm at 65°C), it makes less noise because it is smaller. Because the CPU doesn't pull more than 5W during idle and light tasks (depending on what you do), and those 5W don't generate much heat, the cooling solution handles it with ease, and the temps stay the same as if the GPU fan were spinning too. You can also set the CPU fan's first dot (from the left) to 0% to turn it off, but its trigger point is lower than the GPU fan's: the CPU fan turns on when the CPU reaches ~51°C and immediately spins at 2900rpm or so. With both fans off during idle and light tasks, the heat from those 5W just circulates inside the heatpipes, the CPU temps rise, and they tend to settle at around 50°C. This makes the CPU fan turn off and on every 10 seconds or so, which is VERY annoying. Since it produces less noise, I'd recommend leaving the CPU fan on (apply the fan-curve settings like here) and you'll never be annoyed by the fans, even though the smaller CPU fan now spins all the time.
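The fan curve being edited here is just a piecewise-linear mapping from temperature to fan duty cycle: between any two dots, the controller interpolates. A minimal sketch of how such a curve is evaluated - the curve points below are hypothetical illustrations, not MSI's actual firmware values:

```python
def fan_duty(temp_c: float, curve: list[tuple[float, float]]) -> float:
    """Linearly interpolate fan duty (%) from (temp, duty) points sorted by temp."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # clamp above the last point

# Hypothetical GPU-fan curve: off until 65°C, ramping to full speed at 95°C
gpu_curve = [(40, 0), (65, 0), (80, 40), (95, 100)]
print(fan_duty(30, gpu_curve))    # → 0
print(fan_duty(72.5, gpu_curve))  # → 20.0
```

Setting the first dot(s) to 0%, as described above, just flattens the low-temperature end of this mapping so the fan stays off until the curve's first non-zero point.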
In my quiet room, I can faintly hear the fan from ~60cm away, but it doesn't annoy me at all; I can easily ignore it. With this tweak you'll (in most cases) never have to switch performance profiles again, because on the charger you want full performance anyway, and on battery the laptop automatically switches to "Balanced mode" even while telling you you're on "High Performance". So you're effectively on a battery saver whenever you're on battery regardless. The laptop still stays pretty cool to the touch; even under light tasks only the top part of the exhaust area gets slightly warm. Under load, the metal around and above the top number row gets uncomfortably warm.
As for CPU performance, it is still the same, with or without the undervolt. Undervolting helps more under higher CPU loads, but for that, check my original review. In CB R20 it still performs great: the first run usually scores over 2900 points, the second around 2800, and it settles at mid-2700s for every run after that. It pulls around 48W to keep 3.9GHz on all 6 cores with that undervolt, then settles at around 42W indefinitely, at around 3.7GHz or so. Single-core results are weird, since I can't figure out whether MSI set those limits or Intel (probably Intel). For only 1-2 seconds it holds 4.7GHz on 1 core at around 19W, then it hovers between 4.0GHz at ~13W and 4.6GHz at ~18W. It sometimes scores around 480, sometimes 450. During the single-core CB R20 test, the fan speeds fortunately stay at around 3200rpm and 2900rpm, for temps of around 65°C and 88°C (because of the low fan speed). The CPU temps are still identical to 1 year ago, but there's something I discovered yesterday. The temps I'm getting are pretty much the same as what Jarrod from Jarrod's Tech on YouTube got in his MSI Prestige 15 review(s), but only 1 core gets hotter than the others and reaches high temps faster. As you can see here, after a couple of CB R20 runs the temps stabilize at mid-80s, but core 2 alone stays at ~95°C. Now, because all the software that monitors temps and sets fan speeds works off the hottest core (not the average of all cores), it reports that all cores are at 95°C, which isn't true. Because of that, every tool displays a CPU package temp of 95°C. So, as soon as that one core passes ~90°C, the fans quickly ramp from ~5K rpm to 6K rpm - just because of that core, which shouldn't be the case.
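That reporting behavior is easy to see in code: the "package temp" these monitoring tools display is the maximum over the per-core readings, so a single hot core dominates even when the average is fine. A toy illustration with made-up readings matching the situation described above:

```python
# Hypothetical per-core temperatures (°C): core 2 runs hot, the rest sit in the mid-80s
core_temps = [84, 86, 95, 85, 83, 86]

reported = max(core_temps)                   # what monitoring tools (and fan logic) use
average = sum(core_temps) / len(core_temps)  # what the die is doing overall

print(reported)           # → 95
print(round(average, 1))  # → 86.5
```

Since the fan controller keys off that single 95°C reading, one badly-pasted core is enough to trigger the 6K rpm ramp.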
From what I remember from videos by der8auer, Steve from Gamers Nexus, and Linus, I suspect that either the heatsink isn't mounted correctly or the thermal paste isn't spread evenly. That is usually the case when one core runs hotter than the rest. I have yet to open the laptop, but I am fairly sure a repaste (and heatsink re-seat) will fix this. Otherwise, the performance is still great, considering this aging Intel architecture. By the way, there's still battery drain when using both the CPU and GPU at 100%. The total power draw is capped at ~87W and it still drains the battery, similar to how the XPS 15s used to behave and how the MacBook Pro 16 (and 15s) does. Not good, but I don't care about the dGPU, so this doesn't affect me. It is worth mentioning though.
As for SSD performance, I intentionally wanted to talk about this since I was somewhat disappointed. MSI confirmed that both M.2 2280 slots are NVMe PCIe Gen3 x2 (one of which also supports SATA3). This is not good, since the Gen3 x2 SSD installed performs only at (year) 2015 speeds (perhaps even earlier). I understand that MSI cut a lot of stuff to bring the price down, but in 2019 this was totally unacceptable (imo). It has to do with how the PCIe lanes are configured inside the i7-10710U: even though it has 16 lanes, they're allocated differently than the 16 lanes of, e.g., an i7-9750H. If MSI had gone with a single M.2 2280 NVMe PCIe Gen3 x4 slot instead of two Gen3 x2 slots, that would have been better (at least for me). Sure, you can get another Gen3 x2 drive and put them in RAID 0 to double the speed, but that is not convenient. There are moments in my day-to-day tasks that would benefit from a Gen3 x4 drive, since I can sometimes tell when the SSD is the bottleneck.
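To put the x2-vs-x4 difference in numbers: a PCIe 3.0 lane runs at 8 GT/s with 128b/130b encoding, which works out to roughly 985 MB/s of usable bandwidth per lane. A quick sketch of the theoretical ceilings (real drives land somewhat below these due to protocol overhead):

```python
def pcie3_bandwidth_mbs(lanes: int) -> float:
    """Theoretical PCIe 3.0 bandwidth in MB/s: 8 GT/s per lane, 128b/130b encoding."""
    gt_per_s = 8e9                        # 8 GT/s per lane
    payload_bits = gt_per_s * 128 / 130   # usable bits after encoding overhead
    return lanes * payload_bits / 8 / 1e6 # bits → bytes → MB/s

print(round(pcie3_bandwidth_mbs(2)))  # → 1969  (Gen3 x2, the Prestige 15's slots)
print(round(pcie3_bandwidth_mbs(4)))  # → 3938  (Gen3 x4)
```

So the slot wiring alone caps sequential reads at about half of what the same-era Gen3 x4 laptops could reach, which matches the "2015 speeds" complaint.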
There has been only 1 instance (~7 months ago) where everything just slowed down - Wi-Fi and Bluetooth weren't working, etc. I went to register.msi.com to report it, but a window appeared listing the most common issues on MSI laptops, and this was one of them. I followed the steps and everything went back to working properly. The only BSOD (besides ones from undervolting) appeared last year in Android Studio's emulator, because of an Intel HAXM update. That was fixed in the next update, and I never had any other BSODs, system slowdowns, etc. That issue wasn't the laptop's fault anyway; besides those cases, it is very stable.
Edit: In the fan-curve editor, I set all points to 0% on the CPU tab except the last one, which is set to 10%, and the same for the GPU tab. This makes the CPU fan stop spinning unless the CPU has reached 95°C. With this, the GPU fan again turns on at around 65°C CPU temp, but it stays at ~2900rpm until the CPU reaches 95°C and stays there for around 10 seconds. This means that during a short burst of high CPU usage, only the GPU fan turns on, and it won't start ramping up for around 10-15 seconds, so it stays at a constant speed. After ~15 seconds of continuous high CPU use, both fans start spinning, and within 5-7 seconds they reach 6K rpm. During all this, the CPU's power draw is pretty much identical to before. This varies with ambient temperature, table surface, thermal paste, heatsink seating, etc. For me, aside from the high minimum fan speeds, this is the optimal fan behavior. Sure, the body temps during idle rise a bit, and so do the idle CPU temps (with a couple of background apps and a lot of tabs, the CPU is currently at 49-52°C, drawing ~2.3W). The fans are off at the moment - completely silent. At this point, I just wish the fans' minimum speed were around 1300rpm and the GPU fan's turn-on point were something like 80°C.

Miscellaneous

I highly encourage you to read the comment section of the Prestige 15 A10 review by Notebookcheck. Although that review is a bit misleading and their scoring system is poor, the comment section provides good insight into the Prestige 15's audio-latency issues and the display-panel lottery. Some comments are cringeworthy, some are trustworthy. Just keep in mind that every single laptop in existence has its own issues and hardware defects.

Conclusion, some talk about the Prestige 15 A10 and A11 (2020), Summit E15 and B15.

The Prestige 15 is still unique; there isn't anything quite like it. Sure, there's the Asus ZenBook 15, but that one has a numpad, it doesn't come with the 6-core 10710U, its battery is smaller, etc. There's also a Lenovo IdeaPad (if I'm not mistaken) that is similar, but it has a numpad too. I don't want a laptop that makes my arms shift to the left when typing - the Prestige 15's almost-centered keyboard was one of the purchasing factors. There's also the new Prestige 15 A11 and the new Summit line. To keep this thread from getting any longer, I'll just link a comment I wrote ~2 months ago about those new laptops. The new Prestige 15 is a downgrade in multicore CPU performance, but it has better single-core performance and better-configured PCIe lanes; read that comment for the full picture. Do I still recommend the Prestige 15 A10? I'm not saying this just because it's my laptop, but I actually recommend it over the new A11 - ONLY if you can get it for around $1000 (this config in particular), and ONLY if you need a laptop with specs and looks like these; otherwise it would be better to skip MSI completely. The other config, with the UHD display, 32GB RAM and a 1TB SSD, isn't worth more than $1300 in my opinion, mostly because of its 1-year age and the existence of not only new Intel chips but also AMD APUs, which are amazing. Sure, Ryzen 4000 is overall better, but the new Tiger Lake CPUs have better single-core performance at the same power consumption. Most day-to-day tasks are single-threaded, but I'm not trying to justify Intel laptop prices - at the moment AMD is better. Keep in mind that not everyone cares primarily about their laptop's performance; it isn't even my top priority. Mine are: first, battery life and build quality together, then the display (from now on, nothing under a 16:10 ratio), then performance, etc.
Keep in mind that wherever you buy the laptop from, make sure there's a return policy and that you can easily return it, just in case the unit you get turns out to be defective.
Again, just like last time, if you have any questions, feel free to ask. Some of them may already be answered in my original review, but feel free to ask anything about the Prestige 15 A10.
submitted by robin_from_the_hood to MSILaptops [link] [comments]

2012 MacBook Air Not Booting

Hi there! I have a 2011 MacBook Air, model number A1369, and it is having some issues. This is probably going to be a long post, so be warned.
So... I recently bought a 2011 MacBook Air from a friend of mine for $50. It was that cheap because it was essentially freezing up and then dying even when it was on the charger, so I don't think it is a battery issue. The first thing I did was boot the Mac into Safe Mode, and it ran fine. Next, I went into the recovery boot menu to erase the hard drive, so that I could reinstall a fresh copy of Mac OS and delete all the files and such; I did this hoping that the installed OS was just broken in one way or another. After deleting everything, the Mac defaulted to opening the Mac OS X Lion Base System menu. From there I tried Internet Recovery, but that didn't work, for reasons I now understand. After that, I loaded Mac OS X Lion onto a USB drive and used TransMac to format it correctly. I followed this video exactly: https://www.youtube.com/watch?v=cyYV0appyDE and it seemed to be working. The version of Mac OS I was trying to install was installing, and everything went smoothly for around 5 minutes. After that, my screen froze on the OS install bar, and the whole computer was frozen. That is about all I can remember right now, but I am essentially stuck with a completely wiped 2011 MacBook Air that I want any form of Mac OS on. I'm thinking there might be something wrong with the physical hard drive, but let me know what you think. If anyone could help, that would be absolutely lovely. Thanks for reading!
submitted by yayitsbenji to mac [link] [comments]

youtube video downloader macbook air video

15 best YouTube downloaders for macOS Big Sur, Catalina, Mojave, High Sierra, Sierra, and El Capitan that can download YouTube videos on a MacBook Pro/Air in 2021. A YouTube video downloader is a must-have tool for many video lovers, and today the best free one for Mac users will be introduced - check it out now! As a powerful free video downloader, this program supports downloading videos in whatever format or resolution, such as MP4, WebM, FLV, and even 4K HD video. That last part really saves a lot of time and work. It's nice that you can also select other file types, such as .wmv and .avi, with full control over the quality, of course. To download a video, copy the URL from YouTube (of the video or the channel) and click Paste in the program. As long as you load videos in the built-in browser, this Mac video downloader will take care of the rest. 3. Video Tips for Mac. Most video addicts like to enjoy abundant videos from video-sharing sites including YouTube, Dailymotion, Vube, BBC, ESPN, Facebook, Vimeo, Hulu and other similar sites. Mac Video Downloader is a shareware web video application. It's a full-featured tool that can create, convert, and download web videos, but unlike freeware tools that only work with a few sites... New Articles: How to Repair Corrupted & Broken JPEG Photos with Free Tools; 6 Best File Recovery Software Free Downloads in 2021; 13 Best ISO Mounter Free Software for Virtual CD Drives in 2021. Step 1: Free download the MacBook Air video downloader (turn to the Windows version if you are a Windows user). Then click the "YouTube" button, and copy and paste the YouTube URL into the input box. Or click the "paste & analyze" button to automatically detect the YouTube videos on the open web page.
The free video downloader is 100% safe and free for Mac users to download videos in 8K/4K, 1080p/720p HD at record speed and save video clips, playlists, channels, music, movies, TV shows, gameplays, cartoons, etc from Facebook, Vimeo, Dailymotion, and 300+ other sites.
