Apple used an unreleased 8K Canon cinema camera to show off its Mac Pro, Pro Display XDR

During the launch of Apple's new hardware and software yesterday at the Worldwide Developers Conference (WWDC) 2019, an unreleased 8K Canon video camera was used to capture high-resolution footage to demonstrate the quality of the new Pro Display XDR. The camera was mounted on a robotic arm and fed the Apple display with 4:4:4 ProRes 8K raw video via an Atomos Shogun, according to tech YouTuber Jonathan Morrison, who live streamed from the event.

During his video the Apple rep refers to the camera on the robotic arm as an 'unreleased 8K Canon camera', and from the clips it is easy to see that it is designed in the style of the company's C series. Interestingly, it appears to be mounted with a Sigma 18-35mm T2 in EF mount. The relevant portion is at roughly the 10:55 mark in the video below:

Canon has featured 8K demonstration cameras and displays in the past at its Canon Expo events, but this is the first time it has allowed the technology to be shown outside of the ‘showcase’ environment, and in a body form that we would recognize.

Comments

1. This Apple vs. Windows argument in every comment thread is getting old. I have worked a lot on both Mac and Windows machines, and my experience is that my Windows machines are much more reliable. That is my personal experience and opinion. If you like Mac better, that is fine. To each their own. Trying to convert people from one "religion" to another with witty and demeaning internet comments will not work. :)

2. The monitors look sleek and modern. The Mac Pro, not so much. Is that just me?

Those chrome legs and handles against the aluminium grey cabinet are an eyesore. The cheese-grater front is also not that visually pleasing. If the case itself were matte black, the chrome would probably look more sleek, but I'm not sure that would be enough.

That is an excellent point. Matte black would have transformed the cheese-grater pattern into something futuristic and, importantly, would not directly reference (for those with visual memory) something used to grate cheese. A silver rectangle with a pattern exactly replicating the holes of a kitchen grater is a visual mistake, and something that cannot be unseen once pointed out.

Jony Ive, for all his talent, cannot curate the wider implications of his visual inventions, which is his major weakness; and Apple's curation of important design decisions is visually clueless and not worth a damn.

I think that there are some further moves here from Apple to keep NVIDIA out. This isn't that well reported, but there's a feud between Apple and NVIDIA which neither has really commented on in public. I think there have been several developments in Apple's announcements this week that reinforce their position:

1. The new Mac Pro only comes with AMD GPUs. This is as expected, as Apple stopped using NVIDIA GPUs around 2012 (the last two supported NVIDIA GPUs were the GTX 680 and Quadro K5000).

2. Apple Afterburner appears to be Apple's equivalent to NVIDIA CUDA cores.

3. The Enhanced Gatekeeper in macOS Catalina will only allow Apple-checked apps to run. Since Mojave, Apple have stopped approving NVIDIA's drivers for their latest GPUs, and this is another potential way to block NVIDIA's drivers.

...None of this will be of concern to the film studios that the new Mac Pro is aimed at. It's a story that I've followed, as I still have a 2008 Mac Pro running a patched High Sierra OS with an NVIDIA GTX 680 - it's still a solid workstation, to Apple's credit. Apple have control of their ecosystem, which has pros and cons; however, they have let down users of the previous cheese grater Mac Pro by blocking NVIDIA's latest drivers, which I think is petty.

At 8K, you're going to be more concerned with viewing angle than screen size; in other words, a headset or a theater makes more sense. Realistically though, 8K is needed to make good 4K output. Ultimately, people are going to want to record at 2-4x the resolution they intend to produce.

@Foveonite. At 100%, the text on a 32 inch 4K monitor is so small you almost need a magnifying glass to read it. At 150% it is normal and perfectly smooth, so more pixel density is not needed unless you choose to live with your face 6 inches from the monitor. At 42 inches, a 4K monitor can be sensibly viewed at 100%, but the text is a little jagged (as it always is at 100%), so more pixels would be useful at that size. 6K would be plenty, so 8K is still overkill.

I said "almost". It depends on your vision, of course, but with 20/20 vision the text is uncomfortably small at 100%, in Win 10 at least. I don't know what a Mac does, but given the "helpful" nature of macOS, it may adjust the text size automatically.

I measured the text on my screen for you. It's the same size as the text on the back of a credit card. I look at my monitor from approximately 18-24". Even without my glasses I can read it, though my prescription is not that strong.

"Can read" and "comfortable to read" are two very different things. If somebody handed you a copy of "War and Peace" in that size text, you would be complaining of eyestrain by the time you finished reading it (if you got there). The Windows default for a 32" 4K monitor is 150%, and that is comfortable. 125% is usable but tiring, and 100% will wear you out if you spend a lot of time at the computer. My complaint with scaling is that images are also scaled, so unless you are using a program like LR or Photoshop (which doesn't follow the Windows rules), all your images will be scaled up and you don't get the benefit of the high-res screen.

@Foveonite. At 150%, the text is very sharp. A 32in 8K monitor would have to be scaled to 300% to have the same text size, and that is simply a waste of pixels. At 43in, you can make a case for 5K or 6K, but 8K is still overkill for almost all uses. That is not to say 8K wouldn't look nice, but remember, you will need a 33MP 16:9 image just to fill the screen (without scaling).
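The scaling arithmetic in this thread is easy to sanity-check. A minimal sketch (hypothetical helper names) showing that an 8K panel at 300% scaling gives the same UI real estate as a 4K panel at 150%, which is the "waste of pixels" point:

```python
# Back-of-envelope check of the scaling math discussed above.
# "Effective" resolution = UI real estate left after OS scaling.

def megapixels(w, h):
    return w * h / 1e6

def effective(w, h, scale_pct):
    f = scale_pct / 100
    return (round(w / f), round(h / f))

print(round(megapixels(7680, 4320), 1))  # 33.2 -- the "33 MP 16:9 image"
print(effective(3840, 2160, 150))        # (2560, 1440)
print(effective(7680, 4320, 300))        # (2560, 1440) -- same UI size, so
                                         # 8K at 300% buys sharpness, not space
```

In other words, an 8K panel scaled to 300% renders text at the same physical size as 4K at 150%; the extra pixels go entirely into smoothing the glyphs.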

Perhaps this is not so much an unreleased camera as a custom-built model for Apple. Apple has the money to get what it wants, and it's probably not unrealistic for them to have a completely custom-built device.

I can believe that Canon has a new 8K video camera coming out relatively soon. We know that Canon aims to have 8K equipment ready for the 2020 Olympics.

I can also believe that Apple has the clout to get Canon to allow them to use an unreleased model in their demos. In fact, it's smart for Canon to do so... look how it has people talking. And shown off at an Apple event, it will probably get more pre-release attention than at most venues.

Canon has talked about 8K for a while. Several years ago here in NYC, the Canon Expo, which comes around every two years or so, had video from an 8K video camera. They used their own 4K monitors to show it. They zoomed in on the video until a subject we couldn't even see in the wide shot was taking up much of the frame.

"8K video" doesn't by itself imply a particular frame rate or bitrate. It could be closer to streaming a motion-JPEG format from the sensor. As long as it's 7680 pixels wide and sends 15 frames per second, they could get away with calling it 8K.
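The point that "8K" alone doesn't pin down a data rate can be made concrete. A rough uncompressed-rate sketch (illustrative helper names, ignoring codec and container overhead):

```python
# Uncompressed video data rate: pixels/frame x fps x bits/sample x samples/pixel.

def gbits_per_sec(w, h, fps, bits_per_sample, samples_per_pixel):
    return w * h * fps * bits_per_sample * samples_per_pixel / 1e9

# Same 7680x4320 "8K" frame, very different pipes:
print(round(gbits_per_sec(7680, 4320, 15, 10, 3), 1))  # ~14.9 Gbit/s
print(round(gbits_per_sec(7680, 4320, 60, 12, 3), 1))  # ~71.7 Gbit/s
```

A factor of nearly five between two streams that are both legitimately "8K", which is why the label alone says little about the capture hardware required.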

I keep seeing people commenting that you can build a PC to match this computer but I think that this misses the point of who this computer is intended for.

The reality is, most creatives needing a machine like this are not computer engineers or even computer enthusiasts. They are photographers, graphic designers, 3D animators, videographers, and so on. I am a couple of these, and after reading the list of components offered, it came off as another language to me.

Apple buyers don't want flexibility, they want usability through efficiency. They want the most fully integrated user experience available, they want no hassle buying. They want to create and not be bogged down by complex tools that more often than not distract from the user experience that too many options create.

Also, I doubt you can duplicate this computer and monitor plus the user experience on any DIY PC for the same price.

Well, then how do you decide on what configuration to buy? Because the lowest one proposed by Apple is a bit of a bad joke, and if you don't care what SSD, CPU and other acronyms mean, you end up with an inflexible, super expensive low/high-end mix of a computer, which will definitely bring some frustrations down the road.

Well, I imagine very few individuals are the customer base. These are meant for studios and there will be someone there who has the knowledge regarding the tech but the actual creatives won't care so much.

I also think you're overestimating the amount of knowledge one must have to pick out a computer to fit their needs.

A more relatable analogy would be someone buying a Sony A9 for wildlife photography based on the stated specs and reviews. This person doesn't need to know who makes the components or the model numbers of the sensor, etc. We trust that Sony has put together a camera that will do what it says it will do.

@thenoilif: "I keep seeing people commenting that you can build a PC to match this computer but I think that this misses the point of who this computer is intended for."

Complaints about the lack of configurability of the Mac trash can Pro are exactly why this one is upgradeable. And whilst the average creative isn't upgrading their computer no matter the system, Apple have to address several user tiers with one model. You are creating a more uniform picture of "Apple creatives" than truly exists. They range from those who pray in the direction of Cupertino or have a shrine to St. Jobs, to performance junkies. Actually, the user you describe will most often opt for the iMac instead of the Pro. There is no reason to venture towards the Pro if performance, and an understanding of it, isn't the point.

My point was simply that if one buys into a configurable anything (and the people in studios/companies that use computers bought/configured by others are not the case - they are users of the system, yes, but basically have no say in the acquisition; if the boss decides to switch to Android video editing, they either accept it or change jobs), then one has to know what one buys, or ask help from someone who knows. And that includes Apple products as well as anything else.

@thenoilif: "Well, I imagine very few individuals are the customer base. These are meant for studios and there will be someone there who has the knowledge regarding the tech but the actual creatives won't care so much."

Whilst there are studios that have separate IT and creative departments, there are many that cannot afford a complete separation of function but still need high-performance machines. And my experience with larger graphics-oriented companies is that IT often doesn't know what creatives need, and the creatives have to educate IT.

HP has a similarly configured model that's $8,000, and Dell has one for $9,000. It's interesting, but for years Apple's top machines have actually come in lower than comparable machines from others.

But Apple has some unique features, such as the accelerator board for transcoding and other high-end video use that does 6.3 billion pixels per second, which no one else has right now. That's 3 streams of 8K video, or 12 streams of 4K.
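Those quoted figures are at least internally consistent: dividing 6.3 billion pixels per second across 3 streams of 8K, or 12 streams of 4K, lands at roughly 60 fps per stream in both cases. A quick check, assuming standard UHD frame sizes (the frame rate is inferred, not stated):

```python
# Sanity-check of the "6.3 billion pixels per second" claim above.
PIXELS_8K = 7680 * 4320   # ~33.2 MP per frame
PIXELS_4K = 3840 * 2160   # ~8.3 MP per frame
THROUGHPUT = 6.3e9        # claimed pixels per second

print(round(THROUGHPUT / (PIXELS_8K * 3)))   # ~63 fps across 3 x 8K streams
print(round(THROUGHPUT / (PIXELS_4K * 12)))  # ~63 fps across 12 x 4K streams
```

Both divisions give the same ~63 fps, because 12 UHD 4K frames contain exactly the same number of pixels as 3 UHD 8K frames.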

@melgross: I'd have to see that to believe it. Specs can be deceiving if one doesn't understand them. And Apple have consistently been higher priced when comparing like for like. Performance is the ultimate comparator, and that is a little less straightforward. That said, ultimate performance will elude Apple as long as they are wed to AMD graphics, one of the more common complaints from power users.

@Thenoilif: Interesting analogy between Apple “Creatives” and Sony “Creatives”. I could construe your comment to mean that neither “Creative” can be bothered to understand their product beyond the most basic of levels. Which is fine, but also means their opinion is about as informed as an earthworm. People can call themselves “Creatives” as much as they want, but they should not be surprised if they occasionally see people making hand gestures akin to shaking dice behind their backs.

To clarify even more: if I were a pro wildlife photographer, I would look at the performance specs of the camera to determine if it was the right investment for me. I would see 20 FPS with no blackout as an appealing function in a camera. I wouldn't need to be bothered with the mechanics of how it accomplishes this, as long as it does what it says it will do. If I buy it and it doesn't do that, then I return it. If I buy it and at some point this functionality stops working, then I return it and get it repaired. I don't need to know how it works because I am not concerned with fixing it myself.

This mentality is shared by people who don’t want the overly complex nature of modern technology getting in the way of their creativity. If DPR is any gauge, you can see how technology can interfere with the creative process as people obsess more about the machine than the actual input/output.

The future is exactly the point of this machine. It really is not about serving graphic designers as much as high-intensity tasks like 3D and video. Given OS X's past in NeXTSTEP, it is also easy to see immersive environments and data mining as targets.

Looks like you need to see the design up close. I have seen some pictures that make it look like those holes are actually hollow spheres, not just some cheap mesh. And I also think this is an interesting look into the future.

At first I thought this Windows machine would work great for me. However, the first problems occurred just 3 months after I bought it, and within 6 months of purchase that whole Windows machine fell apart from some stupid Windows update. I lost 2 working days getting it restored.

You guessed it - Sold it again.

Happy to buy a new Mac again. Live your dream that Windows is better... it's just not my experience.

macOS is better than Windows in the same way that, for the average Joe, a smartphone is better than a real camera: because he doesn't know how to properly handle a real camera.

I've used Windows all my life and was forced to use an iMac for 2 years while working as a graphic designer for a company; I couldn't stand how painfully limited and uncustomizable macOS is compared to Windows. My uncle and sister use iMacs for gaming and the damn things overheat and throttle all the time. My PC, on the other hand, is rock solid: never overheated, never lost data, and when I want to upgrade something I just swap out the part instead of being forced to buy a whole new computer.

"Creative professionals need tools to get their creative product out. They don't need to tweak their computer and want to spend time on making settings that are useless to them."

Keep telling yourself that.

Funny, because aside from the initial setup, I never have to constantly tweak my Windows settings. Mind telling me what you have to constantly tweak in Windows? Because I've used Windows far longer than you and yet can't remember having to. When I used macOS I actually had to fiddle around with it whenever I wanted to customize something, way more than I have to with Windows.

Just buy a good antivirus software like ESET and be done with it.

And I've never had a problem with Windows updates; just set Windows Update to run at shutdown after you've done all your work and go to sleep.

Microsoft has deleted the “customizability” that made Windows more usable. Where is the color-scheme editor that allowed you to set up a system-wide color scheme from 1991-200x? We didn’t have to wait 30 years for a vendor to dribble out a hard-coded “dark mode” on Windows... but now we do?

The problem with Windows is Windows Update. After I installed W10 I spent an hour removing all that junk (Metro apps and more). After one of the updates, all that junk was back, so I had to do everything once more. And every now and then Internet Explorer makes a return. I really don't like that Windows does shi* behind my back.

@ewelch - you started your comment well: "Ignorance of how a system works is no critique of the system", but continued with exactly the opposite :))

The simple truth is that all things can be done on both systems, and it's fine to prefer one or the other. But "Macs are so much more powerful once you learn how." and "[...] not so easy for vice versa." is exactly the ignorance you talk about.

Opinions are like cell-phones—everyone has one ;-) Here's what award winning Director of Photography Ben Allan ACS CSI has to say: "For anyone working with massively complex projects or very high resolution this machine makes things possible that simply weren’t possible or at least practical before." https://www.newsshooter.com/2019/06/05/hands-on-with-the-28-core-mac-pro/

The new Mac Pro is an incredible machine and really a bargain for what it offers.

Bravo to Apple!

Also interesting to see the new 8K Canon in a demo at the event...Apple has enough clout to merit Canon loaning one of its unreleased units to Apple.

Also, Nikon Rumors has an interesting speculation: during the Keynote, they showed 8K footage for a documentary by Ami Vitale, who is a Nikon Ambassador. It's likely that anything done by her would be filmed on a Nikon.

So this Apple event may have inadvertently hinted at new 8K machines from Canon and Nikon!

It is so much more than anything NEC or Eizo offers for anywhere near the price. Reports are that it is clearly even better than the $43,000 monitor Apple mentions, because that one can only maintain 1,000 nits for six seconds, while the Apple display can run at 1,000 nits continuously and go to 1,600 temporarily.

Gosh, Mamiya, the majority of photographers and video content producers use Macs; I bet half of the attendees were. They sat there watching the videos, as did the rest at home. Nobody shrieked "EUREKA!!! That is 8K!!!" until DPR published that it was an unreleased Canon video camera.

Nikon should have done the same with an unreleased version of their Nikon DL advanced compact camera.

That goes to show nobody can really tell 4K from 8K video until told so. That is life. That is psychology at work.

Not sure how they actually captured it. Only one cable comes from the camera. If it is an SDI cable, you need 4 streams to capture 8K. Or was it an HDMI cable? Then you need HDMI 2.1, and the Shoguns only have HDMI 2.0.

If they shoot raw they only need one SDI cable. Someone on here just explained that to me: "While it is 12-bit, it is only one value per pixel instead of three. So in uncompressed form, raw is actually smaller than a demosaiced picture, allowing for those data rates."
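That quoted explanation checks out arithmetically. Assuming a 7680×4320 sensor and 12-bit samples (both assumptions, not confirmed specs), one raw value per photosite at 24 fps stays under the roughly 12 Gbit/s a single 12G-SDI link carries, while demosaiced 4:4:4 at the same settings would need several links. A back-of-envelope sketch, ignoring blanking and transport overhead:

```python
# Raw (one 12-bit sample per photosite) vs demosaiced 4:4:4 (three samples
# per pixel) at the same frame size and rate -- hypothetical numbers.

def gbits_per_sec(w, h, fps, bits, samples_per_pixel):
    return w * h * fps * bits * samples_per_pixel / 1e9

raw_24p = gbits_per_sec(7680, 4320, 24, 12, 1)  # one sample per pixel
rgb_24p = gbits_per_sec(7680, 4320, 24, 12, 3)  # three samples per pixel

print(round(raw_24p, 1))  # ~9.6 Gbit/s -- fits a single 12G-SDI link
print(round(rgb_24p, 1))  # ~28.7 Gbit/s -- would need multiple links
```

At 30 fps the raw stream lands right around the 12 Gbit/s mark, so the frame rate still matters for whether one cable suffices.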

Sony user: Canon sucks, nothing but dated sensors and recycled bodies, behind on technology.

Sony user: That being said, I do adapt Canon lenses. Also, I miss their ergonomics, menu system, swivel touch screens, WiFi/Bluetooth connectivity, GPS, flash system, color science and service - but outside of that they are just so behind.

Canon user: Just smiles.

At least we found out who is still buying Canon cameras. Joking aside, Canon loves this: while Panasonic is trying to put this in their mirrorless cameras, Canon is dead set on keeping an artificially high-priced lineup of cinema cameras.

“You’re an idiot” actually means “I can’t find a good argument to counter yours, so I’ll just insult you instead”. Often when people don’t share the same view point they turn around and say that the other person is an idiot.

They need to upgrade the current 2013 Mac Pro to 2019 specs and keep the $2,999 base price, since a $5,999 base is way too high for consumers. I had the 2008 Mac Pro cheese grater and prefer the 2013 trash can, since you can use an ultra-fast external Thunderbolt drive for 4K and 8K video: https://eshop.macsales.com/shop/owc-envoy-pro-ex-ve/thunderbolt-3

I don't want a monitor with a computer built in. I use dual 30" 4K monitors with my 6-core Mac Pro. There are many consumers using the current Mac Pro, and I prefer it to a tower case like my old 2008 Mac Pro.

The Mac Mini uses mobile Mac/PC CPUs and graphics chips that are much slower than the CPU and graphics in the Mac Pro. I have an older Mac Mini and am satisfied with the performance of my current Mac Pro. Apple needs something between the Mac Mini and the new Mac Pro, which starts at $6,000 and goes to $40,000 loaded. The current 2013 Mac Pro with 2019 specs at a base price of $3K would do the trick.

No, you are mistaken. The 2013 Mac Pro only has USB 3.1 Gen 1 and Thunderbolt 2, half the speed of Thunderbolt 3. Apple quit making them because the chassis couldn't handle the heat faster GPUs would generate and couldn't be made adequately powerful.

The new Mac Mini can drive a monster GPU via Thunderbolt 3. The new Mac Pro is underpriced according to real pros out there who are the target market for the new machine. Compared to the competition that is.

It's fine that you enjoy your cute computer, but making up silly stories is silly.

By the time that computer can actually do any work it will be spread all over your desk with attachments like graphics cards and HDDs.

The competition is the same as always: much more compatibility, better cooling, infinitely more versatility, and longevity, for half the money.

The only reason Mac is getting better on price than in the old days is because they use Intel and AMD just like the competition. Rebranded monitors, proprietary ports and an Apple logo keep the price falsely inflated.

Again, it's fine if you prefer it, but don't make up stuff that is easy to debunk.

They don't do anything else? You really don't know anything about Apple. Ever heard of iTunes? FileMaker Pro (a database)? They're putting a billion into movies and TV for their new service coming this fall. They make phones. They make watches. Holy smokes, you need to get your head out of the '90s.

Allegedly 8K is an option on the next C300, which is some way (C500, C700) from being their high end, so it may be somewhat affordable, in Cinema EOS terms anyway - well, unless you buy the optional MIA robot... Also, apparently, 4:4:4 ProRes 8K raw recording.

Horshack, all the video experts out there are gobsmacked at the specifications of these monitors. Time will prove whether the numbers are accurate. But the people who have actually used them already (professionals Apple let use them beforehand) all say they are without peer for anything anywhere close to the price, even compared to the $43,000 Sony reference monitor, which can only hold its peak light level for 6 seconds (Apple's can go indefinitely at the same level and go up to 600 nits higher for short periods). Chances are they are going to be bought in massive quantities.

@ewelch, The Apple monitor has eye-catching marketing specs but falls short where it matters most, specifically using a multi-segmented LED backlight instead of OLED. That's what gives the monitor its impressive brightness but at the expense of uniform brightness and contrast, which are essential elements on a reference monitor.

Horshack, that is specifically the monitor Apple is comparing theirs to. It can only sustain 1,000 nits for six seconds. It is so expensive it can only be the last monitor in the workflow, whereas Apple's monitors are inexpensive enough that everyone can have a reference monitor, rather than just the last person.

@ewelch, And as I said, the Apple monitor uses multi-segmented LED backlights to achieve that brightness, at the expense of the brightness and contrast uniformity offered by OLED and required by pros in a real reference monitor. The Apple specs are impressive, but they are consumer specs and not suitable for critical reference monitor work.

Horshack, such claims are dated and no longer apply, if Apple and the pros who have used it know what they're talking about. OLED is not necessarily superior; that's just how it's been done to date. And apparently it's no longer state of the art, unless by that you mean the most expensive. OLED has its own share of issues that can't be ignored.

@ewelch, Claims are dated? How so? I'm not talking about OLED in general hype terms but specifically about how the technology makes it the current preferred choice for reference monitor usage. If you put an OLED next to a multi-segmented LED-backlit monitor you'll see an obvious difference in uniformity. There's no way to escape that: the segments of the LED backlight, no matter how numerous, coarsely illuminate the pixels they are designed to turn on.

Horshack, yes, you are talking about OLED in general. Have you even bothered to read what Apple has said about how they designed their monitor? Because it's not your typical LED-backlit monitor. There are 10,000 blue LEDs, each individually controlled and passed through some kind of new filter to make the light perfectly neutral in color and even in distribution at such high levels. These are not your granny's LEDs.

@ewelch, I've read it. It's still a multi-segmented LED backlight monitor. It doesn't matter how many segments it has - it's still far coarser than having each pixel self-illuminated as in OLED. All those extra segments do is reduce the coarseness. Again, consumer technology vs professional.

@ewelch, My evidence is basic logic. A multi-segmented LCD display illuminates and blocks groups of pixels, the number of which is based on how many backlight segments the display has. The more segments, the more selective the illumination of pixels can be, which improves uniformity and local contrast. Compare that to OLED, where each pixel is self-illuminating, which means it has perfect selectivity. A multi-segment display can never match OLED because no matter how many segments its backlight has it will never come close to OLED's individual pixel selectivity, let alone match it.
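The scale of this uniformity argument can be put in numbers using the thread's own LED count alongside the Pro Display XDR's stated 6016×3384 resolution: even with roughly 10,000 individually controlled backlight LEDs, each LED serves on the order of two thousand pixels, versus exactly one emitter per pixel for OLED. A quick sketch:

```python
# Ratio of pixels to backlight elements, using the figures quoted above.
panel_pixels = 6016 * 3384  # Pro Display XDR resolution
leds = 10_000               # LED count quoted in the thread

print(panel_pixels)                # 20,358,144 pixels (~20.4 MP)
print(round(panel_pixels / leds))  # ~2,036 pixels lit per LED
# A self-emissive OLED panel has a ratio of 1: each pixel is its own emitter.
```

So even granting the thread's most generous LED count, the backlight selectivity is still three orders of magnitude coarser than per-pixel emission, which is the crux of the disagreement above.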

Your opening sentence in that comment discredits what you say. Logic can only go so far without hard evidence to back it up. You have not even touched Apple's monitor, so you can't objectively say anything about it. Thus your own statement discredits itself.

@horshack - his observation is correct. You seem more concerned with hating on Apple for the sake of it than with acknowledging that Apple may have obliterated the reference monitor market, if this does indeed work as advertised.

I think the desktop is about to make a bit of a comeback. I just put another 32GB of RAM (64GB in total) in my Linux box to make processing drone footage easier, and suddenly 1.5TB seems almost reasonable. Managing this stuff on a laptop or in the cloud has, at least for me, always been a little stupid; now it's very stupid.

First, I use MATLAB for everything from echo soundings to video footage. I do a lot of work interchanging time and space, and although with careful memory management I can get away with less than 1GB of RAM, it's much easier with more: the more time and space I can load into RAM at once, the broader the spectrum of time and space I can analyze. Your average photographer could learn how a computer works and use less than 1GB. Today I am looking at drone footage a private company collected for me (1.2TB of 30MB GeoTIFFs) of a frozen lake. I want to be able to look at all scales of variance, from the whole lake (several-km trends in ice color) down to cm-scale holes: basically, I want to be able to look at the whole 1.2TB composite with the ease of looking at a single 30MB image.
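One standard way to browse a multi-terabyte mosaic "with the ease of a single image" is a multi-resolution pyramid, where each level is a 2x block average of the one below, so a viewer only loads the level matching the current zoom. A minimal sketch of the idea (not the commenter's actual workflow) in Python:

```python
import numpy as np

def build_pyramid(tile, levels=4):
    """Successively 2x block-average a tile so coarse overviews of a huge
    mosaic can be browsed without touching full-resolution data."""
    pyramid = [tile]
    for _ in range(levels):
        t = pyramid[-1]
        h, w = t.shape[0] // 2 * 2, t.shape[1] // 2 * 2  # trim to even dims
        t = t[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        pyramid.append(t)
    return pyramid

tile = np.arange(64 * 64, dtype=float).reshape(64, 64)
pyr = build_pyramid(tile)
print([p.shape for p in pyr])  # [(64, 64), (32, 32), (16, 16), (8, 8), (4, 4)]
```

Applied per GeoTIFF and stitched at each level, this lets the whole-lake view come from the coarsest level while cm-scale inspection pulls only the few full-resolution tiles under the cursor.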

If you go to Google Scholar and search for "Tedford Holmboe", the first paper in the list has a good example of what I do in the lab; specifically, in Figure 7 I separate rightward- and leftward-propagating internal waves using a time-space 2D Fourier filter (some details on the imaging are in the third to fifth paragraphs of section 3). Lately I've been working on a pit lake; that's where the drone footage comes from. There are a couple of papers in the Google Scholar list under my name, but nothing yet with the drone footage.

Or buy a $100 VESA stand and attach the Apple monitor to it. Not as flexible for movement and putting it in the perfect position and angle, but then you've saved $900 on a monitor that beats $50,000 monitors. Yeah, that makes sense.
