Hey Hive Peeps
Virtual concerts are something that I think will play a big part in the future metaverse experience. A number of companies are already experimenting with the new tech that's available in order to produce immersive live music events. One of the first I experienced was on the Fortnite gaming platform by Epic Games. They hosted two events, one with Travis Scott and the other with Ariana Grande.
I think they were both put together incredibly well, so hats off to Epic Games for getting in there as early as they did with events of this scale.
One new company that has surfaced recently is Wave. They specialise in virtual concert productions and have created events with artists like Justin Bieber, John Legend and Teflon Sega.
Since then, the technology being used to produce virtual concerts has progressed quite far, and they are now really starting to become a thing. So let's talk a bit about some of the new and extremely cool tech involved in virtual performances.
Fox recently used similar motion-capture tech to create Alter Ego, an avatar-based, X-Factor-style TV show. It's a good example of the tech being used at scale within a TV production.
One of the more recent productions that demonstrates the technological progress of virtual performances is the Madison Beer Life Support concert, which was put together by Sony Music alongside the development of a virtual concert hall for future events.
The whole concert, including her 3D avatar, was created in Unreal Engine using methods that haven't really been used at this level, in this way, for this purpose before. Here is a very brief video by the team that shows a bit of how it was put together.
I think it's fair to say that a fairly big budget and some top talent were required to achieve this specific type of production, as the level of fidelity was extremely high. However, it's entirely possible to create things like this using similar technology on a fairly low budget, provided you have the skills required to do it.
Someone who I feel is an extremely good example of what one person can achieve with the right skills and the right tools on a low budget is Corey Strassburger. He is a one-man band when it comes to using some of the new motion-capture systems and software that are becoming available for consumer use. One of the systems he uses is the Xsens MVN Animate sensor suit. It still carries a fairly hefty price tag, but it's far less than pro options like the Vicon systems used in typical Hollywood productions.
The Xsens system is specifically made for applying motion to 3D characters like those created in Blender and Unreal Engine. As you can see, it's also possible to apply the motion-capture data to a character in real time, which is really cool.
It's also possible to map any of the virtual cameras inside Unreal Engine to a wireless sensor, so it can be moved and manipulated like a real-world camera in order to capture the scenes cinematically. The combination of all this tech will undoubtedly lead us into a world of digital media creation where actors and film locations are optional. It will also drastically bring down the production costs of making movies, which is something I will be covering in more detail in a later post.
Corey has been a big part of my inspiration when it came to taking the leap into learning Unreal Engine. Here are some of the videos he created that really got me into the whole idea.
As you can see, he is a real character and a very talented one at that. He has quite a few different videos on his channel which are definitely worth checking out if you fancy a good laugh and would like to take a deeper dive into how he makes all this happen.
I think that as things progress we will see all these different types of technology come together and play their parts in the overall metaverse experience. If one guy can create all this on his own in a few weeks, I think the future is going to be very interesting indeed.
This concludes my post on virtual concerts, for now. Thanks for taking the time to read my offerings on the subject.
Man, it's crazy how all this is developing so fast, and I guess it will just keep speeding up. Can't wait to see the impact it'll have on video games in terms of realistic graphics.
Yeah, for sure. Things are moving at a really crazy rate at the moment. I think within the next year or so we are going to see some really high-quality CGI used in everyday applications, gaming included. There are already a few things out there that make you stop and ask, "Is this real or CGI?" Being an absolute graphics junkie, I honestly can't wait to see what's next.
It's getting so unreal, and so quickly. A great read. It kinda scares me though, just how quickly everything is moving. I used to have my finger on the pulse when it came to tech; now it seems a lot of it is beyond my reach. But it's incredible what is being done in graphics now.
Many thanks. I know how you feel. Doing all this research was a real eye-opener. I think that COVID has accelerated a lot of things in this domain. Some of the new graphics being worked on are incredible; I covered a lot of the new things being worked on in this post. Really mind-blowing stuff.