
MFC's "higher" video quality

Lag aside, one of the bigger issues that seems to be cropping up on occasion is the video failing to completely load/render, resulting in...well, pretty much this.

[Attached image: the grey, half-rendered video frame]

It tends to go away after a few seconds, during which time the audio continues normally.

We call it Greyg.
FUCK YOU, GREYG.
 
I'm a little surprised at the lack of attention this subject has gotten!

Probably because there is no indication on the homepage as to who is using it. They used to have the "HD" next to the names of models using the 640x480 setting, but they dropped that a while back. So, honestly, unless you happen to click on a cam using it, you won't even know it's there.

Compressing video is hard work for a CPU, and a higher resolution takes much more out of it. (Just ask any model who regularly edits high-quality 1080p videos and has had to sit around waiting for the h.264 encoding to happen.) Many cheap laptops have CPUs that just aren't up to the task. I think a lot of people are going to run into trouble with this.

Yes, most, if not almost all, laptops and all-in-ones (like iMacs) are not designed to encode video at high bit rates. I used to do 1080p 60fps videos on my iMac, and it was painfully slow to export. Moving to my new desktop with far superior specs has changed that; I can now export a 1080p 60fps h.264 video in just slightly longer than the actual run time of the video.

Thing is, most cam models I see use laptops, not desktops, both for mobility (shower shows, etc.) and because they cost less than a decently powered desktop. Still, now that the GTX 1080 is out, the older 980 and 970 cards have come down in price. If only for video editing, models who do everything on their laptops or all-in-ones should consider buying a mid-range desktop with a GTX 970, especially if they use Adobe Premiere (it takes advantage of the NVIDIA CUDA cores). A decent i5, 8GB RAM, and a GTX 970 shouldn't come to more than $1000 with a case, power supply, and monitor. They don't need to spend $3k+ (after tax) like I did on my system.
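(For the technically inclined: if you have an NVIDIA card, ffmpeg can hand H.264 encoding off to the GPU's NVENC encoder instead of grinding it out on the CPU. A minimal sketch of that idea in Python, assuming ffmpeg is installed and was built with NVENC support; the file names and bitrate are placeholders, not a recommendation:)

```python
import shutil
import subprocess

def has_nvenc() -> bool:
    """Return True if the local ffmpeg build lists the NVENC H.264 encoder."""
    if shutil.which("ffmpeg") is None:
        return False
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-encoders"],
        capture_output=True, text=True, check=True,
    ).stdout
    return "h264_nvenc" in out

def encode(src: str, dst: str, bitrate: str = "12M") -> None:
    """Re-encode src to H.264 on the GPU if NVENC is available, else with x264 on the CPU."""
    codec = "h264_nvenc" if has_nvenc() else "libx264"
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", codec, "-b:v", bitrate, "-c:a", "copy", dst],
        check=True,
    )

# Example (hypothetical file names):
# encode("raw_1080p60.mov", "upload_ready.mp4")
```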

I dunno if I'd like the look of widescreen vid on MFC.

Widescreen looks fine on SM and CB. The main thing would be changing the chat box size. Besides, even on SM and CB there are 4:3 cams, and the cam window auto-adjusts so there are no black bars, just a native 4:3 cam window. I posted screenshots of 16:9 SM and 4:3 CB cams in another thread.

The big thing would probably be MFC needing to add some sort of optimizer for models: something where they test their connection and the site tells them the max video resolution they can use, based on internet speed and maybe some sort of hardware test. 3Mb/s+ upstream is becoming more common in the US, but it obviously still isn't as common as it could be. Models will also have to remember that upstream is almost never constant, and that if they aren't the only ones in the home using the connection, that will affect them as well.
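To make that concrete, the logic wouldn't have to be fancy. A rough sketch of what such an optimizer could do, with completely made-up bitrate thresholds (MFC would have to pick their own numbers based on their encoder settings):

```python
# Hypothetical "connection optimizer": map a measured upload speed to the most
# ambitious stream setting that still leaves headroom. The thresholds below are
# guesses for illustration only, not anything MFC actually uses.
PRESETS = [
    # (min sustained upload in kbit/s, width, height, fps)
    (3500, 800, 600, 30),
    (2500, 640, 480, 30),
    (1500, 640, 480, 15),
    (0,    320, 240, 30),
]

def recommend(measured_upload_kbps: float, headroom: float = 0.7):
    """Pick a resolution/fps, keeping ~30% headroom because upstream is never
    constant (other devices in the house, Wi-Fi hiccups, ISP congestion)."""
    usable = measured_upload_kbps * headroom
    for min_kbps, width, height, fps in PRESETS:
        if usable >= min_kbps:
            return width, height, fps

print(recommend(4000))  # -> (640, 480, 30) with these made-up thresholds
```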
 
I wonder how important having a dedicated graphics card is for this? I've got a 1 GB one (I need to upgrade, >_>); I'm wondering if that's why it went smoothly for me.

Here are my comp stats:
Intel i5 dual core processor
1 GB Gigabyte NVIDIA GeForce graphics card
12 GB RAM
Logitech c920e
Hardwired, 20 up/80 down
Chrome Web Broadcaster

:confused: You have 12GB of RAM, but only a 1GB video card?

When you upgrade the video card, you may want to upgrade your CPU and motherboard too. You don't NEED an i7; a quad-core i5 would be good, though I don't know whether one will work with your current motherboard.

I'm not a serious gamer, which is usually where I hear about 60 fps, so I'm a bit ignorant on this. But TV's standard is 30 and film's is 24, so why would it be necessary to go all the way up to 60?

Smoothness of motion, mainly. If you compare, say, Destiny at 30 FPS with any other shooter running at 60 FPS or higher, you'll see the difference in how smooth it looks.

Oh gosh yes I am so happy people can actually read my text on the screen now! I struggled with that before.

ManyCam is a huge system resource hog. Have you tried using Open Broadcaster Software (OBS), instead? You can still put text on the screen with it, though it might take extra work to make it animate and such; but I think it also uses less system resources than ManyCam.
 
ManyCam is a huge system resource hog. Have you tried using Open Broadcaster Software (OBS), instead? You can still put text on the screen with it, though it might take extra work to make it animate and such; but I think it also uses less system resources than ManyCam.
That's not an option right now, though during the testing I did for them I mentioned that many girls have discussed wishing they could use it on MFC.

I have a love/hate relationship with ManyCam and may look at the other compatible options soon. As we speak, I am having issues with it not working with the most recent Windows 10 update on Chrome (grr!) so I'm using Firefox as a backup until that's fixed. But as I noted, the look of my text improved with the new video quality on MFC, so my initial complaint is moot now.
 
The main argument for those higher framerates is that they deal better with fast-paced content (sports, action sequences, etc.), with the downside being that they look/feel 'weird' due to the 'soap opera' effect: we are used to 30/24 fps, so anything faster in those contexts makes our brains unhappy (until you watch enough of it to retrain your brain).

The main argument for the higher framerates is marketing. It's a way for electronics companies to sell you that shiny new TV and make you believe your old one is obsolete.

The framerate for TVs is a technical one: 30 in the US, 25 in Europe, which is half the mains (power grid) frequency. It's easy to make electronics that halve that frequency; it's harder to handle an arbitrary one. This is of course a leftover from a time when TVs were tube-based and analog and didn't have transistors (or IC chips, for that matter).

The framerate for movies being 24 is a compromise: how to get good quality without using too much film, especially since the amount of film you can fit on a reel in a movie theatre is limited and they want to keep the cost of film and the number of reel changes down.


Dunno if that makes sense - 15fps is essentially a fast slideshow (and members already complain about laggy-looking feeds)
BS!
Your brain is perfectly happy with 15fps. The reason for the above framerates is NOT so much the amount of movement, but the light flicker that used to be part of TVs (low-grade phosphors) and film (the flicker moving from one image to the other).
If the screen stays illuminated (as they do on PC's playing video), then everything is fine and your brain fills in the gaps.
For the stuff models typically do, sitting, talking, just some 'slow' moving around, 15fps is adequate (though not ideal for every type of show).
 
Smoothness of motion, mainly. If you compare, say, Destiny at 30 FPS with any other shooter running at 60 FPS or higher, you'll see the difference in how smooth it looks.
smoothness. not really. See above :)

In a game, a higher fps has benefits, but that hasn't so much to do with smoothness as it has to do with shortening the time in the render pipeline and giving you a faster response to input.
 
I'll be honest, I stopped using it and went back to the CB trick. The grey screen mixed with the sound issues mixed with the very low fps mixed with not being in actual HD anymore made it not worth it to me. I'll probably use it on my laptop those few times I stream from it though, because the CB trick eats up my CPU. Hopefully they'll work out the kinks soon! But it is really, really nice to see most streams on MFC in a higher quality. ^_^
 
Does anyone know if the private show archives are in the higher resolution too if the model uses that? I did one private since the change and the archive was still 320x240, but I am not sure which resolution she was using.
 
BS!
Your brain is perfectly happy with 15fps.

smoothness. not really. See above :)

In a game, a higher fps has benefits, but that hasn't so much to do with smoothness as it has to do with shortening the time in the render pipeline and giving you a faster response to input.

I disagree. 15fps is noticeable and distracting. And higher frame rate has everything to do with 'smoothness.' To me 15 fps tends to give me eye strain almost to the point of getting a headache. It is especially noticeable when things are moving faster on the screen. If a model is dancing, pole dancing, hula hooping, jiggling her boobs around... those types of things would be much worse at 15 fps.

This video shows a simple side by side demonstration between 15 fps and 30 fps. The 15 is more jittery and jarring to watch.

[Embedded video: side-by-side 15 fps vs 30 fps comparison]
And this video has multiple people playing Minecraft at 10, 30, 60, and 144 fps. They judged all of them in terms of smoothness, and they all agreed the lower framerates were not something they liked. They were not perfectly happy with it. Frame rate is rather important.

I'm not sure if the embedded player here on the forum will show it at the 60 fps they recommend watching it at. If not, go to YouTube, click the settings gear, and select 60 fps to see it properly.
 
The option of using 15 fps is for when a model might not have the bandwidth to handle 640x480 @ 30 fps.
Which would you prefer: a video feed that is dropping a lot of frames at 640x480 @ 30 fps, a less bandwidth-hungry compromise at 640x480 @ 15 fps, or dropping right back to 320x240, where you can't even see the hair on the end of her wotsit?

Sure, they should not make it the default, but I see no harm in it being an OPTION.
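To put some very rough numbers on why 15 fps is a reasonable fallback: if you treat pixels-per-second as a crude stand-in for required bitrate (a simplification; real encoders are smarter than that), halving the framerate saves about as much as dropping a resolution tier, without making everything blurrier:

```python
# Back-of-envelope only: pixels pushed per second as a crude proxy for the
# bitrate a stream needs. Real H.264 encoders don't scale this linearly,
# but the ordering (and the rough ratios) hold.
def pixel_rate(width, height, fps):
    return width * height * fps

options = {
    "640x480 @ 30 fps": pixel_rate(640, 480, 30),
    "640x480 @ 15 fps": pixel_rate(640, 480, 15),
    "320x240 @ 30 fps": pixel_rate(320, 240, 30),
}

for name, rate in options.items():
    print(f"{name}: {rate / 1e6:.1f} Mpixels/s")

# 640x480 @ 30 fps: 9.2 Mpixels/s
# 640x480 @ 15 fps: 4.6 Mpixels/s  (same sharpness, roughly half the load)
# 320x240 @ 30 fps: 2.3 Mpixels/s  (a quarter of the pixels)
```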
 
I can't remember if I updated this thread when they added in the new bandwidth options. This is what we've got now:
[Attached screenshot, 2016-08-15: the current bandwidth/resolution options]
 
I disagree. 15fps is noticeable and distracting. And higher frame rate has everything to do with 'smoothness.' To me 15 fps tends to give me eye strain almost to the point of getting a headache. It is especially noticeable when things are moving faster on the screen. If a model is dancing, pole dancing, hula hooping, jiggling her boobs around... those types of things would be much worse at 15 fps.

This video shows a simple side by side demonstration between 15 fps and 30 fps. The 15 is more jittery and jarring to watch.
I didn't say there is no difference. On faster-moving video there obviously is, but then we're talking about stuff like sports and races.
In the video you linked, the 15fps 'simulated video' (which isn't quite the same as actual 15fps) looks very acceptable to me. Is the 30fps one better? Yes, of course.
The side-by-side comparison is for some reason just horrible; I'm not sure why, maybe it was just made badly. Neither the left nor the right image looks as smooth as its full-frame equivalent in the first half of the video.

15fps is not ideal, and I never claimed it was, but it is a better fallback than dropping resolution (for video).

For games, dropping quality features like anti-aliasing, using lower-quality mipmaps, or reducing shadows are 'obvious' ways to keep up the framerate dynamically.
And the 'smoothness' in games is a bit of a perception issue. If you have a lower fps, you will also tend to have a longer delay between input and seeing the reaction to that input on screen (some games can compensate to a degree). This delay between pressing a button, or moving the mouse, and seeing something happen on screen will be perceived as 'smoothness' when it really has little to do with it. We've done plenty of user tests on this: if you have a high framerate but introduce that input delay explicitly, players will complain the game isn't as smooth, even though the video quality is unchanged.
And before you say that framerate itself matters in games: of course it does, 15 is too low for any decent game. But framerates higher than the hardware refresh rate of your monitor make no noticeable difference in "smoothness", though they do help with input reaction time. Players claiming they 'see' this are just misinterpreting the shorter input delay.
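For anyone curious what that input delay actually looks like in milliseconds, the arithmetic is trivial (this is just the floor; real engines add more stages on top):

```python
# Frame time = 1000 ms / fps. Each additional buffered frame in the render
# pipeline adds roughly one more frame time of input-to-screen delay.
for fps in (15, 30, 60, 144):
    frame_ms = 1000 / fps
    print(f"{fps:>3} fps: {frame_ms:5.1f} ms per frame, "
          f"~{2 * frame_ms:5.1f} ms with one buffered frame")

#  15 fps:  66.7 ms per frame, ~133.3 ms with one buffered frame
#  30 fps:  33.3 ms per frame, ~ 66.7 ms with one buffered frame
#  60 fps:  16.7 ms per frame, ~ 33.3 ms with one buffered frame
# 144 fps:   6.9 ms per frame, ~ 13.9 ms with one buffered frame
```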
 
Until MFC looks like Hobbit in HFR...

...which was totally awesome and the only way to see them in theaters, I thought...
 
... but far from real HD (1920x1080).

Nerd post incoming....

For what it's worth, there are a lot of HD resolutions defined that people aren't aware of, all of them are technically real HD:

nHD - 640x360
qHD - 960x540
HD - 1280x720
HD+ - 1600x900
FHD - 1920x1080
QHD - 2560x1440
4K UHD - 3840x2160
5K UHD+ - 5120x2880
8K UHD - 7680x4320

MFC calling 320x240 HD (that's actually QVGA and is most commonly the resolution you'd find on flip phones) though is laughable. Whoever thought that was a good idea should not be developing websites. Using that resolution, however, is understandable: the higher the resolution, the higher the bitrate needed to prevent a loss in visual quality. Most MFC "HD" streams are 320x240 using the AVC-1 video codec (H.264) encoded at a constant 1 megabit per second and a variable frame rate, plus the Speex audio codec at about 42 kbit per second, packaged inside the Flash Video (FLV) container.
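To illustrate that resolution-vs-bitrate point with the numbers above: at a constant 1 Mbit/s, a 320x240 feed gets a fairly generous bit budget per pixel, but giving 720p the same per-pixel budget would take a much fatter pipe. Rough math only, assuming 30 fps; in practice H.264 needs considerably less than this linear scaling suggests:

```python
# Crude budget math: bits available per pixel per frame at a given bitrate.
def bits_per_pixel(bitrate_bps, width, height, fps=30):
    return bitrate_bps / (width * height * fps)

current = bits_per_pixel(1_000_000, 320, 240)   # MFC's ~1 Mbit/s QVGA feed
print(f"320x240 @ 1 Mbit/s : {current:.3f} bits/pixel/frame")

# Bitrate needed to give 1280x720 @ 30 fps the same per-pixel budget:
needed = current * 1280 * 720 * 30
print(f"1280x720 equivalent: {needed / 1e6:.1f} Mbit/s")

# 320x240 @ 1 Mbit/s : 0.434 bits/pixel/frame
# 1280x720 equivalent: 12.0 Mbit/s  (real-world 720p streams get away with far
#                                    less, since encoders scale better than linearly)
```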

AVC-1 is the most widely deployed video codec at the moment. If you have a modern satellite or cable receiver, chances are good that all of your HD channels are encoded using AVC-1, as is everything you'll find on YouTube. Your cell phone uses it as well, as do apps like SnapChat. It's also what Blu-Ray discs use. It's going to be replaced by HEVC (H.265) eventually.

MFC uses Speex because it was designed to encode the frequencies of human voice efficiently (and it's free). Even if you have a $100+ microphone, this is why the audio quality on MFC doesn't compare to other applications and sites. It's essentially a Voice over IP codec; it's meant to transmit voices, not other sounds.

The reasons for using a low video resolution and bitrate are two-fold.

1. Bandwidth. The lower the bitrate, the less money has to be spent on streaming to site viewers (some rough numbers on this below).

2. Server and model computer resources. Higher resolutions and bitrates require more time to encode, which then requires faster and more expensive servers to do transcoding in real-time. I'm assuming that whatever the video and audio is as it's streamed to MFC from the model, that MFC transcodes it into something else. If that's the case then a step up to a new resolution and bitrate could require a large expenditure on servers and bandwidth for MFC that they may not see as a good return on the investment. If that's not the case then we're talking about increasing the computer requirements for the models which is an even bigger investment as you'd risk losing models and the revenue they generate.
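On the bandwidth point specifically, the egress math adds up fast, which is presumably why they're conservative. The viewer count here is purely illustrative:

```python
# Outbound bandwidth grows linearly with bitrate x concurrent viewers.
def egress_gbps(bitrate_mbps, viewers):
    return bitrate_mbps * viewers / 1000

for bitrate in (1, 3):  # today's ~1 Mbit/s vs a hypothetical 3 Mbit/s 720p-ish feed
    print(f"{bitrate} Mbit/s x 50,000 viewers = "
          f"{egress_gbps(bitrate, 50_000):.0f} Gbit/s sustained")

# 1 Mbit/s x 50,000 viewers = 50 Gbit/s sustained
# 3 Mbit/s x 50,000 viewers = 150 Gbit/s sustained
```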

Some cam sites probably use cloud platforms like Amazon AWS and can very easily (trivially, if they designed their software well) increase the size of their server farm. If they really designed their software and business well, it could be as easy as changing a config setting in their software and then spending a few minutes on the Amazon control panel to use more servers. If they didn't, these changes could require rewriting *all* of the back-end software and then buying and installing new servers with a co-location company *by hand*.

Anyhoo, I think it's pretty sad that most MFC streams are still QVGA. There's no excuse for that in 2016. They should invest in infrastructure upgrades that at the very least allow 720p as an option for high earning models (or as a paid option for models that want to foot the bill), and ffs, support widescreen video capture. This is not especially difficult technology.
 
BS!
Your brain is perfectly happy with 15fps. The reason for the above framerates is NOT so much the amount of movement, but the light flicker that used to be part of TVs (low-grade phosphors) and film (the flicker moving from one image to the other).
If the screen stays illuminated (as they do on PC's playing video), then everything is fine and your brain fills in the gaps.
For the stuff models typically do, sitting, talking, just some 'slow' moving around, 15fps is adequate (though not ideal for every type of show).

What you're describing is the flicker fusion threshold, and it varies depending on what you're looking at and an individual's physiology. Some people have a different threshold with the same sample of light, and different samples of light can alter the threshold within people with similar vision and processing abilities.

15 FPS being adequate is a personal preference and has nothing to do with what the human brain can perceive. The human eye can process up to 1,000 changes per second but our brains without any kind of training can only perceive up to 140. With training that apparently can get as high as 220-240. For most people it's between 50 and 90 which is why video game consoles sacrifice visual quality for being able to consistently achieve a frame rate of 60 now, and 30 in previous generations. For most people anything faster than 60 won't be noticeable and as pleasant as constant 30 FPS can be, regular swings between 30 and 60 can be quite unpleasant. Thomas Edison got very, very close when he said a minimum of 46 FPS was needed. This has been understood since the 1920s.

Higher refresh rates on monitors can definitely be used for marketing purposes, but the benefits of higher rates are both real and perceivable.
 
MFC calling 320x240 HD (that's actually QVGA and is most commonly the resolution you'd find on flip phones) though is laughable.
Do they call it that, though?

On our MWB they just call it "Normal Resolution (default)".

I think at one point maybe they did have that "HD" icon on homepage previews/avatars, but I believe they've done away with that and have updated their terminology to merely say "higher" resolution on the 640x480 and 800x600 feeds. Which is true, because it is "higher" than they offered previously, lol.
 