I want to buy this TV used. I'm seeing a bunch of Samsung, LG, RCA, Sony, etc. on Facebook Marketplace. What a cesspool Facebook has become, right?
Any suggestions on the best brand or even model for that kind of thing? I don't really want to battle with a bunch of shit that tries to coerce me to install another app from a streaming provider slash gambling entrypoint.
I imagine mostly it will just need HDMI to work, and all the TVs will support that. But, I thought maybe there would be a fun brand that offers interesting other options.
Text is very readable, refresh rate is good. It uses the same panels as the fancier G series in the larger sizes. One can root the firmware to make it go brighter. (Though this screen works well in medium or dimly lit rooms, it does not shine in very bright rooms.)
Plenty of YouTube videos sing the C series' praises as a TV/monitor.[1] LG webOS is also trivial/friendly to root in developer mode, and network control of the TV is a nice-to-have.
Would avoid Samsung. I love the matte finish on the Frame and the design of the Serif, but the OS is frustrating/impractical to root.
[1] https://youtu.be/Qtve0u3GJ9Y
Do not get a Samsung...
I have not looked into hacking the firmware to change this behavior but if there's a "custom rom" out there that can do this, I'd appreciate a link!
One of the best things about LG in general is their serial port. It's hit/miss which of their models will have it exposed on the back, but if yours does, the protocol is well documented and is very simple.
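For a flavor of it, the whole thing is short ASCII commands over 9600 8N1. A minimal sketch with pyserial, assuming the documented "ka" (power) and "kf" (volume) command pairs and set ID 00; the exact command tables vary by model, so check your set's manual:

```python
# Hedged sketch of LG's RS-232 control protocol: "[cmd] [set id] [data]\r",
# answered with something like "a 01 OK01x". Assumes 9600 8N1 and set ID 00
# (broadcast); the command pairs come from LG's published protocol docs.
import serial  # pyserial

def lg_command(port: str, cmd: str, data: int, set_id: int = 0) -> str:
    with serial.Serial(port, 9600, timeout=2) as s:
        s.write(f"{cmd} {set_id:02x} {data:02x}\r".encode("ascii"))
        return s.read_until(b"x").decode("ascii")  # acks end in 'x'

# lg_command("/dev/ttyUSB0", "ka", 1)   # power on
# lg_command("/dev/ttyUSB0", "ka", 0)   # standby
# lg_command("/dev/ttyUSB0", "kf", 25)  # volume 25
```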
My LG TV (used as a monitor) is really chatty on the network and so I keep it disconnected so I don't get periodic interruptions from little overlays telling me that $someApp has been updated and needs me to agree to new terms (yes, really!).
To regain remote control for automation, I use the serial port. I have an ESP32 connected to a mmWave sensor for active "at desk?" detection. This is integrated with Home Assistant, which knows which PC my KVM is pointing to and whether it's on. This lets me re-implement basic "if not at desk and no PC is on, put the display to sleep" automation.
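The glue logic is tiny. A sketch of that rule against Home Assistant's REST API, reusing lg_command() from above; the entity IDs and token are placeholders for whatever your setup exposes:

```python
# Poll HA for presence + KVM state and blank the screen when both are idle.
# "kd" (screen mute) turns the panel off while leaving the set controllable.
import time
import requests

HA = "http://homeassistant.local:8123"
HEADERS = {"Authorization": "Bearer <long-lived-token>"}  # placeholder

def state(entity: str) -> str:
    r = requests.get(f"{HA}/api/states/{entity}", headers=HEADERS, timeout=5)
    return r.json()["state"]

while True:
    at_desk = state("binary_sensor.desk_presence") == "on"  # mmWave via ESP32
    pc_on = state("binary_sensor.kvm_active_pc") == "on"    # KVM-derived
    if not at_desk and not pc_on:
        lg_command("/dev/ttyUSB0", "kd", 1)  # screen off
    time.sleep(30)
```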
My biggest complaint is more of an ecosystem issue; why is DisplayPort not common on TVs? Because this TV-as-a-monitor is HDMI only, my KVM has to be HDMI and so does every PC that's hooked up. Would have been a lot nicer if the whole chain could be DisplayPort :/.
WebOS is trash too.
Probably going to buy a Sony next time.
You can disable most of the WebOS trashiness by Googling and digging through the settings. Once you get all the ads fully disabled, the OS is extremely clean and snappy.
And FYI, Sony's OLED panels are made by LG. The Sonys are a bit better because of the software, but they're almost always more expensive; if you can score a good deal, though, they're definitely the way to go.
I just can’t understand why there is a need for a remote app to do anything besides open straight to a TV remote. Well, I can, the poison apple of advertising as an additional revenue stream, but it’s still infuriating. I have an older Roku TV and that app has progressively gotten worse. It used to be ideal: start, auto-connect to the last connected TV, and immediately show the remote. Now it’s a bunch of promotional content by default and you have to tab over to the remote. LG’s is far more obnoxious and difficult to navigate. Absolutely inexcusable for displays that can cost $2500+.
And don't even get me started on creating an LG account just to get anything working on the TV, like downloading a system update.
I have a C1, and I got the technician's remote to try this. But it didn't work in my case; it seems that only some of them use the same hardware, probably based on supply chain needs. Still, though, amazing screen. Takes a bit of messing around with picture settings (there are some good guides online), but I've never found the "TV" parts to get in the way: I just connected it via HDMI, put it in PC mode, disabled Wi-Fi, and it was good to go. I guess I've been using it around 4 years now.
The only serious issue is the shininess of the screen. It's not terrible but I did have to rearrange my office a bit to make sure it wasn't facing a window.
Bought and connected an Apple TV; I always switch on the TV with that. Most problems solved.
I personally love the Art Mode, but while browsing the service menu I've noticed that you can permanently disable it. You can make the secret menu appear by pressing some special combination or by pressing 2 buttons on the service remote[0].
[0] https://www.amazon.com/AA81-00243A-Replaced-Service-Control-...
I actually like the idea of art mode, but I'd only want to use something like that if it were a passive technology like e-ink. Otherwise I think the electricity use and wear and tear on the display would eat at me. The device is well built and the presentation is lovely, but I just can't stomach the idea of it burning electricity all the time. (I don't know what its standby draw is, sadly. I do have a lot of stuff on power strips because I worry about standby draw. You're making me realize that this TV, being built-in to the wall, has escaped that scrutiny.)
(In Proton's Wireguard Configuration Wizard, I've selected "Block malware, ads, & trackers" - see: https://protonvpn.com/support/netshield)
At that point, you need more complex routing than what a simple DNS blocklist can provide via Pi-hole, and if you want good throughput, you're going to want real networking hardware and not an RPi.
What is the method you mention? A top Google result seems to be [1], which says
> All release versions of webOS 9 ("webOS 24") are patched. This means 2024 models and older TVs that have been upgraded to webOS 9 will require another exploit such as faultmanager-autoroot [2].
and [2] says
> As of 2025-08-24, the latest firmware for essentially all LG models running webOS 5, 6, 7, and 9 is patched.
[1] https://github.com/throwaway96/dejavuln-autoroot
[2] https://github.com/throwaway96/faultmanager-autoroot
Is this firmware bit flip known? I couldn't find anything on Google.
edit: Apparently I specifically have C3PUA, according to the model data I added. Also, if anyone is interested in this, I can update the README, because I didn't change it after I forked it.
Apparently the only fix is to disable it in your source, but it works like 75% of the time and I'd hate to lose the excellent picture quality of Netflix and YouTube via Google TV.
YMMV.
I went with Samsung QN90C instead and I'm super happy with it. It's very bright, fights glare well, and there's Jellyfin for it.
I initially did it for Jellyfin before they made it into the official app store, but the Moonlight game streaming app has unlocked many hours of entertainment.
1. https://cani.rootmy.tv
2. https://www.webosbrew.org/
https://github.com/satgit62/How-to-Install-and-set-up-Ambili...
doesn't need to go through another device to capture the HDMI, it's built right in!
Only time they get used is when I'm playing Fortnite. I had Huenicorn set up for NixOS, but I haven't bothered trying again in SteamOS.
I use Moonlight via direct 1 gbps Ethernet from a high-end gaming PC in the same house through a Google Chromecast 4K HDMI dongle with a powered USB-C hub for the RJ-45 input, and it works flawlessly at 60 fps 4K 10-bit HDR with around 12 ms video latency. Some USB 3 hubs and USB Ethernet dongles won't reach full speeds on some streaming devices' USB ports. The second one I tried worked at full 1 gbps.
You have to verify that every software and hardware component in the chain is working at high speed/low latency in your environment, with a local speed test hosted on your source machine. I used self-hosted OpenSpeedTest. Moonlight works great, but no consumer streaming stick or USB hub/RJ-45 dongle is tested for high speed/low latency across the dozens of different device/port hardware/firmware combos, so you can't trust claimed specs. Assume it's slow until you verify it's not.
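OpenSpeedTest is the easy route; if you want something truly minimal, even a raw TCP push between the two endpoints will expose a slow hub or dongle. A rough sketch (run "server" on the gaming PC, "client <host>" on the streaming box):

```python
# Bare-bones one-way TCP throughput check; not a real benchmark, just a
# sanity test that the path can sustain streaming bitrates.
import socket
import sys
import time

PORT, CHUNK, TOTAL = 5201, 64 * 1024, 256 * 1024 * 1024  # push 256 MiB

if sys.argv[1] == "server":
    conn, _ = socket.create_server(("", PORT)).accept()
    sent, buf = 0, b"\0" * CHUNK
    while sent < TOTAL:
        conn.sendall(buf)
        sent += CHUNK
    conn.close()
else:  # usage: client <host>
    c = socket.create_connection((sys.argv[2], PORT))
    t0, got = time.time(), 0
    while (data := c.recv(CHUNK)):
        got += len(data)
    print(f"{got * 8 / (time.time() - t0) / 1e6:.0f} Mbit/s")
```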
I'd say definitely give it another go.
UnRaid + KVM VM + GPU Passthrough with Moonlight has meant I no longer have to dual boot to game.
60 FPS at 1080p on a 4K screen. 4K struggles, but I think that's more my GPU than anything else. I do have 2x of them.
I guess you can mitigate that if you use something like a Pi-hole? I do wish there was a solution using root/devmode to block ads (or better yet, run in whitelist mode!).
However, if you do have a Pi-hole/AdGuard Home, this list does get rid of all the ads: https://gist.github.com/d4kine/b2458cc9d693d7d36193be0247094...
Still, would love an "opensnitch" in whitelist mode for my TV!
But it has worked, blocking the ads since 2023, so that's something.
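If you deploy a list like that, it's worth spot-checking that the resolver is actually sinkholing things. A quick sketch; the hostnames here are just illustrative examples of the kind of entries on those lists, so substitute real ones from the list you use:

```python
# Check whether ad/telemetry hostnames resolve to Pi-hole's sinkhole.
import socket

SUSPECTS = ["ad.lgappstv.com", "us.info.lgsmartad.com"]  # illustrative only

for host in SUSPECTS:
    try:
        addr = socket.gethostbyname(host)
        print(f"{host}: {'BLOCKED' if addr == '0.0.0.0' else 'resolves to ' + addr}")
    except socket.gaierror:
        print(f"{host}: BLOCKED (NXDOMAIN)")
```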
https://pro-bravia.sony.net/develop/app/getting-started/inde...
I use mine as a dumb TV but the built-in smarts are serviceable.
https://www.home-assistant.io/integrations/braviatv/
Here's a nice reference for a lot of the stuff installed on Bravia that you can elect to remove via adb:
https://github.com/therealhoodboy/skinny-bravia
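The removal itself is just adb over the network once developer options are enabled on the set. A sketch; the IP and package name are placeholders (the repo above lists real candidates), and "pm uninstall --user 0" only removes the app for the current user, so a factory reset brings it back:

```python
# Debloat an Android TV / Bravia over network adb.
import subprocess

TV_IP = "192.168.1.50"                    # placeholder
PACKAGES = ["com.example.preloaded.app"]  # placeholder; see the repo's list

subprocess.run(["adb", "connect", f"{TV_IP}:5555"], check=True)
for pkg in PACKAGES:
    subprocess.run(["adb", "shell", "pm", "uninstall", "--user", "0", pkg])
```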
I have 20/20 vision, and I really can't tell the difference between 1080p and 4K for video games and movies. I will never do below 4k again on a desktop, but 1080p is more than fine for a TV. Higher framerate makes a far bigger difference than higher resolution for video games too.
HDR is indeed effectively a marketing gimmick on many cheap TVs. They are getting better, though.
It’s why even non-4k BluRays sometimes look better than streaming.
Such as it is, I use 3x 1080p displays. It's fine for me, and approximates a larger curved super-wide display (while also being cheap). She does just fine with 1080p resolution however - rarely has more than 2-3 windows on screen at a time.
Not everyone suffers from FOMO.
I've only seen one movie that was worth the bother and expense of seeing it in 4K (Rear Window).
The rest of the things you mention are mostly for a very small slice of theoretical people with perfect vision in perfectly lit rooms at the perfect height and viewing angle.
Beyond icons on a sticker checklist, they mean nothing to the 99% of people who just want to watch sportsball or eat popcorn while watching Disney films with their kids.
You can put lipstick on a pig, but most people are still watching pigs.
You can scan film into whatever digital resolution you want. You could do an 8K scan if you felt like it. You might run into issues where the resolving power of the film is less than the scan, but 4K is not an unreasonable resolution to pull out of well-lit, studio-shot movie stock.
Or something like that. Someone more in the know please check my math.
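Rough numbers, for what it's worth: camera negative is commonly quoted at around 50 to 80 line pairs per mm, and a Super 35 frame is about 24.9 mm wide, so 24.9 mm × (50 to 80) lp/mm × 2 pixels per line pair comes out to roughly 2,500 to 4,000 pixels of horizontal detail. That puts a good 35mm negative right in 4K territory, consistent with the claim above.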
I can remember when the Nintendo Wii came out, and people I know were damaging things when the remotes would go flying. It's like the Wii release every day in a house with kids. My brother-in-law is on his third TV in 5 years.
For all practical purposes, it is just a dumb HDMI display attached to my computer.
My opinion: in some cases, the difference between expensive and cheap boils down to the picture controls being intentionally limited for marketing effect.
So the cheap model maxed out looks like the more expensive model at medium. People can recognize the difference in the store so they opt for the more expensive one. But the actual displays themselves are virtually identical.
It may actually be cheaper to make one grade of display and differentiate using the controls.
This may seem like a good thing, but it also usually enables a "vibrant" postprocessing picture mode, motion smoothing, and maximum brightness so the display looks good in a well-lit big-box store. Unless your viewing environment is similar (or you don't care so much), that's probably not what you want.
I can imagine that there would be a potential to generate interpolated frames that intelligently make fast-moving scenes more understandable while leaving slow-moving scenes more or less at their intended 24 FPS.
Many action movies, especially with close hand-to-hand combat in tight spaces, are difficult to understand visually because 24 FPS just doesn't quite catch the movements.
I sort of don't like it (old man shakes fist at sky: "I want my frames to be real"), but they are getting amazing results.
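The selective part of that idea is easy to picture in code. A toy sketch: estimate inter-frame motion and only synthesize an in-between frame for fast scenes, leaving slow ones at their native 24 fps. Real interpolators are motion-compensated; the 50/50 blend and the threshold here are just stand-ins:

```python
# Toy motion-adaptive frame insertion over a stream of HxWx3 uint8 frames.
import numpy as np

MOTION_THRESHOLD = 12.0  # mean abs pixel delta on a 0-255 scale; arbitrary

def interpolate_fast_scenes(frames):
    prev = None
    for frame in frames:
        if prev is not None:
            motion = np.abs(frame.astype(np.int16) - prev.astype(np.int16)).mean()
            if motion > MOTION_THRESHOLD:  # fast scene: insert a tween frame
                yield ((prev.astype(np.uint16) + frame) // 2).astype(np.uint8)
        yield frame
        prev = frame
```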
I want a TV for her that will power-on directly to YouTube-TV, and that's it, nothing else, no notifications, nothing.
The main downside is that there is currently no great "ten foot" UI for this use case on Linux. But the KDE Bigscreen project is being revived and could offer a definitive solution for free TVs.
I’d go with a basic monitor and factor out the “smart TV” into whatever device you prefer – Apple TV, Chromecast, Firestick, any SBC with Kodi loaded onto it… an Xbox… why couple the smart features to the display?
Worth clarifying that when I was a kid "TV-size" meant anything above 13", but the times have changed considerably. :)
I managed to grab a 55" 8K LG before 8K went out of fashion. I run it at 4k120 for games and 8k60 with doubling for productivity.
I've never had a better monitor and if one should exist it's not available in any store I know about. Monitors costing 2-3x as much as this TV did back then are worse. When it dies I will have to downgrade. :-/
I'd say yes and no - they are becoming a thing - again. And you're right that they are prohibitively expensive this time.
Some 5 years ago, 8K TVs were heavily marketed and displayed in many electronics stores, but consumers apparently didn't bite: basically no 8K content was available, and for "normal" TV use you can barely see a difference between 4K and 8K anyway.
So these TVs were very cheap for a short while before they basically disappeared.
And they make for great PC monitors. At normal working distance from a monitor you definitely notice the difference between 4k and 8k.
The screen area is basically the same as a 2x2 grid of 27" 4k monitors, but in one screen. For productivity work it's absolutely glorious, text is super-crisp.
I think there is a market structure problem that blocks progress here. Most people who work all day at a monitor would love to have such a screen, but the people paying for screens buy what the producers are selling, based on price.
So we end up with dual or triple small-monitor setups even in the wealthiest companies. If a few of the FAAMGs started asking for a 50" 8K maybe something would happen, but it hasn't yet. :(
So you find that TV panels are much larger at lower price points than computer monitors because they serve different purposes.
Those are not exactly hackable, are they?
The Arm SoC is the really interesting part here, as it also has Wi-Fi and Bluetooth interfaces, Ethernet, and USB port(s). They're like a giant black-box Raspberry Pi. If we could get our hands on the SoC datasheet, then it's possible we could flash that SoC to run whatever OS we want and actually have a Smart TV instead of a spyware and malware vector. Though I am sure no TV maker would ever let the plebs disable their money-making spying and data exfiltration schemes.
Most LED backlights are wired in such a way that when one LED fails, it bricks a significant portion of the panel backlight. You'll knock out an entire row or a huge portion of neighboring backlight LEDs when one fails. Basically, it's a cheap way to ensure a whole row of LEDs has the same brightness, but the tradeoff is that one LED fails and it looks like 5% of your screen went dark.
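For a sense of the numbers: a string might be a dozen ~3 V LEDs in series off a roughly 36 V rail, so one LED failing open stops current through all twelve, and that whole stripe of backlight goes dark at once.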
It seems like a good beginner-intermediate thing that'd be approachable to learn with a basic multimeter and beginner level soldering skills.
Surely it's more straightforward to buy an SBC yourself and plug that into your TV? Even if you could flash it, dealing with a random SoC/hardware seems not worth the hassle compared to shelling out $50-200 for an SBC that you picked and that can be carried between TVs. Flashing third-party ROMs like LineageOS makes sense because there's no real alternative for smartphone hardware, but the same isn't true for smart TVs.
Of course it is. Though my point is we already have the hardware in the TV, and it would be awesome to actually use it the way we want to use it. Also, I have two dumb TVs, each with a small PC hooked to it, and they haven't moved in years.
Honestly, all the onboard TV OS stuff I have interacted with in the last decade has been more or less terrible, and I wouldn't even consider it when buying a TV, especially one that is just going to be a screen. All of the recent installs I've dealt with (family and friend support) have revealed a ton of pay-to-play features (Samsung Frame TVs, cough cough). I applaud you for wanting something neat, but I can't say I've come across anything I've ever actually wanted to use beyond "select input -> HDMI1"...
Just never, ever connect the TV to the internet. Connect up an Nvidia Shield, or a mini-PC/Raspberry Pi configured with whatever apps you desire, hidden behind a Pi-hole. Connect a Steam Deck if gaming/Linux desktop usage is your thing. I only touch the TV remote to switch on the TV, and even that could be automatable with Home Assistant + CEC if that's of interest.
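For the CEC route, a minimal sketch using the python-cec bindings (requires libcec and a CEC-capable HDMI port, like a Raspberry Pi's); Home Assistant could invoke this as a shell_command, or you can use its native CEC integration instead:

```python
# Turn the TV on/off over HDMI-CEC from the box behind it.
import cec

cec.init()                         # open the default CEC adapter
tv = cec.Device(cec.CECDEVICE_TV)  # logical address 0 = the TV

if not tv.is_on():
    tv.power_on()
# tv.standby()  # and back to sleep later
```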
You generally don't want a smart tv you can hack. You want a decent computer you own sending signal through the external inputs.
The SBC in the TV is, hands down across basically every "smart" TV I've interacted with, a cheap piece of crap (even well into the "expensive" brands and models).
Manufacturers stick the absolute cheapest garbage in there that can output the advertised resolution during playback without stuttering.
So you can spend hours/days/weeks wrestling this cheap, underpowered board back from the manufacturer... or you can just side-step it entirely and spend much less time and effort sticking a decent computer you own behind the TV.
All my TVs have an Apple TV on them and that's all that is used (aside from a game console here and there). I pretty much never need to interact with the TV OS. Is there a Netflix app on my TV? Probably, I'll never know, I've never even launched the app store.
It's been a bit since I've done this (I'm not watching live TV anymore), but something like HDHomeRun worked fine.
It basically pairs an antenna with a small computer to convert to network traffic, then gives you an app on your streaming device to play it back.
You do need to be able to run the vendor's app, and you'll get stuck with that UI for live tv (So yeah - totally agree that you're compromising the UX). But still no reliance on the "smarts" built into the tv.
https://www.rtings.com/tv/reviews/best/by-usage/pc-monitor
but that might be a bit too basic, because it's basically a monitor with a 100% duty cycle and a PC.
- Samsung Tizen is sluggish.
- LG webOS is fiddly, and I don't feel it gets enough attention and care from LG.
- Other brands are just slow.
- On hackability, Android is far easier to handle than any other platform.
Mind you, Sony has a few lines; some run Android, some run something else.
"Alexa, turn on Living Room TV" -> HA -> Harmony -> IR Blaster
I think through the Apple TV integration I can control them even further but I greatly prefer just using the Harmony remote. I'm not looking forward to the day when those stop working completely and I have to evaluate other options. Every year or so I look around but nothing beats my old Harmony remote (with a coin battery that lasts so long I've lost track, easily over a year) and the Harmony Hub (which actually sends out the signals).
Otherwise it will run out of updates fast, services will stop working, and the only way to fix that is to buy... a separate device.
This also lets you make searching easier, as you can just look at the panel itself when comparing.
I think I have a framework-like TV. It's a high-end TV set to store mode, which has no smart-OS annoyances. From there, I have expansion modules (they connect via HDMI) like an HDFury Vertex with CFW, Nvidia Shield, PS5, etc.
Decoupling the TV from the OS has helped a ton with longevity.
What more are you looking for?
Then I can plug that into any dumb TV/beamer I find.
My dream is to hack that SoC to boot whatever OS. Though good luck getting the datasheets...
At the end of the day, it's just an Android device with an HDMI out, and that's exactly what I wanted.
I didn't write the code but it seemed like you can get a development account from Tizen and write your own apps.
To be clear, Tizen is not a brand of TV, it's the name of the OS. It's fairly common on various no-name hardware brands; check it out.
RTINGS actually tracks this, with most being comparable to monitors at the same refresh rate, while in game mode (around 10x faster than non-game mode). [1]
4k@120Hz with VRR is even available in < $1k TVs these days!
And, for audio latency, unless you're using the built in speakers, it's fairly trivial to make the video and audio paths independent.
[1] https://www.rtings.com/tv/tests/inputs/input-lag