
Wednesday, August 07, 2024

Consumer Alerts: Netflix Plan Change; Costco Tandoori Wrap

The email the other day told me that the plan I'm on with Netflix is being discontinued.  They announced that I would save 40%.  But then, in the smaller print, it said I'd get ads at that lower price.  

A few short ads.  Few is pretty vague.  So is short.  Anything over 5 seconds is too long for me.  And in the middle of a movie?  That's sacrilege. 

"Designed not to interrupt you during a scene" - So does this mean at the end of the scene, but in the middle of the movie, they will interrupt?  Totally unacceptable.  

My current bill is $11.99 per month.  That's up from $9.99 a month not that long ago. [I looked online.  Seems they announced the increase in June 2023 and it went into effect in October 2023, best as I can tell.]

That's less than a year ago.  Can we expect annual bumps from now on? 

Compared to going to the theater, Netflix is a great deal.  So great that we find we're spending way too much time watching.  At least we limit it to after dinner, generally not starting until 8:30 or 9:30, and trying to end around 11pm.

But as I think about it, we lose a lot of reading time and a lot of time when we used to talk to each other.  And I have noticed that blogging gets cut back by Netflix.  

So I replied that we did not want ads and were ready to cut loose from Netflix. 

I got another email - My current plan would end September 30.  I replied that our Netflix addiction would end September 30.  Of course, the emails from Netflix were not ones you could reply to and I got notices that they weren't delivered.  

Prices go up because people are willing to suck it up and keep paying.  In this case I need to figure out how to let Netflix know that I don't plan to pay after September.  


Meanwhile, I had to go to Costco to get a repair on one of my hearing aids - which they did and it worked.  But as I gathered some fruits and veggies and fresh salmon, I saw some Tandoori Chicken Wraps.  Looked good and they had a $2 off sign, so I thought we could try them.  

Today, when I looked to see if and how to heat them up, I saw there were no directions.  Just the longest list of ingredients I can remember ever seeing.  



From what I could tell checking a Reddit discussion, you were supposed to eat them cold.  We did.  

Boring!!  (Does it make sense to put exclamation points after boring?  Probably not.)  Despite all the ingredients, it didn't really taste like anything.  It was mushy. Avoid.  

Back To Netflix 

And if you have Netflix, and you're also unhappy about this, you can go to manage your account and play around until you find the contact button.  Then you have a choice of phone or chat.  

I chose chat, because I can take screenshots of what was said, but I'm pretty sure it was a bot responding.  When, at one point, I asked how many siblings they had and where they were in the birth order, the response was:

"I'd be happy to answer Netflix-related questions today. Do you have any questions about your account or our service?"

At the end when they asked if I had more questions, I said that they hadn't answered whether they were a bot or human and the answer was "I am a human."  Must be depressing having people think you aren't human all day - assuming that was true.  

Maybe we need to have legislation requiring customers be told whether the chat or voice they are talking to is a human or not.  With consequences if they lie.   

I'd suggest people go into their accounts and tell Netflix they plan to cancel at the end of September (or whenever their current plan ends).  If enough people do that, perhaps Netflix will reconsider.  And you can always rejoin later if you have severe withdrawal symptoms.  

 

Monday, June 10, 2024

AI Scraping My Blog?

My StatCounter account has been showing this frequent Hong Kong visitor:


Total Sessions usually records how many times the computer has visited, but it says only 1, even though this visitor accounts for five of the 20 hits on the StatCounter report.  It's been showing up frequently for weeks now.
I know, I said five, but they are scattered through the report.  The one on top is one; here are three more, and there was one more.  


I've had this sort of thing before, but it's been a while.  In the past, the assumption was that they were scraping content.  Now I'm wondering if it isn't an AI bot gathering material for training.  If so, what should I do, and how?  From Duda:

"How to Block AI Crawlers from Crawling your Site

Some site owners are choosing to block AI crawlers, such as ChatGPT and Bard from crawling their site in order to prevent it from learning from or using their website content. You can block these AI user-agents in a similar manner as you would block Google crawlers; by replacing the default robots.txt file with a new file that specifies disallow rules for specific AI user-agents."
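
For example, from what I've read, a robots.txt along these lines should tell the better-known AI crawlers to stay away.  GPTBot is OpenAI's crawler, Google-Extended is the token Google uses for AI training, and CCBot is Common Crawl's; this list isn't exhaustive, and a crawler can simply ignore the file.

User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /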

When I first started blogging, I spent a lot of time learning about (and blogging about) technical aspects of blogging - how to find out if anyone is reading the blog, how to embed photos and videos, how to change the format, how to add an email address, etc.  

Now AI is raising other issues.  Such as how to block AI crawlers from using your site to train its bots.  

This is not what I want to spend my time on.  First, the internet is telling me I have to block each crawler separately by adding rules to the robots.txt file.  


Should You Block AI Tools From Accessing Your Website?

Unfortunately, there’s no simple way to block all AI bots from accessing your website, and manually blocking each individual bot is almost impossible. Even if you keep up with the latest AI bots roaming the web, there’s no guarantee they’ll all adhere to the commands in your robots.txt file. 

 From Google Search Central:

"You can control which files crawlers may access on your site with a robots.txt file.

A robots.txt file lives at the root of your site. So, for site www.example.com, the robots.txt file lives at www.example.com/robots.txt. robots.txt is a plain text file that follows the Robots Exclusion Standard. A robots.txt file consists of one or more rules. Each rule blocks or allows access for all or a specific crawler to a specified file path on the domain or subdomain where the robots.txt file is hosted. Unless you specify otherwise in your robots.txt file, all files are implicitly allowed for crawling."


That means I have to find the robots.txt file and add stuff and hope I do it just right so I don't screw something else up.  But this site also warns:

"If you use a site hosting service, such as Wix or Blogger [That's me], you might not need to (or be able to) edit your robots.txt file directly. Instead, your provider might expose a search settings page or some other mechanism to tell search engines whether or not to crawl your page."

Of course, I don't want to block regular search engines, or only subscribers will ever see my posts.  
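
The rules are per user-agent, though, so blocking GPTBot or CCBot shouldn't touch Googlebot or Bingbot.  Blogger does have a custom robots.txt option buried in its crawler settings (at least it did when I looked), and something like this should, in theory, leave ordinary search engines alone while turning away the AI crawlers:

User-agent: *
Disallow:

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /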

So I'm asking myself, is this worth the time it's going to take to figure out?  Well, someone else asked that too.

"The real question here is whether the results are worth the effort, and the short answer is (almost certainly) no."

Here's another one saying the same thing:

"At the end of the day blocking ChatGPT and other generative AI crawlers is really a matter of choice. Depending on your website’s purpose and/or your business model it may make sense to. But in my opinion the vast majority of sites have nothing to fear from allowing AI crawlers to crawl their site."

For now, I want to agree with this advice.  But then I start thinking that this was written by an AI firm that wants to steal your content.   

And I don't even know if that Hong Kong visitor is scraping material for some AI enterprise.  Maybe it's just stealing content.  

Like your car, your house, your garden, your teeth, everything needs some maintenance to keep it functioning.  Clearly my phone and computer do, and this blog does as well, though I've avoided that for some time on the blog.  

I'm now officially putting myself on notice to pay more attention to AI.