Berkeley CSUA MOTD:Entry 28205
2003/4/23-24 [Recreation/Media] UID:28205 Activity:high
4/23    An analysis of Netflix's DVD allocation system
        http://dvd-rent-test.dreamhost.com
        \_ Yeah like everyone else hadn't figured it out 2 years ago that
           newer customers get priority.
           \_ I found this useful. I was considering getting Netflix but
              I will stick with the local video store after reading the
              article since our use would be very high.
              \_ Uhm, yeah, two years ago, etc, etc.
              \_ No, you don't understand.  dim said you'd be a goober if
                 you paid attention to that site.  Your only hope is to go
                 strong with Netflix if you want to be redeemed in the eye
                 of dim.
           \_ You're still a customer after 2 years?
              \_ Never was a customer but even so I still knew two years ago
                 that netflix was heavily weighted to newer customers.
           \_ there's more of interest there than just this.  It was a
              very interesting site, and explains a lot about why
              I had so many movies on long waits.  (i rented MANY movies).
               to the OP, thanks.
              \_ No there isn't.  The fact is new = get more films faster,
                 old = get films slower.  Two years, etc, etc, etc.
                 \_ Your comment implies that you didn't actually read the
                    article.
                    \_ reading is unnecessary.  the dim has already spoken.
                    \_ what's to read?  it's a fucking dotcom scam job that'll
                       be gone when the vc runs out and everyone catches on to
                       the new-customers-get-movies-first scam.  I worked for
                       a few dotcoms, a few of which had a better scam than
                       netflix yet they're gone now.  there can't possibly be
                       anything interesting in an article on some dotcom.
              \_ The guy who did that site is a goober and by extension
                 so are you. --dim
                  \_ awww, shucks.
                 \_ We should all go home and hang ourselves now.  The dim
                    has spoken.
                    \_ I'm already hung.
                       \_ Like a horse, a bear, a gorilla or a mouse?
                          \_ like a tree.  "grab the lowest branch and
                             climb on up..."
                             \_ whoa baby!  got pics?!
                                \_ http://journalism.berkeley.edu/graphics/campanile.jpg
Cache (8192 bytes)
dvd-rent-test.dreamhost.com
As your cost-per-disc to Netflix increases due to more frequent rentals, you will have less of a chance of receiving a low-availability movie compared to an individual who has a lower cost to Netflix. Put another way, if customers X and Y are both in the 5-disc-out plan and X rented 14 discs vs. Y's 11 discs in the previous month, Y would have priority over X when they are both competing for the same movie. A side effect of this is that trial and new customers will have far fewer problems getting movies, especially new releases, versus the majority of established customers. Essentially these new and low-cost customers can "cut in line" ahead of other customers. While Netflix initially did not admit the practice, Netflix public relations and some of their customer service representatives are now acknowledging that customers who rent fewer movies receive priority.

Introduction

Around January 2003 I started seeing "wait times" on numerous movies in my queue skyrocket. In particular some movies which were recent, mainstream, and well advertised releases were impossible to get. It just did not make sense that some of the popular movies in my queue were so hard to get. The first thing I did was create a new trial account for my wife. I added three movies that I was having difficulty getting in my account. The following table shows the differences in availability between the two accounts:

Movie                 Original Account ("A")   New Trial Account ("B")
About a Boy           Short Wait                Now
Sweet Home Alabama    Very Long Wait            Now
Trees Lounge          Long Wait                 Now

The snapshot was taken when the trial account was opened on the evening of 3/5/2003. Both accounts use Colleyville, TX as their service center. The difference in availabilities translates directly into your ability to receive a movie. About a Boy and Sweet Home Alabama were shipped to account "B" on 3/6. But note that I had only kept these three movies in the "B" queue during the early days of my testing. I did test purging my "A" queue down to a few movies and this had no effect on availability. Does Netflix sit on your queue, or give you one of these movies anyway?

As an aside, Serving Sara and Undercover Brother are, in my humble opinion, terrible movies and I'm just a little embarrassed they were in my queue above. Not that Sweet Home Alabama is Oscar worthy, but who can't smile at Reese Witherspoon?

My initial thought was that new accounts are given priority, to get you hooked on the service and make your trial and perhaps first paid month a pleasant experience. And in a way they are, but I think through a different route than simply account age. More on what I think is going on behind the scenes at Netflix later.

By training I am a software engineer who has also dabbled in product management, software architecture, and low-level IT management. It has been a very long time since my basic statistics courses in college. My intention is not to create the perfect scientific study of this. My goal was simply to understand and document some very odd behavior in the Netflix system with as little effort as possible.

Automation Overview

To investigate this anomaly further I set about creating some Perl scripts to:
  * Add (and later delete) a set of test movies to other account queues
  * Extract queue listings
  * Extract billing history
  * Extract 90 day rental history
  * Compare and summarize the differences between each test account
I am not making the scripts available.
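Since the scripts themselves were never published, the following is only a rough sketch of the comparison step they performed: Python standing in for the author's Perl, with an invented {title: availability} queue layout and sample data taken from the table above. None of the names here come from the actual scripts.

    # Illustrative sketch only -- the author's Perl scripts were not released.
    # Assumes each account's queue has already been scraped into a simple
    # {movie_title: availability} mapping; that layout is an assumption.

    def compare_queues(queue_a, queue_b):
        """Return movies whose availability differs between two accounts."""
        shared_titles = sorted(queue_a.keys() & queue_b.keys())
        return [(t, queue_a[t], queue_b[t])
                for t in shared_titles
                if queue_a[t] != queue_b[t]]

    if __name__ == "__main__":
        account_a = {"About a Boy": "Short Wait",
                     "Sweet Home Alabama": "Very Long Wait",
                     "Trees Lounge": "Long Wait"}
        account_b = {"About a Boy": "Now",
                     "Sweet Home Alabama": "Now",
                     "Trees Lounge": "Now"}
        for title, avail_a, avail_b in compare_queues(account_a, account_b):
            print(f"{title}: A={avail_a}  B={avail_b}")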
I imported the comparison results into Excel and further summarized and charted them. You could launch this workbook to view the charts directly within Excel. An Adobe Acrobat version of the Excel workbook is available here. Three of these movies eventually shipped to account "A" or "B" during the period I was collecting data, so in the end there were only 43 movies that were in common to all five accounts during the entire test period. Keep in mind that most movies in my queue had a "Now" availability. The following table summarizes the state of my queue before testing.

Availability     Number of Movies   Percentage of Total
Now              214                73%
Short Wait        48                16%
Long Wait         19                 6%
Very Long Wait    12                 4%
Total            293

This report does not attempt to determine what percentage or class of movies Netflix has availability problems for. It focuses solely on how movies are allocated when there are availability problems. The five test accounts used three different Netflix service centers. A service center is the place where you are most likely to receive a DVD from and where you return your DVDs to. Two of the accounts were in the 5 movie out plan ("A" and "C") while the rest were in the 3 movie out plan ("B", "D", and "E"). I eventually used the UN*X cron scheduler to collect queue data every evening at 8:00 PM CT. Netflix batch processing seemed to be finished by this time, with availabilities changing Sunday through Thursday evenings. Note that in a few cases these pre-cron batches were run after midnight and then another batch was run in the evening on the same date. This does not cause any problems but I wanted to point this out in case anyone inspects the raw data closely.

The Results

To aid in comparing the availability of movies in one queue to another, I created an "availability score" for each queue. I assigned a weighting to the different levels of availability:

Availability     Weighting
Now              0
Short Wait       1
Long Wait        2
Very Long Wait   3

A queue was then assigned an overall "availability score" by adding up the weighting of each movie. So lower availability scores mean more movies are available, higher scores mean more movies are unavailable. The following table illustrates the differences between the five test accounts for one sample day (March 14th). The availability score is shown in parentheses beside the account ID.

Movie                            A (46)  B (12)  C (27)  D (27)  E
A Walk on the Moon               Short   Now     Now     Now     Now
Alfie                            Long    Short   Long    Long    Now
Alfred Hitchcock Collection #3   Long    Now     Now     Now     Now
Better Than Chocolate            Long    Short   Long    Long    Now
Boiler Room                      Short   Short   Short   Short   Now
Bonnie and Clyde                 Now     Now     Now     Now     Now
Brazil                           Short   Now     Short   Short   Now
Brief Encounter                  Short   Short   Short   Short   Short
Citizen X                        Long    Now     Now     Now     Now
Dancer in the Dark               Short   Now     Now     Now     Now
Dead Again                       Now     Now     Now     Now     Now
Double Indemnity                 Short   Now     Short   Short   Now
Drugstore Cowboy                 Short   Now     Short   Short   Now
Emma (Miniseries)                Short   Now     Now     Now     Now
Foxy Brown                       Long    Now     Long    Long    Now
Ghost Dog: Way of Samurai        Short   Now     Now     Now     Now
Go Tigers!                       ...

In the example above "C" & "D" have the exact same availability status for each movie, yet one is in Houston, TX and the other is in Lansing, MI. As discussed earlier, at first I thought that account age might be used to determine availability. However, account "E" was not really being used. The account was 13 months old and had an extremely low (good) availability score. My brother had set up my Mom with an account and she was just sitting on movies.
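To make the scoring concrete, here is a minimal sketch of the availability score as described above. The weights are the ones from the table; the function name and queue layout are illustrative assumptions, not the author's code.

    # Minimal sketch of the "availability score" described above; lower is better.
    # Weights come from the article's table; everything else is illustrative.

    WEIGHTS = {"Now": 0, "Short Wait": 1, "Long Wait": 2, "Very Long Wait": 3}

    def availability_score(queue):
        """Sum the weights of each movie's availability in a {title: availability} queue."""
        return sum(WEIGHTS[availability] for availability in queue.values())

    # A queue holding one Long Wait, one Very Long Wait, and one Short Wait
    # title scores 2 + 3 + 1 = 6.
    example_queue = {"Trees Lounge": "Long Wait",
                     "Sweet Home Alabama": "Very Long Wait",
                     "About a Boy": "Short Wait"}
    assert availability_score(example_queue) == 6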
Needless to say I have since cancelled the account: those were some very expensive rentals!

I believe the determining factor is the number of movies you rented during your last billing cycle, or possibly the last 2 or 3 billing cycles. The following chart tracks availability scores during the entire test period for accounts "A" and "B". As I started testing I stopped returning movies from account "A" and started using account "B" exclusively. Thus over time A's number of rentals decreased and B's increased. The chart dramatically shows how on the first day of the new billing period for "A" (March 31st), the availability score dropped from an average of 42 for the previous billing period to an average of 25 for the new billing period. Likewise, when a new billing period kicked in for "B" (April 15th), its availability score jumped from 13 to 40.

Each billing cycle is plotted separately and has a different symbol for a data point. For example, "A1" is in one billing cycle while "A2" is in another. For the test account I created for my wife, I showed the 10 day free trial as B1 and the first paid month as B2. Accounts in the 3 movie plan are shown in orange while those in the 5 movie plan ar...
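If the article's conclusion is right, the allocation decision behind these billing-cycle swings could be approximated by something like the sketch below. This is a reconstruction of the inferred behavior, not Netflix's actual code; the names and data structures are made up, except for the 14-vs-11 example taken from the top of the article.

    # Reconstruction of the allocation rule the article infers -- NOT Netflix's
    # real algorithm: when several customers are waiting for the same scarce
    # disc, the one who rented the fewest discs in the current billing cycle
    # (i.e. the cheapest customer to serve) gets it first.

    def allocate_disc(waiting_customers, rentals_this_cycle):
        """Pick the waiting customer with the fewest rentals this billing cycle."""
        return min(waiting_customers, key=lambda c: rentals_this_cycle[c])

    # The article's example: X rented 14 discs last month, Y rented 11,
    # so Y gets priority when both are competing for the same movie.
    rentals = {"X": 14, "Y": 11}
    assert allocate_disc(["X", "Y"], rentals) == "Y"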