Viral Tweets From Steve Wozniak and Ruby on Rails Creator Spur Investigation Into Apple Credit Card (bbc.com) 159
An anonymous reader quotes the BBC:
A US financial regulator has opened an investigation into claims Apple's credit card offered different credit limits for men and women. It follows complaints -- including from Apple's co-founder Steve Wozniak -- that algorithms used to set limits might be inherently biased against women.
New York's Department of Financial Services has contacted Goldman Sachs, which runs the Apple Card. Any discrimination, intentional or not, "violates New York law", the Department of Financial Services said. The Bloomberg news agency reported on Saturday that tech entrepreneur David Heinemeier Hansson had complained that the Apple Card gave him 20 times the credit limit that his wife got. In a tweet, Mr Hansson said the disparity was despite his wife having a better credit score. Later, Mr Wozniak, who founded Apple with Steve Jobs, tweeted that the same thing happened to him and his wife despite their having no separate bank accounts or separate assets. Banks and other lenders are increasingly using machine-learning technology to cut costs and boost loan applications. But Mr Hansson, creator of the programming tool Ruby on Rails, said it highlights how algorithms, not just people, can discriminate.
"Apple and Goldman Sachs have both accepted that they have no control over the product they sell," Hansson posted angrily on Twitter. "THE ALGORITHM is in charge now!
"All humans can do is apologize on its behalf, and pray that it has mercy on the next potential victims."
Funny shit (Score:2)
It would be funny shit if GS just cut these guys' credit limits to match their wives' and said, sorry about that, we fixed it.
Re: (Score:2)
Re: (Score:2)
Is GS not accepting married couples on the same account? That's how a rich guy can share money with his poor wife.
Re:Funny shit (Score:5, Insightful)
Except in Woz's case, his wife has exactly the same credit score as Woz. The algorithm isn't looking and saying "Steve is the rich guy in the family". As far as the algorithm knows, these are two separate and unrelated people. Instead something in the AI has decided that being male is enough to grant a larger credit limit, despite the credit evidence that both are equal.
My guess here is that the bank wanted the AI to avoid having human influenced bias, and yet the AI looked at the historical record and recreated the bias inadvertently (ie, it discovered a very large correlation in the past between being a woman and being paid less and being less likely to have a job).
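Since the thread keeps arguing about whether gender has to be an explicit input for the outcome to be skewed, here is a minimal, hedged sketch of the mechanism described above: purely synthetic data, not anyone's real model. A regression that never sees a gender column can still reproduce a historical gap, because a correlated feature like income carries it.

```python
# Hedged sketch (synthetic data, NOT Goldman's actual model): a model trained
# WITHOUT a gender column can still reproduce historical bias via proxy features.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 10_000
gender = rng.integers(0, 2, n)               # 0 = male, 1 = female (hidden from the model)
# Historical pay gap baked into the training data:
income = rng.normal(80_000 - 20_000 * gender, 10_000)
credit_score = rng.normal(720, 30, n)        # identical distribution for both groups
# Past (biased) limits were set partly from income, so the gap is in the labels too:
past_limit = 0.25 * income + 50 * (credit_score - 700) + rng.normal(0, 1_000, n)

X = np.column_stack([income, credit_score])  # note: gender is NOT a feature
model = LinearRegression().fit(X, past_limit)
pred = model.predict(X)

print("mean predicted limit, men:  ", round(pred[gender == 0].mean()))
print("mean predicted limit, women:", round(pred[gender == 1].mean()))
# Same credit-score distribution, yet predicted limits differ by group, because
# income acts as a proxy that carries the historical disparity into the model.
```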
Re:Funny shit (Score:4, Insightful)
Personal credit evidence, but that's not necessarily a good predictor for profits. You can almost certainly get better financial results using demographic data. In my opinion it's probable that demographic data shows men and women are not equal regardless of identical credit scores ... we prima facie are not equal in a myriad of ways to begin with, after all.
For instance it's possible that men use a large amount of credit in a revolving way to generate lots of profit for a credit card company ... which can then massively offset delinquency risks. If women on average mostly run up lots of debt shortly before personal bankruptcy (a tail risk hard to catch in personal credit score) that's going to hurt their credit limits with a Bayesian optimizer which can infer their gender.
Bayesian mathematics is very sexist and racist.
Re: (Score:3)
Re: (Score:2)
Woz has a job, does his wife?
Is Woz's wife's name on their home title, or is it just Woz?
Woz has a history of earning millions of dollars a year in his career, does his wife?
Simply having the same credit score, which is just one of the literally hundreds of factors considered when a creditor assesses someone's credit limit, doesn't guarantee the same credit limit.
Re: (Score:2)
Re: (Score:2)
Woz and his wife both share the income from investments and neither are getting a salary. They should be equal, no?
Re: (Score:2)
Re: (Score:2)
You don't think Woz has income? Guh?
You don't words good?
Darinbob pointed out that the Wozniaks get the majority of their income together.
Their individual income doesn't come into it because it's the same for both of them.
Re: (Score:3)
Darinbob pointed out that the Wozniaks get the majority of their income together.
Their individual income doesn't come into it because it's the same for both of them.
If neither Woz nor Mrs. Woz earn a paycheck, and they rely on Woz's earnings from his time at Apple, then it isn't "their income together", it's his and he shares it.
Consider two credit applicants - one is employed and gets dividend checks worth millions of dollars a year, the other is unemployed and lives with someone who gets dividend checks worth millions of dollars a year - do they both deserve the same credit limit?
Re: (Score:2)
Woz's stock dividend checks only have his name on them.
The fact that once Woz cashes the dividend check he puts it in a joint bank account with his wife is interesting, but that doesn't make it her income also.
Woz works as a school teacher, doesn't that generate income?
Does his wife work or not?
Re: Funny shit (Score:2)
"Your credit limit is based on a number of magic black box algorithms, the actual operating details of which are completely unknown to the public."
FTFY
Re:Funny shit (Score:4, Interesting)
There was another guy, who was advised to check his wife's credit score to find out why she had been offered about 1/20 the credit limit that he got, despite all their assets, mortgage and tax submissions being joint. When he checked, her credit score was better than his. Apple's defense then changed to - we are following standard industry practices. If that is the case, the investigation needs to go much wider than Apple/GS.
Re: Funny shit (Score:2)
Your joint accounts and tax returns, and the community blah blah state you live in, have nothing to do with this - are they on your spouse's credit report?
If it's not on your credit app, and it's not in their credit report, you don't have a checking account with GS and direct deposit, then that's all they know about you, and they certainly don't give a shit about your joint tax returns. Omg there is so much stupid on display in that twitter thread.
Two people can have the same credit score and totally different c
Re: (Score:2)
Do we know for sure if the wife actually does have the same bureau credit data and score? Credit bureau data does not have information such as assets or income or gender.
But banks can and will use additional information about income and assets to grant credit. Is Woz a customer of Goldman in wealth management or brokerage? Or some other bank/broker who has credit sharing agr
Re: (Score:2)
Except in Woz's case, his wife has exactly the same credit score as Woz. The algorithm isn't looking and saying "Steve is the rich guy in the family". As far as the algorithm knows, these are two separate and unrelated people.
Because credit-limit decisions are based 100% on a person's credit score?
Instead something in the AI has decided that being male is enough to grant a larger credit limit, despite the credit evidence that both are equal.
Horseshit, the AI decided nothing. The software that determined credit limits likely considered various factors like income, bank account balances, payment history, etc.
My guess here is that the bank wanted the AI to avoid having human influenced bias, and yet the AI looked at the historical record and recreated the bias inadvertently (ie, it discovered a very large correlation in the past between being a woman and being paid less and being less likely to have a job).
Stop guessing, and stop calling "if...then" statements "AI" - I'll bet you the credit limit engine is not given an applicant's gender - it's irrelevant. Now, I bet the decision engine did get sufficient information to realize that someone with the name and SS# o
Re: (Score:2)
Because credit-limit decisions are based 100% on a person's credit score?
While my experience with the credit card industry is in the UK, even in the US I'm going to say that's utter bullshit.
A credit score is a risk measure. It isn't an affordability measure. Credit limits must take into account affordability too.
So immediately we can see that more than just the credit score has to factor into a credit limit. No matter how bold your text is.
Re: (Score:2)
The issue here is that these are algorithms, and Goldman Sachs and Apple claim that these decisions are not made by people. Do you think the algorithm has a conditional clause for rich guys or minor internet celebrities?
Re: (Score:2)
Do you think the algorithm has a conditional clause for rich guys or minor internet celebrities?
It's better than that, they think there is an "IF/THEN" statement that considers an applicant's gender and treats women differently for checking the gender check box "female".
If Woz wants to test his hypothesis, why not submit an identical application as his wife's, but check "male" instead and see what comes out?
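For what it's worth, the flip test proposed above is easy to express in code. A minimal, hedged sketch, assuming you had some scoring function to poke at - the `score_application` callable below is a hypothetical stand-in, not any real Apple/Goldman API:

```python
# Counterfactual flip test: feed two applications that are identical except for
# the gender field and compare the outputs. `score_application` is hypothetical.
def counterfactual_gap(score_application, application: dict) -> float:
    """Difference in output when only the gender field is flipped."""
    flipped = dict(application,
                   gender="male" if application["gender"] == "female" else "female")
    return score_application(flipped) - score_application(application)

# Toy scorer that (improperly) reads the gender field, just to show a non-zero gap:
toy = lambda app: app["income"] * 0.3 * (0.5 if app["gender"] == "female" else 1.0)
print(counterfactual_gap(toy, {"income": 100_000, "gender": "female"}))  # 15000.0
```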
Re: Funny shit (Score:2)
Did she?
The sex of a person is not considered with credit. (Score:2)
Re:The sex of a person is not considered with cred (Score:5, Interesting)
What makes you think that's all a bank looks at? These days they're likely to look at your social media history, driving record, Netflix viewing habits, and anything else they can easily access to see what correlates well with good investments they've made in the past.
The obvious problem of course, is that the past, and even the present, is pretty heavily biased - and so any algorithm developed based on what was done in the past is almost certainly going to reflect those biases and, if unchallenged, make them permanent going forward. It seems to me that anyone who chooses to trust such an algorithm should have a legal obligation to understand that, and should be held every bit as accountable for that bias as if they were making the biased decisions personally.
Re: (Score:2)
Should it be the task of private citizens to take financial losses to remove biases in reality, though? In my opinion only if their customers demand it, not because government forces them to be blind to reality ... that just creates a whole lot of dishonesty and loophole-seeking behaviour.
If some group needs affirmative action to (hopefully) correct a bias and customers won't reward companies for shouldering that cost ... then let government foot the bill.
Explicit subsidies, not government imposed blinders.
Re: (Score:3)
If it's a genuine risk assessment? No. But the evidence would seem to suggest it's actually a presumed risk assessment, because the algorithm is mimicking the unjustified biases of past human risk assessors. That's just digitized discrimination, and using such a tool should pull down just as much social retribution as making those discriminatory decisions yourself.
Re: (Score:2)
But then you get to argue over whose statistics are right ... and apart from the market there's only academia to argue with.
You can't really run experiments to truly prove financially unjustified bias for something like this ... not without handing out hundreds of millions worth of credit to run it. That's what the market does, run those experiments. With far better data than academics.
So I don't think giving academia a legal voice to declare market-based algorithms discriminatory is a good idea ... they
Re: (Score:2)
What? No (Score:3)
"All humans can do is apologize on its behalf, and pray that it has mercy on the next potential victims."
What? No! You test your models, and to avoid this problem specifically, you test it with members of protected classes as inputs. Furthermore it's highly unlikely that there is deep learning going on here, how many data points to credit issuance can you have? (There may be machine learning, but not deep learning.)
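A hedged sketch of what "test it with members of protected classes as inputs" can look like in practice - nothing here reflects Goldman's actual process, and the numbers and threshold are made up:

```python
# Pre-release check: score a held-out set, then compare outcomes across a
# protected attribute that the model itself never sees. Illustrative only.
import numpy as np

def group_disparity(predicted_limits: np.ndarray, protected: np.ndarray) -> dict:
    """Mean predicted limit per group, plus the worst-case ratio between groups."""
    per_group = {g: float(predicted_limits[protected == g].mean())
                 for g in np.unique(protected)}
    return {"per_group": per_group,
            "ratio": min(per_group.values()) / max(per_group.values())}

report = group_disparity(np.array([9000.0, 8800.0, 4500.0, 4700.0]),
                         np.array(["m", "m", "f", "f"]))
if report["ratio"] < 0.8:  # made-up policy threshold
    print("disparity flagged for review:", report)
```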
Re: (Score:3)
how many data points to credit issuance can you have?
In the PBS Frontline special on AI that aired last week, they had a Chinese tech CEO describe his loan issuing app. He said that it takes into account 5000 personal features when it evaluates applications (which takes only a couple of seconds to execute). One thing that he mentioned had a strong correlation with creditworthiness was the battery charge level of the applicants' cell phones.
Perhaps the wives of these two guys had low cellphone batteries.
Re: What? No (Score:2)
Re: (Score:2)
Deep learning is characterized by depth, not width of input. "Deep" means that the model can efficiently capture lots of complicated interactions among the inputs.
I don't know about this particular algorithm, but there's a lot of interest in using deep learning for credit assessment because it can potentially make more sophisticated decisions than simpler shallow algorithms. "Your income is low, but your family has money and they've been generous with it in the past" kind of thing.
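To make the depth-vs-width distinction concrete, a minimal sketch - sklearn used as a neutral stand-in, with no claim that credit models are actually built this way:

```python
# "Deep" refers to stacked layers, not to the number of input columns. Both
# models below would accept the same handful of inputs; only the second is deep.
from sklearn.neural_network import MLPRegressor

shallow = MLPRegressor(hidden_layer_sizes=(8,))           # one hidden layer
deep = MLPRegressor(hidden_layer_sizes=(64, 64, 64, 64))  # several stacked layers
# The deep variant can represent feature interactions like "income is low, but
# regular large transfers arrive from a family account", which a shallow or
# linear model has trouble capturing.
```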
Re: (Score:2)
You try capturing 5000 data inputs about someone to sufficient depth and accuracy to make credit decisions.
A 640x480 image is one input.
Who gives a crap about the credit score? (Score:5, Insightful)
Re:Who gives a crap about the credit score? (Score:4, Interesting)
My credit score seems to be very high still, despite my refusing increases in credit limits and paying everything off promptly (and automatically). I think having the mortgage helps here. Indeed there is plenty of anecdotal evidence that occasionally not paying off the full amount on a credit card can cause your credit limit to increase.
When I was first in college, it was very difficult to apply for a credit card. You had to have a steady job, someone to co-sign, and so forth. Then sometime before I graduated the credit world flipped around, and everyone became pre-approved. Not having a job was no barrier to getting a credit card, and even if you have 5 cards maxed out you would still be offered more. The banks realized that bad credit risks were very profitable, and that in problem cases they could always just sell the debt off to a third party collection agency.
Re: (Score:2, Interesting)
After having zero debt (paid off house) I have watched mine slowly go down. It's really just a sucker score, of how much creditors will make off of you.
Yup, it's a score that goes up based on both how much debt you are in and, to a lesser extent, how well you are paying it off.
Being debt free as you are is the second worst situation possible for a credit score, only bested by not having any debt now or ever.
I found this out the hard way, never having debt until buying a house.
A person that charged up thousands of dollars and defaulted on the debt would have a vastly higher credit score than I had, due just to the fact they actually had credit at some point.
People say
Re: (Score:3)
Yep, just paid off ALL my debts, including my home, and I keep getting told by older relatives "Watch out, now your credit score will go down". They've all gone through it, including my parents.
That's pretty sick to penalize people for a lifetime of financial responsibility. But that's our credit system. You're actually rewarded for more debt slavery.
Re: (Score:2)
That's pretty sick to penalize people for a lifetime of financial responsibility. But that's our credit system. You're actually rewarded for more debt slavery.
Yeah, that's pretty weird. The debt slavers have policies specifically designed to create more debt slaves. Doesn't add up.
Re: (Score:2)
Certainly not lenders. You might not know it, but "credit score" means nothing to a lender. Your "FICO Score" is just a number for you, the consumer. If you check your credit score among all those credit score sites and apps, each one will give you a different number - easily varying +/- 200 points or more.
It turns out credit agencies have a number that's only revealed to lenders. The credit rating agencies do not reveal this and lenders are prohibited from revealing it to consumers themselves. And the actu
Re: (Score:3)
Re: (Score:2)
Which is one reason to keep using a credit card.
Mine is paid off automatically each month, so I pay no interest, but maintain the track record of going into debt and paying it off.
Income counts... (Score:2)
Rich guy and non-working wife... that gives the rich guy the higher credit limit because he has income and she doesn't. Credit evaluators have to ask for this because the reporting companies don't track employment. So, 800 credit score to 700 credit score plus income, the income prevails!
Re: (Score:3)
Exactly. This whole thing is embarrassingly stupid. Reading his original tweet it's sad how many stupid people there are on twitter. If you want to complain, complain that they don't have a joint account or that he can't get her a card on his account.
He knows this as well, but saw an opportunity to get some attention and be seen as Great Feminist Ally online. What a fucking piece of shit and it's pathetic Goldman Sachs caved in to his bullshit too.
Re: (Score:2)
Mr Wozniak, who founded Apple with Steve Jobs, tweeted that the same thing happened to him and his wife despite their having no separate bank accounts or separate assets
The double negatives might make it harder to parse, but right there in the summary it says that Woz & his wife (whose name I do not know and am not going to look up) have joint accounts and assets. You don't even have to go to the tweet (though it says the same thing).
Re: Income counts... (Score:2)
Re: (Score:2)
For credit it is usually household income though
Re: (Score:2)
I was directly addressing your statement (emphasis mine):
Reading his original tweet it's sad how many stupid people there are on twitter. If you want to complain, complain that they don't have a joint account or that he can't get her a card on his account.
I simply pointed out that they in fact do have a joint account (and additionally joint assets), as stated in the original tweet.
Re: (Score:2)
Nobody complains about higher auto or life (Score:2, Offtopic)
Re:Nobody complains about higher auto or life (Score:4, Informative)
Insurance rates for men, or any of the other things men pay more for. I guess the nonsexist thing is to have women pay as much or less for everything in life?
Except those rates are based on actuarial data that shows men are a higher risk for an accident and die younger than women. As a result, rates are adjusted for these differences in risk and return.
Re: (Score:2, Offtopic)
Re: (Score:2)
So it's also ok for this to apply to race/ethnicity?
If one racial group is predisposed to a shorter lifespan due to genetics, perhaps.
Here in America health insurance charges men the same rates as women because men are just as likely to get pregnant as women, and women are just as likely to get prostate cancer as men - it's the law, you can't charge women a different insurance rate than a man and vice-versa.
Consistency of thought (Score:2)
Re: (Score:3)
Goldman Sachs is bound by law not to disclose anything about his or his wife's application. They can't do anything but be vague.
I am under no such constraint. He makes a lot more money than she does. Instead of putting their income as joint, she put her own income. When you have a lower income, you get a lower credit limit.
This isn't a great cause for "algorithm haters" to rant and rave, the algorithm at hand here is something that's been in place since credit cards were first handed out.
This is the epitome
Re: (Score:2)
Yes, because random twitter weirdos jumping on the bandwagon are a wonderful source of cold hard facts. These people are full of shit, the bank looks at your credit history and your income to determine your credit limit. Guess which one he has that is much higher?
And now that piece of shit is tweeting all the news he's made so everyone can see what a Great Ally he is. This only encourages this type of sleazeball to try to pull fast ones like this in the future. I will be incredibly happy when he gets divor
A new poster inspired by the NRA... (Score:4, Interesting)
"Algorithms Don't Discriminate: People Do"
Since the algorithms didn't write themselves (yet?), the inherent biases and prejudices of the people who designed the algorithms were built right into them. Algorithms reflect their human creators, warts and all.
Re: (Score:2)
Algorithms also reflect the behaviors of different groups.
Without actually looking at the factors that are producing these results, you can't say there is any inappropriate bias in these credit limit decisions. Appropriate bias, by the way, would be things like bias in favor of higher income, stable employment, and so on.
Re: (Score:3, Insightful)
You are being suckered. The algorithm here is very, very simple and it's the same logic a human would apply. "When person X and person Y have a similar credit score but person X makes much more money than person Y, give person X a much higher credit limit."
This is just a bullshit excuse for luddites to rant and rave about 'algorithms'.
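Put another way, the "algorithm" being blamed may be no more exotic than the toy below - a minimal sketch of the parent's if/then logic with made-up coefficients, not any bank's real formula:

```python
# Toy credit-limit rule: with similar scores, the applicant who reports a much
# higher individual income ends up with a much higher limit. Illustrative only.
def toy_credit_limit(credit_score: int, stated_income: float) -> float:
    if credit_score < 600:
        return 500.0
    # limit scales mostly with stated income once the score clears a risk threshold
    return round(min(stated_income * 0.25, 50_000) * (credit_score / 700), -2)

print(toy_credit_limit(750, 1_000_000))  # high stated income  -> ~53,600
print(toy_credit_limit(760, 40_000))     # similar score, lower stated income -> ~10,900
```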
Re: (Score:2)
I bet you the credit limit calculation engine doesn't even consider gender, as in the data isn't passed to the engine - and you know what else likely isn't fed into the calculation engine? The fact that her millionaire husband shares his wealth equally with his unemployed spouse... I don't see that on the credit card applications I fill out, anyway.
Sex a proxy for other socio economic factors (Score:2)
"Algorithms Don't Discriminate: People Do"
Since the algorithms didn't write themselves (yet?), the inherent biases and prejudices of the people who designed the algorithms were built right into them. Algorithms reflect their human creators, warts and all.
That's the point of machine learning. The "algorithm" isn't designed by humans, rather it's "designed" by patterns found in the data. If the data is accurate and complete then machine learning may extract unexpected and nonintuitive truths regardless of our political and social beliefs.
Here's an example of incomplete: sex may be a proxy for some other set of socio economic factors. These factors may be the actual driver, sex merely highly correlated with them in today's society, and the machine learning
Re: (Score:2)
Why do bigots behave like bigots, is that what you're asking? There was SOME ancestral benefit to it, apparently, but like so many other anachronisms it has long since outlived its beneficial purpose, but bigots are slow to grasp the new reality. Anachronisms are stubborn persistent beasts, however, more resistant to change than any low-functioning autistic, even moreso when there's a residual cultural impetus to them.
Re: (Score:2)
It wasn't all of Apple's staff that was responsible for this bias, you cunt. However few it was, they don't benefit (much) from it now... it's just a bias they haven't unlearned and their algorithm reflected it, however unintentionally. I doubt there was any conscious intent behind it so no, they're not benefiting from it, especially not now, you bleeding yeast infection and useless eater.
Transparent agenda. (Score:5, Insightful)
This guy is a fucking disingenuous scumbag scurrying for attention and Ally points. What a fucking degenerate scumbag liar.
The reason he has a 20X credit limit is because he put a much larger number in for 'income' than she did. He ignores people who ask him about this because he's lying scum. Any adult of average IQ knows that your credit limit is based on both your credit score and your income.
If they don't have a joint account option, then point out that as the reason. It's not sexism, a woman making much more than her husband with a similar score would get a higher credit limit as well.
And of course the rabble on twitter eat this shit up, and scurrying little scumbag NY politicians see a chance to look good to the rabble so they launch an 'investigation'.
And Apple/Goldman Sachs can't point out what they put as their income, so all they can do is say the 'algorithm' which brings out the clueless fucking tech haters ranting about algorithms. There's nothing confusing, arbitrary, or biased about "If you make more money, you get a higher credit limit" you fucking dolts.
Twitter is an absolute cancer.
Re: (Score:2)
Does anyone know if he listed a higher income and/or assets, or if they both listed their joint income? It's a rather important piece of information that is needed to make any sense of the discussion.
(I have no idea either way, just asking.)
Re: Transparent agenda. (Score:3)
Re: (Score:2)
Do you actually know or are you assuming based on the results?
I haven't applied for a credit card in ages, but doesn't it use family income?
Re: Transparent agenda. (Score:3)
What's family, like your son applying for credit and the bank saying well ok your dad is rich so let's party?
Bank doesn't know family, they know joint account holders, but the credit is issued in whose name? The person applying for it, not some joint family entity, so even IF the bank was to make that connection when opening the credit account, both of the joint members would have to sign off on it. Whoever has the highest income should apply for the credit and then add their spouse as a joint member on the
Re: (Score:2)
Re: (Score:2)
If they want to be treated equally, apply jointly - if you insist on applying independently, expect some differences.
It's cute that they consider themselves equal in every way, including their credit scores, but the reality is there are many more differences between Mr and Mrs Woz than what they checked in their "gender" box on their credit application.
Re: (Score:2)
A quick scan on available financial documents would likely show many, many more (and larger) payments being made to Woz than Mrs. Woz, no matter what they put on a credit application.
I'm certain as truly equal partners in their marriage, they each decided to apply for credit independently (why should she have to include her husband on HER credit application?), and in doing so isolated his income from her application, resulting in her getting a lower credit limit than her husband.
Re: (Score:2, Troll)
In the Twitter thread linked to the guy doesn't mention his wife's income, but a couple of other guys say their wives earn more and got a lower limit.
The guy in the story does mention that he is on a green card and one felony away from being deported.
So where does your claim that it's income based come from?
Re: (Score:2)
Re: (Score:2)
Generally arguments of the "it stands to reason that..." style without data are not very convincing.
Re: (Score:2)
(Agreeing with you, not the person you are responding to.) Good point. That does seem to suggest that the algorithm is biased, unless anyone knows of some other reason.
I don't understand why people assume the algorithm can't be biased. I thought it was well understood that it's easy for past bias to creep into training sets for ML / AI systems. The *algorithm* isn't written in a biased way, it's trained on biased data.
Re: (Score:2)
Trolled by facts again, poor snowflakes. I'm sorry if Mr. Shapiro was rude to you, but facts really don't care about your feelings.
The outrage is based on FAITH, not data... (Score:5, Insightful)
Our outrage is based on faith — not data or science. We take it as a given, that women are the same as men, on average.
When a sexless computer is telling us otherwise, we blame its owners and creators — treating them as various churches treat other blasphemers.
But, maybe, women actually are different? They do have different hormonal make-up from men for one, and hormones do impact [breastcanc...queror.com] how we think, what we learn, and the decisions we make. They are also weaker — far less muscle — which could also affect day-to-day decisions making them different from those of a man in similar circumstances.
"THE ALGORITHM" is just a clever statistics-processor. Nobody puts an "if (sex == FEMALE) rate /= 20" into it — that'd just be bad business...
And, finally, let's admit to ourselves: both celebrities outraged over women getting lower scores, would've just snickered and shrugged, had it been the other way around. Maybe, he'd be upset about getting 1/20th of the limit of his wife, but he wouldn't be calling it "a fucking sexist program".
Re: The outrage is based on FAITH, not data... (Score:2)
This is very simple: if the law says you cannot discriminate by gender, you cannot build an algorithm that uses gender as a variable and then, when it makes a decision based on gender - however well grounded in statistics - claim you have no control over it. You can't get around the law like that.
The guy's full of himself (Score:2)
Re: (Score:2)
He's a piece of shit just posting for outrage attention. I can't wait until his wife divorces him and he, being such a Great Ally, gives her not just 50% but at least 78.2% of his wealth or whatever made up percentage they claim women are underpaid compared to men.
There is no mysterious algorithm at play. I can explain it very simply. Person's Credit Limit is Based on Income and Credit Score. See how easy that is? His stated, verified income is much higher than hers so there we have it. No mysterious black
Weapons of Math Destruction ... (Score:4, Informative)
Here is a very relevant talk, by a data scientist, about how big data + algorithms increases inequality: titled Weapons of Math Destruction [youtube.com].
Credit scores that they show you are bullshit. (Score:2)
There is over 100 points difference between the "credit score" that my credit card company shows me and what Experian shows me.
715 (credit card) to 830 (Experian).
About 18 months ago, my credit card used to show a FICO score of over 850, and almost nothing has changed in my credit history.
They are bullshit.
Does either of these reflect what they show a bank when the bank makes a hard pull of your score?
Re: (Score:3)
They are bullshit.
Does either of these reflect what they show a bank when the bank makes a hard pull of your score?
Yes, because depending on the bank and what specific scores they used, both scores can be accurate and still differ wildly.
For instance, you cited your FICO score. Except there's no one single FICO score, there are multiple types and versions of FICO scores with their own score ranges, parameters, etc., and many of these are not exposed to consumers except in response to hard pulls. See: https://www.myfico.com/credit-education/credit-scores/fico-score-versions [myfico.com]. Presumably the credit card-focused scores have
Re: Credit scores that they show you are bullshit. (Score:2)
Sounds like a crufty mess.
Everyone knows our Social Credit Scores have an effect similar to (and often backed by) Law. Perhaps the algorithms used to generate the Social Credit Scores should be public record and subject to democratic oversight.
But hey - who wants that old-fashioned "democracy" and "transparency" crap when we can have a nice, modern, privatised bureaucratic tyranny? Am I right, or am I right?
Re: (Score:2)
So What? (Score:3, Insightful)
Two of a kind... (Score:2)
Where else could you find chance concentrating in two separate, highly intuitive minds in total agreement that facts belie a truth about technology?
Amazing...
Re: (Score:2)
What's wrong with selling a pre-assembled kit?
Re: (Score:2)
Jobs was a thief and Woz a bumbling brain. They stole the Blue Box from the FOSS community, via Captain Crunch, and turned it into a product that they sold to college students for $100 each.
They stole the Blue Box from the FOSS community
How do you do that? How do you STEAL from the FOSS community? (BTW, the design of the Blue Box was not "taken" from Captain Crunch nor the imaginary, not-yet-invented FOSS community; it was a well-documented hack that involved sending a single tone of a certain pitch over a pay phone to simulate certain actions. Building a box that emitted the proper tone was a trivial piece of electronics. Captain Crunch earned his name when he "discovered" that the whist
Re: Same for autonomous vehicles (Score:3)
Autonomous cars are already safer than people so your point there is moot. If you actually studied MLAs and neural networks, you'd know the difference between how these systems are trained (supervi
Re: (Score:2)
Notably lacking: (1) evidence (2) analysis
Unwanted conclusion may be justified by the data (Score:3)
Notably lacking: (1) evidence (2) analysis
The same can be said for the notion that the algorithms are discriminating. A "politically" or "socially" unwanted conclusion may be justified by the data. If we decide that a political or social belief outweighs an accurate, data-based conclusion, that's fine - we do it all the time - but own that call to override, to set aside logic in a particular case to serve a higher moral purpose. Don't just "demonize" algorithms.
The whole point of machine analysis and data science is to discover truths that are coun
Re: Unwanted conclusion may be justified by the da (Score:2)
If you feed data with an inherent bias when training an algorithm that bias will transfer to the algorithm as well.
Re: (Score:2)
Re: (Score:3)
If you feed data with an inherent bias when training an algorithm that bias will transfer to the algorithm as well.
Yes, but at this moment there is a lack of evidence or analysis showing any such thing occurred and we are currently acting on a social belief.
Re: Unwanted conclusion may be justified by the da (Score:4, Insightful)
From TFA:
From a little upstream in this thread:
This is how accurate data can be used to create biased results. Females may indeed do worse with credit based on aggregate history, especially as they make less money than men, are more likely to be single parents raising children on a single wage, etc. But what does that have to do with the ability of Steve Wozniak's wife to make payments on credit card debt? Basing a decision on the financial health of other members of a class rather than on the individual's financial health is the biased discrimination based on a protected class that is unlawful.
Re: (Score:3)
Notably lacking: (1) evidence (2) analysis
Exactly. If the model is trained using real-world data, and that real-world data is biased, the model will replicate the bias. This is seen over and over again; because the model is a black box, you can't teach it not to be biased.
They need to create giant sets of fake data that isn't biased and train the model against that. If they can.
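For what that could look like, a minimal sketch under the stated assumption that you can write down a policy formula you actually trust (toy numbers throughout, not a claim about what any bank does):

```python
# "Train against unbiased fake data": the synthetic target comes from a chosen
# policy formula, so the model cannot inherit a historical gap from past lending
# outcomes. Whether the chosen policy itself is fair is a separate question.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 50_000
credit_score = rng.normal(710, 40, n)
verified_income = rng.lognormal(mean=11.0, sigma=0.5, size=n)

# Synthetic label: depends only on features deemed legitimate, plus noise -
# no historical outcomes, and no proxy for any protected attribute.
fair_limit = 0.2 * verified_income + 40 * (credit_score - 650) + rng.normal(0, 500, n)

X = np.column_stack([credit_score, verified_income])
model = LinearRegression().fit(X, fair_limit)
```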
Re: (Score:2)
It being true doesn't change the fact that the model is a fucking useless joke if it can't tell that a man and woman who are married should have an identical credit score, at least in a joint property state.
Yea, but X% of marriages end in divorce. Even if they have the same property, who has the better earning potential post settlement?
Re: (Score:2)
Wouldn't that be reflected in credit scores after they divorce?
From my limited experience with friends' divorces, they usually try to zero out consumer debt from assets during the divorce settlement, because outstanding consumer debt is really difficult to assign (besides things like a car or significant durable goods). Plus it's generally good for people to start off divorced without worrying about $5k in credit card debt.
And I'd guess that where there's not enough assets to pay it off, they divide is based o
Re: Same for autonomous vehicles (Score:2)
Re: (Score:2)