Sunday, April 13, 2014

Google only acquired the male parts of a startup company; and more #siliconvalleyfail

There's an article in the New Yorker about a startup company with four men and one woman ("Amy") that was acquired by Google. Google elected to hire only the four people with their male bit set.

The four men were code monkeys (er, engineers), and Amy was a UX and product designer and a co-founder who contributed tons of ideas. Apparently Google gave massive signing bonuses and salaries to the men, but did not hire her or compensate her in the acquisition.

Put yourself in this position for just a second. You helped found a company, you contributed major ideas, you got it to the point where Google decides it's worth slurping up. But then: everyone gets hired and paid except you.

Do you have any clue what that feels like? It's horrible. It's people saying: "I don't respect you because of how you were born." 

It's nearly impossible to imagine this rejection if you are a majority member. Well, I can tell you - it hurts. A lot. It's one of the deepest hurts there is.

The worst part about reading this article is that this week alone I heard stories about TWO amazing, brilliant, talented, superstar women who completely left their rockstar jobs for non-rockstar occupations.

Why did these brilliant, talented, incredible women leave their rockstar occupations? Because they couldn't handle the sexism any more. They had no fight left in them. 

What can you do? Well, sponsor and promote the heck out of the professional women you know.
1) Talk about women to others: "Jane Smith is doing AMAZING work related to yours, you should check out her papers." 
2) Invite women: "Let's invite Jane Smith as a keynote speaker, her research rocks" "Let's ask Jane Smith to lead this project, I think she'd do a fantastic job." 
3) Suggest women when you're poaching people: "Let's see if we could recruit Jane Smith to our department." 
and, if you're a journalist:
4) Interview women. Some publications do well at this; some are still in the Stone Age. There are women scientists out there, and they have opinions and interesting things to say too!

And, if you're Google: don't be evil. (Write that one down!)

Tuesday, March 25, 2014

Eight little words inculcate imposter syndrome

The great Maria Klawe, ACM Fellow, AAAS Fellow, president of Harvey Mudd, wrote a surprisingly humbling and honest article in Slate on imposter syndrome.

In some ways, this type of article is good for young women in the field, because they figure if superstars like her can feel it, they can feel it too. i.e., "It's normal to feel this way."

Except, it's not normal to feel this way.

The reason we feel like we don't belong / aren't good enough is that we've been enculturated to believe this since Day 1. The message from the media is passive pink, and young women are rarely cast as the lead scientist in film and television. The whiz computer genius in a show usually looks like this:



"That doesn't look like me. Also, he seems really unhappy. I don't belong in computer science."

Readers protest, "But it's just TV! It doesn't matter!"

But it does. This is how kids choose careers. As much as we'd like to think that our annual science outreach visit to our children's classrooms hugely influences students' future career leanings, we're talking marbles vs. the Large Hadron Collider. Hollywood is it.

So the lucky few who manage to beat the cultural odds and enter our field anyway face one more major hurdle.

It's not the intellectual requirements of the job.
It's not work-life balance.
And it's certainly not babies!

Nope. It is eight little words that skewer you. Eight little words that knock you down in one fell swoop.

Eight little words that men never hear.

"You only got here because you're a woman".

Have you ever said this to someone? Have you ever thought this and not said it?

This is an awful, awful thing to say. Why? Because underlying it is the assumption that only men can do computer science. Why on earth would you think that?

I first heard these words as an undergraduate, from someone I thought was a close friend. I felt sick to my stomach. I had never felt imposter syndrome before that point. I loved technology, I was good at understanding how it worked, and how to make it do the things I wanted it to do. Up until that point, I assumed my strong technical abilities and grades were why I had been admitted into the program. Surely not my gender!

After I felt sick, I felt mad. Really mad! Who was this joker to tell me I didn't belong here? I'll show him.

Now, I'm fortunate, because I face adversity with stubbornness. It's just my nature. But most people are not like this. They get beaten down with a stick enough times, and they head for the hills. I can completely understand that; I've had my moments.

Here's the thing. Every time you say or even think these eight words, you're beating someone with a stick. You might think it's an innocuous statement, but really what you're saying is, "Go home, dumb little girl."


Don't be a boorish bear.

Sunday, February 9, 2014

profTime();

When I lecture, I usually begin by saying something about time. This is lecture x, we're n weeks into the course, we'll cover q next week, etc. At some point, x and n start to get large, and I have this meta moment of surprise that n weeks has gone by, and I start thinking about time.

If you ever go to an academic career mentorship thingy, you will inevitably see a picture of a three-legged stool, and the legs will be labeled Research, Teaching, and Service. Then there is a whole conversation about balance, and how when you go up for tenure/promotion, the Service leg should be, say, 2 inches long, while the other two vary depending on where you are, but likely both will be very long. (The stool analogy, of course, breaks down here. Floating? Frictionless pulleys? Eh.)

Rarely do these career talks discuss the day-to-day aspects of professorial life. Some people, maybe Boice or Gilbreth, actually use(d) stopwatches to track every moment of their time. While I don't have any colleagues quite like this, I certainly detect a degree of time-optimization awareness that is directly proportional to seniority. (Up until the emeriti, at which point the trend reverses.)

What I find a bit troubling is that in the middle of that line are faculty at the associate level -- they have all the same challenges as before, but now they also carry a doubled service load. So they pretty much never have time for a shoot-the-breeze chat. Which, frankly, is just about the most fun aspect of our job -- chatting with smart people who share your nerdy interests*.

I suppose I never envisioned a life of the mind, but I certainly did imagine a less busybusybusyAlwaysbusy culture. I think this is endemic to academia; I don't believe any one style of institution or discipline plays a big role.

The good news, for any readers who are faculty n00bs, is that there is a lot of "on-the-job training," as it were. After you've taught a class a few times, your preparation time dwindles to nothing. After you've read 50 grant proposals / journal papers / grad applications / etc., you become crazy efficient at skimming and separating the cream from the cruft. And, of course, talks and posters and all that become a cinch.

Some things always take a lot of time no matter how you dice it: writing strong grant proposals, handling personnel issues, and, for me, writing bios***. Though usually you reach a point where most things are good enough. You trust your students and collaborators a bit more than when you started, and no longer need to read every word.

-----
(*) That's not to say I don't enjoy chatting with students. But sometimes you want to talk about nerdy stuff with people from your own nerd-era. E.g., even the oldest grad students in my department don't get my lame jokes, like, "Wow, that seminar felt like handshaking over a 1200 baud modem," or "Wow, that faculty meeting felt like compiling COBOL code on a PDP-11**."

(**) I'm not that old, I'm just making stuff up here. Though just last week, I overheard an undergrad saying something like, "OMG, it took 30 SECONDS to compile my program! SO LONG." And I'm just laughing.

(***) It's my BANE. And not the Christian kind, sadly. Oh, wait, that's Bale. Well, anyway, he probably has his own biographer, the lucky duck.

Monday, December 9, 2013

[#CSEdWeek] My favorite software

Happy Computer Science Education Week. I did my part! I debugged a memory leak with a pre-literate child sitting next to me, wanting to punch the meta-key in emacs. I can't remember if he realized we needed an extra * or I did, but all I can say is that if pointers are so simple a five-year-old can explain 'em, no grumbling allowed, undergrads.
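For the curious, here's a minimal sketch of the flavor of bug an extra * can fix. This is a hypothetical reconstruction in C, not the actual code from that afternoon: an out-parameter allocator that's missing one level of indirection, so the caller never sees the buffer and every allocation leaks.

    #include <stdlib.h>

    /* Hypothetical reconstruction, not our actual code: an out-parameter
     * allocator missing one level of indirection. The caller's pointer is
     * never updated, so each malloc'd block here is leaked. */
    void make_buffer_leaky(int **out, size_t n) {
        int *buf = malloc(n * sizeof *buf);
        out = &buf;   /* BUG: only reassigns the local parameter */
    }

    /* The fix is the "extra *": dereference the out parameter so the
     * caller's pointer actually receives the allocation. */
    void make_buffer_fixed(int **out, size_t n) {
        *out = malloc(n * sizeof **out);
    }

(Usage would look something like: int *p = NULL; make_buffer_fixed(&p, 16); ... free(p); -- the kind of thing a five-year-old can apparently explain.)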

Anyhow, to kick off CS Ed Week, I'd like to talk about some of my favorite programs. These are small utilities most of you have probably never heard of, but they fill me with great joy.

1. DTerm (OS X)

This is, hands down, the piece of software I have been waiting for my whole life. It's basically a "command line anywhere" sort of program. See, for some bizarre reason, OS X doesn't let you, say, create a text file right here in the Finder, like Windows or even some versions of Linux do. I have no idea why, but this was a gross oversight.

DTerm saves me tons of effort. Old way: launch a terminal window, cd tab-tab-tab-tab-tab (or drag from the Finder), touch foo.txt. New way: DTerm shortcut, touch foo.txt. Done!

2. Quicksilver (OS X)

Along those lines, Quicksilver is also super useful and has accelerated my workflow. Instead of trying to find things (which I'm terrible at), I just hit command-period, type the first two letters of the application name, email address, text file, whatever, and boom - there it is. I am so used to this now that I have to install it on new machines, or else I can't use them. (Sad but true.)

3. F.lux (OS X, Android, iOS, Windows, Linux)

This program is very clever - it dims your monitor/screen to help simulate getting ready for sleep. For someone who ends up foolishly doing work at 11pm or 5am, and who travels through way more time zones than is healthy, it's nice to give the ol' hypothalamus a break.

4. Instapaper (All OSes, web-based)

This is the best piece of software ever written. It lets you save a webpage, from anywhere, for all time (removing all the ads and annoying stuff). It's shareware, but if you give the developer $3, you can also search through your clippings. It's beautiful, well designed, and wonderful for reading lots of news/journal articles on long airplane rides.

I think that's it for now. I've been trying to be like Beki and stop using my inbox as a TODO list, but I'm still experimenting with applications for that. It's my New Year's resolution. I'll leave you with a CS Ed Week video from Obama (h/t CCC blog). Enjoy!


Monday, November 25, 2013

Little Data, Big Problem

As a computer scientist, I think about data a lot.

And as someone who is a fairly private person, I'm particularly interested in personal data. Not only my own, but everyone's. I gape at fellow customers at the store who give their phone number and zip code to the cashier without a thought. I am appalled at friends who post private information publicly - photos, geolocation data, their political affiliation, their religion, their "likes". Everything from restaurant check-ins to where they delivered their baby.

I am shocked that people purchase devices that track their physiological data 24/7, data which is automatically uploaded and shared publicly. I am stunned that people voluntarily give samples of their DNA to 23andme.

The shocking thing is that when I mention this to someone, I receive one of three responses:
1) "I don't care, I have nothing to hide".
2) "Bah. I'm honestly not that interesting."
Or:
3) "Well, I know X is evil, but it's just so darn convenient. And anyway, all my friends use X. I can't stop using it now."

Never does someone say, "Wow, FCS, you're right - this data deluge is terrifying! And that anyone with cash can buy all our data willy nilly! Yikes! We should lobby the government to regulate the personal information brokering industry."

Never. Yet at the first word of the NSA spying snafu, POOF - everyone's freaking out. But I think they're freaking out about the wrong thing.

My security friends talk about threat models. "What's the threat model?" I don't think it's the government. The government is far too monolithic, tech-unsavvy, and sequestered to pull off what we see in the Bourne movies. And there's no Machine sitting in a warehouse in Iowa continually monitoring, processing, and understanding the content of every phone call and surveillance camera feed. That's NP-hard.

The threat model is - we have no clue. Right now, any person with the means can purchase a large lot of your private data. If you use a credit card, cell phone, or ATM, ever, you're toast.

When people say, "I don't care, I have nothing to hide," I want to whack them with a #firstworldproblems foam bat. It's not the #FWP people I'm worried about. It's the most vulnerable of our society: those who are abused, those who are stalked, those who are bullied. Those who simply are not technologically savvy enough to realize they have hung not just their dirty laundry out on the clothesline, but their entire existence.

I know what Scott McNealy said. But it still pains me. I think about data a lot.

Friday, October 11, 2013

Pop Quiz: How we discuss women in STEM

I know several of you, as scientists, engineers, and thinkers, are interested in the subtle ways in which women in STEM are diminished by sexist language and behavior. Sticks and stones, perhaps, but even this stuff is critical to address if we truly want to make progress and enable a cultural shift. (See also: death by a thousand paper cuts.)

In fact, the more I think about it, the more I realize progress rests almost entirely on the shoulders of mass media. Yesterday NPR had a story about Hollywood Health and Society, which consults with writers on how to craft accurate and useful story lines about healthcare and climate change*. Turns out the majority of Americans learn about science and healthcare from fictional TV - surprise!

So, writers, you have an important job to do. You need to portray scientists as they actually are. No putdowns, no pedestals, and definitely no tropes.

*Ahem*.

Ok, ready for the pop quiz?

Part 1: Read these quotes, and list all the tropes. 

1) "For Janet Yellen, Obama’s Federal Reserve nominee, quiet patience paid off"

2) "Though he says she hasn't been a superstar economist like her husband, George Akerlof, who shared the 2001 Nobel prize, and her achievements have been overshadowed by Bernanke and former Fed chair Alan Greenspan, she is a great role model for women, because throughout she has proved her intelligence, technical expertise, creativity, and her ability to cooperate with others and work hard."

Part 2: Consider the following two Wikipedia summaries**. What's different? (Hint: check the things in red). 



Pencils down!


-------
*We need this for Computer Science. Nearly every computer whiz portrayed on television is a socially inept Caucasian man and/or a psychopathic underachiever woman. And speaking of which, I'm happy Elementary attempted to discuss P ?= NP last week, though there were some problems, as Lance points out. More importantly, why was the woman a professor at some podunk university I'd never heard of, while the man was a professor at Columbia? And all she did was teach. And, PS, sexy librarian trope.

**This is my next project. It is positively absurd how women are described on Wikipedia in comparison to men. Not just scientists - musicians, actors, artists, writers, athletes - pretty much every profession. Women quietly cooperate and have babies! Men invent things and lead.

Sunday, August 18, 2013

How do you lecture?

Computer Science Educators et al., I am curious how you lecture*:

Powerpoint?
Chalkboard?
Hacking on the fly?
Tap Dancing?

What seems to work best for you and your students?

And if you're willing to share, I'd be interested to learn the student makeup (e.g., freshmen, grad students, professionals) and the general topic of your course.


*By lecture, I am referring to the times in class when you employ direct instruction. I realize this is rarely The Best Way to Teach, but I think it can still serve a useful role. (...She says, right before she is replaced by a MOOC).