Tuesday, October 30, 2012

ND Thinks Big

This past spring, I had the chance to be one of the speakers at an event called "ND Thinks Big."  The premise of the event was a set of TED-style talks oriented around ND-centric research.  By virtue of one of the organizers being a senior in Computer Science and Engineering, I got tapped to give one of the talks.  Over the months following the talk, I had been waiting for the video to be posted to the Thinks Big webpage.  Well, it turns out that the video of my talk was actually posted to the Thinks Big YouTube channel instead.

Without further ado, I give you my ND Thinks Big talk:




PS Wow, the top of my head is a bit sparser than I had thought.  Boo, higher level camera angles.

Monday, October 1, 2012

Reviewing like a boss

Ahh, nearly done (one left) with the INFOCOM stack aside from the meta reviews due in two weeks.  I have to say that my initial impressions were correct and this was a very, very good stack.  Only one or two surprises but still, quite nice.  As an added bonus, only three of the papers were tedious reads / re-reads to make sure I understood the central premise.

At an average of, I think, close to 4k characters per review, not bad either if I do say so myself.  I attribute that largely to the quality of the papers: it is far easier to comment on good work than to puzzle over how to say something nice while encouraging the authors to do better.  If my stack is any indication of the overall quality, next year's INFOCOM should be excellent.

Saturday, September 29, 2012

Journals vs. Conferences

I am knee deep in INFOCOM reviews and like most of the TPC (Technical Program Committee), I have of course been a really good reviewer and done all of my reviews weeks in advance.  Or not.  On the plus side, I am on read number two after the initial skim where my trusty red or blue pen gets a workout.  6k+ characters for the last review is none too shabby, though it does help when I have done work in that particular area.  Anonymity fail perhaps with that one.

On a side note, I think I got blessed by the random paper assignment overlords this time.  I have to say that I am thoroughly enjoying all of the papers in my stack.  That does not mean I am giving out all accepts, but well done overall.  Though I think I drew a huge sample of TPC area leads.  Huge kudos to whoever cut the "rank this paper" field from prior years.  It was always tedious to go back through all of the reviews and rank them once you got through the pile.

Anyway, paging through the reviews, one particular thought struck me.  Given that systems / networking research tends to view itself as a bit of a "special flower" where journals are optional, does that mean that I as a reviewer for a top-tier conference need to raise the bar and require additional completeness in the paper?

For example, suppose a paper is in an area where the author (a senior researcher) has published for 3+ years.  Said author submits a paper in that same area which reads tremendously well but is fairly incremental.  Should there be any consideration for me as the reviewer to note that the author does not go the journal route and hence is unlikely to ever really "complete" the work if it does not end up in a thesis?

The simple answer is that no, I don't think it would be fair for me to take that into consideration as I think that would impose an unexpected burden on the author.  The unforeseen burden is especially keen given that nearly all papers are driven by the students and it is definitely unfair to saddle the student with the prior behavior of the adviser (though for some reason, the converse never applies, but never mind).

The more nuanced answer is that I think this is something that we as a community (networking / systems) need to deal with.  I think we long ago passed the threshold where one could get by doing "one off" papers, which might once have justified a more conference-centric view.  I don't think a reviewer can do this either as that is bound to end up disastrous.  We already have enough subjective evaluation; let's not add another item to the mix.  It is rather something to discuss at the community level, say within TCCC or a similar venue, and I think it has to become a community ethic sort of thing.

Particularly for those who care deeply that conferences are more important than journals, I think there has to be a serious answer as to when the work gets "completed."  Otherwise, it becomes yet another item to add to the criticisms of why CS will have difficulty supporting the conferences-only notion for promotion and tenure.  In a later posting, I'll add my thoughts on the article by my department chair (Kevin Bowyer) on conferences versus journals and how junior faculty should view them.  If you have not gathered from the above, I am largely in agreement, and the above is yet another reason why I think he is right when it comes to promotion / tenure.
 
Note: For anyone reading this, if you think that I did this to your INFOCOM paper, no I did not, and you are welcome to drop me a note if you want to break the veil of anonymity.  Whoot, go, go, gadget tenure :)

Sunday, September 9, 2012

Visit - Sprint at the Wireless Institute

One of the nice perks of being at Notre Dame is the wicked cool set of speakers that we have the privilege of listening to.  This past week, we hosted Bob Azzi, Sprint's senior vice president of network operations.  Unfortunately, due to obligations related to my Associate Chair hat, I had to duck out early and missed the second half of the question and answer session.

A few interesting bits from the talk and Q&A that stood out for me:
  • A certain other cell network had employed data caps expecting them to generate revenue.  In reality, users heavily capped their behavior when they hit the limit and the caps ended up not generating any substantial new revenue.  It definitely raises some interesting questions related to our WiFi offloading study, as a comment was raised about whether or not users will modify their behaviors as they get closer to the cap.  With our Cell Phone Study getting its data services from Sprint, this was not an issue due to the unlimited data plans.  It certainly would have some interesting effects for non-Sprint customers if we could monitor several in our study.  From a larger perspective, I have to think data like that would give execs at Verizon pause, as the whole shared data plan sort of thing could get ugly in a heartbeat.
  • DPI (Deep Packet Inspection) and video transcoding for the win.  An 18-20% reduction in traffic by knocking certain low-hanging video streams down from 1080p when the type of mobile device is known (non-tethered too, I assume) to be better suited to a smaller screen size.  I'll have to do a bit more digging to see what devices are involved, but it is a very smart play to re-encode said video to alleviate the last mile.
  • As academics in networking, we sometimes can forget that the real network is messy and that any sort of pico / metro cell magic is going to be awful due to the sheer scope / complexity of rolling it out for real.  Techniques like SON (Self-Optimizing Networks) are going to be essential for realizing HetNets.  Rights of way / property are powerful things indeed (i.e. how do you get power / cabling / permission to install a pico or metro cell).
  • Similarly, cell networks are not nearly as monolithic as one would imagine, often involving many agreements across a variety of networks to be able to provide full nation-wide coverage.  It definitely adds a unique complexity wrinkle to deployment that the community needs to be more mindful of.
  • Finally, as with all things technology, the technology itself is only one part; policy, particularly public policy, is a very, very important part.  We really need more engineers who can correctly inform technology policy decisions.
Hopefully we will be able to post a video on the Wireless Institute webpage in the next week or so with a recording of the core part of Azzi's talk.

Wednesday, August 29, 2012

Android Layout XML - Oh my

Despite the fact that my schedule by and large is fairly awful, I have managed to set aside at least some time each week to do a bit of research / development.  I'll avoid posting which time that is lest someone try to schedule a meeting during that block, but the project this week has been to look at finally doing a bit of Android development, particularly on my recently replaced iMac and the Nexus 7.

The process of getting everything set up was not bad, a few minor glitches here and there, but nothing too severe.  Nice to see that is the case.

Beyond getting the basic Hello World example working, my first project is to try to get a Bluetooth Serial Port Profile (SPP) listener going for the purposes of instrumenting my eBike.  The eBike or really electric motor scooter (it does 60 mph, whoot) was my sanity prize for doing our departmental accreditation efforts.  That and the family vacation to Hawaii last year.  Even then, I am not sure it was enough but lest I dwell too much on accreditation and become grumpy, I'll get back to the task at hand.

Basically, the prototype variant that I have does not have the fancy new display that the current bikes ship with, but it does have a serial TTL output.  I found a fairly neat Serial TTL to Bluetooth converter at MDFly and got it rigged up late last year.  At the time, I had been using a simple Serial TTL / USB cable and a bit of C# code on my notebook.  Unfortunately, the bike reared its prototype / test pilot nature this year and has largely been out of action, not allowing me significant riding time until this past week.  Current Motor has been wonderful about troubleshooting, and hopefully the saga of tracking down the bad BMS (Battery Management System) leads to better troubleshooting for other riders.

Now that my bike is back, I had been occasionally using a simple Bluetooth SPP application from the Play Store.  However, that is not a whole lot of fun as it does not give me GPS or a display for my bike.  I could also just settle for the simple SparkFun adapter to log things to a microSD card but again, where is the fun in not having a full display :)

The task this week, then, was to actually build a real Android app with a dash of Bluetooth.  The core of the Bluetooth work was not too bad (Google's documentation is pretty sweet; a sketch of what I am doing follows below) but holy lord of obfuscation, the layout code is abysmal.  Granted, I am a bit biased from working with WPF and the C# side of things but wow, it is pretty awful.  I thought WPF had some awful kludges (properties and the magic XAML compilation behind the scenes) but Android layout definitely takes the cake.  I am hoping the obfuscation is just a byproduct of the fact that I am using sample code but I am not terribly optimistic.
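For the curious, here is roughly the shape of the Bluetooth piece.  This is a minimal sketch rather than my actual app code: the MAC address is made up, the class and method names are mine, and error handling is elided.  It uses the standard Android RFCOMM API with the well-known SPP UUID.

```java
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
import android.bluetooth.BluetoothSocket;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.UUID;

public class BikeTelemetryReader {
    // Well-known service UUID for the Bluetooth Serial Port Profile (SPP)
    private static final UUID SPP_UUID =
            UUID.fromString("00001101-0000-1000-8000-00805F9B34FB");

    // Hypothetical MAC of the TTL-to-Bluetooth module; yours will differ
    private static final String BIKE_MAC = "00:11:22:33:44:55";

    public void readTelemetry() throws IOException {
        BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
        BluetoothDevice bike = adapter.getRemoteDevice(BIKE_MAC);

        // Open an RFCOMM channel against the SPP service on the module
        BluetoothSocket socket = bike.createRfcommSocketToServiceRecord(SPP_UUID);
        adapter.cancelDiscovery();  // discovery slows connect() way down
        socket.connect();           // blocking call; keep it off the UI thread

        BufferedReader in = new BufferedReader(
                new InputStreamReader(socket.getInputStream()));
        String line;
        while ((line = in.readLine()) != null) {
            handleRecord(line);  // one raw record off the bike's serial feed
        }
        socket.close();
    }

    private void handleRecord(String line) {
        // TODO: parse voltage / current / speed and push to the display
    }
}
```

Since connect() blocks, the whole loop lives on a background thread while the UI thread draws the dash display.
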

If anyone would like to educate me why I am off base and espouse the merits of Android's layout XML over WPF / XAML, feel free to do so.    

Thursday, August 23, 2012

SIGCOMM CellNet

As always, a bit late on the blog post but better late than never with regards to discussing various papers / highlights from SIGCOMM as well as the workshop that I attended, CellNet.

  • Positively wonderful opening note / tutorial.  The organizers (Li and Morley) asked Zoltán Turányi from Ericsson to give a nice overview of emerging problems in cellular networks.  It was a nice twist that I think served as a good foundation for the rest of the day.  I am not sure this is viable at a general conference (versus, say, a keynote) but it worked well for the workshop.
  • I was a bit less sanguine about the OpenRadio work by Sachin Katti from Stanford.  They had a wicked cool paper at the core SIGCOMM conference but it felt a bit like cognitive radio redux.  That being said, pulling off full speed 802.11n with TI + USRP goodness is pretty awesome and I have to give them serious props for that.  We will certainly have to do a bit of looking into the TI C66x DSP that their group had been using.
With that, we had a small break for coffee, though we were running quite late by that point.  It was then that we realized that HotSDN had 180 attendees (wow!).  The good news was that we got back on schedule in fairly short order for the rest of the workshop.
  • Neat paper on buffer bloat presented by Sue Moon of KAIST, who was subbing for Rhee from NC State.  The bummer was that Sprint was by far one of the worst offenders on buffer bloat.  Evidently they have an upcoming IMC paper examining tweaks to TCP to try to improve things.  There was an amusing anecdote noting that AQM (Active Queue Management) is doomed out of the gate, which I think is pretty much the community consensus.
  • Two papers by AT&T on the core of their network, primarily from a modeling perspective.  Both were invited talks, with some neat graphs on RNC / tower behavior with regards to TCP performance.
A nice standard conference lunch, though I must say, the Finlandia lunches all week were excellent.  It got a bit crowded with the whole conference there Tuesday through Thursday.  After lunch, it was time for our talk on WiFi offloading, which I can finally discuss freely on the blog now that it has dropped.  We ended up trading slots with the Stanford folks so KK's adviser could catch his talk.
  • I needed more coffee for my own talk and realized I had cut perhaps one too many slides in the interest of time.  Post-lunch + jet lag makes for a far less than impressive talk.  Evidently Shu (my student) taped it but I'm not sure I will be adding it to YouTube any time soon.
  • KK Yap had a nice bit about using multiple adapters to get better performance.  They used an underlying Linux virtual switch on Android to allow one to fuse together multiple adapters and derive better performance.  He did a nice job presenting his talk.
  • Neat work on multi-path TCP that definitely got into the deep weeds; it offered a different perspective, following the standards route for multi-path TCP rather than the somewhat more ad hoc approach of the Stanford talk.
  • The session wrapped up with a talk by Suman Banerjee about their wireless bus work.  Very cool and some interesting enterprise level problems when you have numerous buses / data plans to manage.  Definitely problems that most academics are not pondering but are likely to be quite important at a macro level.  Very impressive with how far the bus work has come from several years back when he had presented it visiting us at ND.  Well done guys.
Two final talks wrapped things up: one on modeling (not necessarily my cup of tea) and one on assisted GPS, namely how badly assisted GPS has been implemented on many devices.  Lots of low hanging fruit for vendors to fix.  The modeling talk brought up an interesting point: with 3GPP and the potential to signal back to the base station, how much should we signal back?

The workshop wrapped up with a panel discussing various issues / open items for cellular networks.  The poor panelists barely got in their slides before we started discussing each particular person's results.  The net result was that Li did not get a huge amount of time to go over his thoughts on a cellular SDN, which was a bit of a bummer, but the back and forth on the panel was excellent.

All in all, it was a nice workshop.  Quite well attended (though not bursting like HotSDN).  Well done Li and Morley!

Thursday, August 16, 2012

Reviews / TPCs


I'll sneak in a semi-related post with regards to TPCs (Technical Program Committees) that relates a bit to McKeown's comment about making the SIGCOMM tent bigger; not as big as SIGGRAPH, but about four times as big attendee-wise as it is now.

Having recently gotten back a few sub-par sets of reviews, I am feeling a storm of grouchiness with regards to the overall review process and the on-going conference versus journal debate.  Note that this is in no way related to SIGCOMM as it has been a few years since I have submitted a paper there.  We are much better at self-filtering and as of late, most of our work has not been terribly conducive to SIGCOMM either.  Though my top negative review of all time from SIGCOMM still holds a place near and dear in my heart :)

On to my thoughts / recommendations for TPCs / reviewers:
  • I had posted on this a few years back but I think it is still the case.  Conference papers are for interesting / thought-provoking work, not necessarily perfect nor complete work.  We had a workshop review two years ago where the reviewers were asking for essentially a journal-quality evaluation rather than thinking about the paper properly through the lens of the workshop / conference.  For that particular review, I joked with my students that if we had the answers at the time of submission (*cough* workshop at INFOCOM 2011 *cough*), we sure as heck would not have been submitting to some podunk workshop.
  • Having chaired a few workshops, it makes me shake my head when one gets back any peer-reviewed paper that has fewer than three reviews.  That is just a failure of the TPC co-chairs at quality control; in the worst case, one can always backfill reviews as the co-chair.  That is kind of your job.  Some might object and point out that the ultimate job is to ensure good papers, but I would argue conferences are there to encourage feedback and discussion.  Bad reviews outside of the top-tier conferences (you sort of have no choice on those) make me unlikely to submit in the future or to encourage others to do so.
  • More recently, there have been a few reviews coming back that contain only reviewer comments but no actual ratings, requiring one to click a link to actually see the ratings.  Really?  Are we not CS profs, otherwise capable of automating such things?
  • A weird review recently came in where a journal reviewer for our E2E QoS work considered the number of citations when evaluating the work.  Seriously?  I must have missed the memo where the solidity of the science depends on how many cites your conference work gets.  Then again, the reviewer clearly had a bias against all things QoS, mumbling about all that QoS work that thinks about bandwidth as a scalar quantity.  On the plus side, the reviewer's comments were so wrong that the review might inspire me to finally write the dynamics paper I had in mind to prove them wrong, as I firmly believe the Roberts work from the mid-2000's is as valid as ever when looking at large-scale aggregation characteristics of flows (i.e. VBR really does become CBR in the aggregate; a back-of-the-envelope version of the argument follows this list).  The big trick is how to get the data without being an employee of a large ISP.
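For the curious, here is the back-of-the-envelope version of the VBR-becomes-CBR argument, assuming N independent flows each with mean rate \mu and standard deviation \sigma (a simplification; real flows are correlated, which slows but does not stop the effect):

```latex
% N independent flows, each with mean rate \mu and std. deviation \sigma:
R_{\mathrm{agg}} = \sum_{i=1}^{N} R_i, \qquad
\mathbb{E}[R_{\mathrm{agg}}] = N\mu, \qquad
\mathrm{sd}(R_{\mathrm{agg}}) = \sqrt{N}\,\sigma
% so the coefficient of variation of the aggregate rate is
\frac{\mathrm{sd}(R_{\mathrm{agg}})}{\mathbb{E}[R_{\mathrm{agg}}]}
    = \frac{\sigma}{\mu\sqrt{N}} \;\longrightarrow\; 0
    \quad \text{as } N \to \infty
```

In other words, stack up enough roughly independent VBR flows and the relative variability of the aggregate shrinks like 1/sqrt(N), which is exactly the Roberts-style observation.
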
I'll wrap up with a note regarding a nice paper written by our chair about advising junior faculty on the role of conferences versus journals.  I won't steal his thunder but the basic gist is that the notion that systems researchers are special flowers that can only do conferences is not really supported by the data. From the conclusion of the article:

"Look for research questions that other people will care about and that you can see a novel way to approach. Do the best research that you are able to do. Publish your work in the best conferences that you can get your work into. Also publish in the best journals that you can get your work into. Don’t think either-or, think both-and.  ...  Don’t buy into the “only-conferences-matter, conferences-are-better-than-journals” viewpoint. ... the empirical evidence is that this view is wrong." 

Definitely food for thought.

SIGCOMM - Day 1 (Tuesday)

SIGCOMM comments are still being queued up and my CellNet impressions are perhaps a bit delayed.  I should have something later on today now that I brought my notebook to the conference rather than my iPad.

A few comments / thoughts from Day 1 of SIGCOMM 2012:

  • Wonderful keynote by Nick McKeown with regards to his experiences over his career.  He encouraged people to work on problems that industry did not care about (yet) and preferably those problems that make industry angry.  Certainly he is accomplishing that with SDN (Software Defined Networking) and Cisco.  The other interesting aspect is that he puts everything he does in the public domain to make collaborations easier, i.e. no IP (Intellectual Property) at risk means no issues in fighting over IP.  Plus, there is the bonus argument that taxpayer funding means it should be fully available / free.
  • The tail end of Nick's talk was interesting, discussing the future of SIGCOMM and pushing the notion that SIGCOMM is still pretty insular and needs more people.  He pitched that SIGCOMM should double the number of papers and aim for 2,000 attendees.  Not entirely bad, I would say.
  • There was an extremely uncomfortable moment during the first paper session.  One of the questioners (in typical SIGCOMM fashion) pointed out bluntly that the paper being presented should not have been accepted, as someone already offers the exact service described for $49 / month.  You could have heard a pin drop after the commenter made his remark.  He was most certainly right, though it is probably a bit much to expect everyone to keep track of all the various startups / efforts in the space.  Kudos to the presenter for staying quite composed during the questioning.
  • That being said, the relative insularity of SIGCOMM citations seems as prevalent as ever.  The paper in question failed to cite anything outside the typical SIGCOMM sphere of conferences (NSDI, CoNEXT, etc.).  It did seem, though, that this year's SIGCOMM was definitely broader in terms of the authors.  Quite a few of the usual suspects but not overwhelmingly so.  Stanford's wireless group was heavily represented and they have some pretty cool work (see the Day 2 comments coming up).
  • SIGCOMM definitely does student outreach right.  Very well done with N2Women (Day 2) and the student banquet.  Though with two hundred students in attendance, things probably got a bit diluted by the popularity of the European location.  A bummer, though, that none of the industry sponsors came to the student banquet.

To wrap up, holy number of analyzed streams by Conviva (Stoica / Zhang and company).  I am thinking that is a space we should just stay away from, as it is hard to compete with the 2 billion flows that Conviva analyzes.  Simply amazing, but man, either I am getting old or something, as the titles on their slides are nearly impossible to read: green text on a white background.

Sunday, August 12, 2012

SIGCOMM 2012

I am out in Helsinki this week for SIGCOMM 2012, specifically to present our paper on WiFi offloading at the CellNet 2012 workshop.  It has been just over ten years since I was last in Helsinki for ICC 2001.  I am really looking forward to the CellNet workshop as our paper is probably one of the better ones that we have done in a while in terms of the utility of the results.  Perhaps not our finest writing, but the results are quite profound for carriers when debating how much gain they will see from WiFi offloading.

As such, I am going to try to do my best to put up impressions of the best and perhaps not-so-great papers / posters / demos from the four days of workshops and sessions that I will be attending.  One oddity is Wednesday's paper on MIMO by Katabi out of MIT.  SIGCOMM seems a very odd place for such a paper and I will be curious to see how much of a systems-ish focus there is in the paper versus an EE / PHY-layer focus.

Friday, August 10, 2012

Content-Centric Networking

Just yesterday, there was an article on Xconomy, linked via Slashdot, highlighting Van Jacobson's current work on Content-Centric Networking, aka CCN (see the original CoNEXT 2009 paper here).  While I have long admired all of the interesting work that Van has done (TCP tweaks, work on DiffServ), the whole CCN bit has me shaking my head a bit.  Is there something amazing that I am missing with regards to CCN?  Perhaps I am a bit too jaded from multicast, active networking, DHTs (shudder), and the various packet caching work from a decade or so back, but I am failing to see how this is not just yet another effort doomed to fail.

While the piece does capture some of the issues with CCN and acknowledges several of the shortcomings, I think it severely underestimates the difficulty and ignores a few critical issues:

- The comments on scaling of the core are way off.  I'm a bit perplexed by the take and I assume it is just the author of the article taking some liberties.  We are not exactly strained for capacity in the core and the Olympics should serve as a pretty good reminder that we will be A-OK for some time.  That does not mean we are good at the edge, particularly the wireless edge (hopefully nobody thinks CCN offers anything in the last mile), but I view the core as a largely solved problem that boils down to one of architecture / planning.  Quality of Service (QoS) would have gotten traction a long time ago if we really were that constrained.  There is a reason why I stopped doing QoS work nearly in its entirety, though every once in a while I feel the need to wax nostalgic about unsolved problems from earlier work in QoS.

- The entrenched players are not just entrenched, they are intertwined in how the very core of the Internet works.  Not in a protocol sense but in its economic core (ads, content integrity, dynamic content, etc.).  I think researchers all too often ignore this at their own peril, though there are certainly a few communities I use as poster children for these sorts of things when illustrating the point for my students.

QoS / multicast are neat concepts just like CCN is in an abstract / academic sense but these sorts of efforts ignore fundamental economic realities.  There has to be a compelling efficiency gain or cost savings or competitive advantage to justify it.  For IPv6, it took a really, really, really long time for the address space to get scarce enough to justify it.  Back when I started grad school in the late 90's, IPv6 was just around the corner.  14 years later (oy), World IPv6 days are no longer needed as we might actually start deploying it (huge hat tip to the DoD for requiring it).

Fiber / capacity is too cheap and the existing CDNs are good enough that CCN cannot gain a serious enough advantage.  Moreover, as several astute Slashdot posters (some days a rarity) point out, the economics of ads and customized content further impair adoption.  I just don't see how this gets out of the toy-project-in-the-lab phase.  I don't view various players looking at the work / participating in workshops as a great indicator either; of course they are going to look at it.  The key is when their Director of Operations is putting serious resources into it, otherwise it is just fun academic work.

Some enterprising student should do a survey paper on what technologies succeeded / failed and their overall time to success / failure.  I have a sense that there is a corollary to Metcalfe's Law of Networking related to the inertia of entrenched players and the order of magnitude gain required to improve things.

- Integrity / validation is the anchor that is going to prevent CCN from creating that order of magnitude.  At the end of the day, you have to trust that the content you are getting is legit.  Yes, there are various tricks built into CCN that take advantage of new advances in security, but it boils down to classic PKI / digital certificates (and verifying the certificate is still good; yes, Certificate Revocation Lists) or just hoping and crossing your fingers.

The net result is a cap on the performance gains, keeping CCN from ever offering an order-of-magnitude improvement over existing CDNs, which is what it needs to be viable.  Or it becomes even more awesome to be a bad guy in the new cloud / CCN universe.  All of my spidey security senses point to this as being a really, really bad idea, and the best explanations I have heard with regards to how they make it work amount to a bunch of hand waving that again reduces back down to classical PKI / digital certificate issues.  It is not that the folks doing CCN cannot make it work, it is that the compromises made to keep it reasonably secure will severely hamper the performance gains.  Either that or I need to find some black hats to don.
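To make the "it boils down to PKI" point concrete, here is a toy sketch in plain java.security (nothing CCN-specific; the class name and content string are mine) of the sign-once, verify-everywhere path that every consumer of a cached content chunk still has to pay for:

```java
import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;

public class ContentChunkVerify {
    public static void main(String[] args) throws Exception {
        byte[] chunk = "one chunk of named content".getBytes(StandardCharsets.UTF_8);

        // Publisher side: sign the chunk once
        KeyPair publisher = KeyPairGenerator.getInstance("RSA").generateKeyPair();
        Signature signer = Signature.getInstance("SHA256withRSA");
        signer.initSign(publisher.getPrivate());
        signer.update(chunk);
        byte[] sig = signer.sign();

        // Consumer side: every cache hit, anywhere in the network, pays this
        Signature verifier = Signature.getInstance("SHA256withRSA");
        verifier.initVerify(publisher.getPublic());
        verifier.update(chunk);
        System.out.println("chunk verified: " + verifier.verify(sig));
        // ...and this assumes you already trust publisher.getPublic(), which
        // is the classic certificate chain / revocation problem all over again.
    }
}
```

The verify() itself is the cheap part; establishing that the publisher's public key is the right one, and has not been revoked, is exactly the machinery we already struggle with.
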

- Wireless medical and CCN, really?  Privacy minefield aside, a big challenge when it comes to digital information, particularly digital health information, is control / understanding of where the data resides so that risk can be properly assessed.  We already have enough trouble securing our existing data; now we are going to scatter / replicate it so that any crypto mistakes are multiplied, without the capacity to revoke / fix mistakes once the data is replicated?  I don't get the sense that a risk-averse community (health care) is going to hop on board with something like this, regardless of the magical cryptographic solutions.  There seem to be too many vectors for attack / failure, and CCN would just exacerbate them.  Again, it is not an unsolvable problem; it just seems that whatever gains you get will be wholesale wiped away or wholly excluded (e.g. low power computing) in the wireless medical device space as well.

- It trades abstraction beauty for operational complexity.  Let's just say that I would not want to be the poor sap stuck with trying to craft the SLAs (Service Level Agreements).  Those SLAs will be awful, probably something much closer to dark magic than actual solid math.  Putting on a network operations hat, I shudder at the thought of troubleshooting issues.
Am I overly grumpy about CCN?  Sure.  Should I publish this blog post without being a full professor yet?  Maybe :)  Is CCN potentially cool from an academic standpoint?  Heck yes, that is the beauty of academic work.   

But we have been down these sorts of roads before, and the amount of attention being paid to the topic seems to far exceed its potential benefit.  I firmly believe that the fundamental principles needed to make CCN secure will limit it to being nothing more than a better abstraction and, even more, the operational cost of CCN in terms of stability / troubleshooting makes it a deal breaker.

Maybe I just need to figure out a way to cache grumpiness such that I can do a local cache hit in N years to optimize things.

Faculty Job Talks - More Musing

I had meant to get to this earlier in the summer but, as always, things just keep slipping by.  I wanted to comment a bit more about the job talk, this week focusing on how one defines one's research: namely, your research and why we want to hire you. From a higher level perspective, there are two key things that I need to walk away with from your job talk on the research front.

Taking it as a given that you are delivering a good, engaging talk, here are some thoughts from an Engineering / Science perspective:

  • I need to understand where your research stands in the greater scheme of things in your field. Part of that is motivating why your problem is important, but if I am not in your area, I need to understand where your work pushes the overall research envelope. Some of the best talks at this have used 2-D or 3-D charts (they were viz people, but even non-viz people can mimic them), i.e. the two design constraints are scale and security, and my work pushes the envelope by scaling well while staying secure. It allows us on the committee to get a sense of what sort of a research community you would fit into, and gives faculty who might be interested in collaborating a sense of where your research might plug into / complement their own.
  • I need to understand what you personally did to advance the research. Yes, you may have been part of a team effort but, particularly for a place such as ND, we tend not to have huge groups of faculty in your particular area. Hence, I need to know that you can function independently outside of the context of your prior support system. Collaborative efforts are great, but what is your part? What did you uniquely contribute or what insight of yours drove the project? It is a bit of what I would call personal cheerleading but remember, we are hiring you, not your entire team (adviser, co-workers, etc.), to come join our faculty. Trust me, your adviser / co-workers will definitely understand; they most definitely want you to succeed in finding a position. Don't be overwhelming with braggadocio (that can rub people the wrong way too) but far too many people come in a bit too humble / unassuming. This is very hard for most folks, particularly as the trend is towards larger groups and interdisciplinary projects, but it is sort of a necessary skill to survive in today's academic environment.
We want to hire you, give us that motivation that seals the deal at the talk as to why *your* research is great. Help me understand what you did and how it is complementary to our department.

Friday, July 20, 2012

Faculty Job Talks - Key Mistakes

Very good link with respect to faculty job talks: http://chronicle.com/article/Grim-Job-Talks-Are-a-Buzz-Kill/132843/?cid=at&utm_source=at&utm_medium=en One of the interesting quotes from the article:
I think many faculty members view job talks the way I do: I am giddy whenever I go to one. I'm high on the ether of potential, the magic I saw in your letter of introduction, your vitae, the fascinating things you've done and the promise of what you might do. I'm already rehearsing the negotiations I'll need to have with the dean to get resources for you. So when it's time for you to give your job talk, don't let me down.
I would largely concur with this sentiment. Once we bring you to campus, out of the hundreds of other applicants, we feel your scholarship is more than enough to make the cut. I am excited to hear your job talk as this is the work that you have been passionate about for the last N years. It gives me a chance to hear about new, exciting work and hopefully find a great new colleague to collaborate with. But, although I am predisposed to be optimistic about your job talk, a bad one is devastating to your chances. Not only does it make the job of the faculty member who might support you harder (and certainly the chair's job harder), I have found more often than not that the job talk is a fairly good predictor of future success as a faculty member. It is not perfect in that good presenters can sometimes slip through, but I have fairly consistently seen a middling talk be a predictor of problems when it comes to promotion / renewal. All in all, just remember that we want to like you. We really, really do. But as the original author stated, just don't let us down.

Wednesday, July 11, 2012

Wow, holy Blogger changes Batman!

Well, who knew that there would be a significant delay between blog postings despite my best intentions? On the plus side, our accreditation effort is finally complete and I am slowly acclimating to the Associate Chair position and the new responsibilities / things to track. This time I swear, back in the saddle for real (or something).

In the meantime, before my longer posts on successful faculty candidate visits, I'll muse a bit on what we have going on the research front.  Beyond the NetSense study (which is pretty sweet), we are looking at bringing some of our work on self-optimizing networks to the fore with respect to data tonnage reduction. As of now, we are toying with the VPN agent on Android to see if we can make a reasonable approximation of libpcap on the phones and avoid the pain / suffering of bringing in a real libpcap.

Theoretically, it should be possible, which would be quite exciting as it would save us the trouble of diverting traffic to a third-party intermediate box (quite similar to what Onavo does with their "magic box"). For those unfamiliar with Onavo, think of a benevolent MITM (Man in the Middle) box that provides object de-duplication similar to what Amazon does with S3, except for a cellular data connection.
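For the curious, the trick rests on Android's VpnService API (API 14+): route the device's traffic into a tun interface that our own app owns, and every IP datagram becomes readable in user space. Below is a minimal sketch of just the capture side; the class and method names are mine, the user-consent (VpnService.prepare()) flow is elided, and a real agent must also forward packets back out (through protect()ed sockets) or the phone loses connectivity.

```java
import android.content.Intent;
import android.net.VpnService;
import android.os.ParcelFileDescriptor;

import java.io.FileInputStream;
import java.io.IOException;

public class PseudoPcapService extends VpnService {
    private ParcelFileDescriptor tun;

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // Pull the device's traffic into a tun interface that we own
        tun = new Builder()
                .setSession("pseudo-pcap")
                .addAddress("10.0.0.2", 32)  // private address for the tun device
                .addRoute("0.0.0.0", 0)      // default route: see everything
                .establish();

        new Thread(new Runnable() {
            @Override
            public void run() {
                pump();
            }
        }).start();
        return START_STICKY;
    }

    private void pump() {
        byte[] pkt = new byte[32767];
        try {
            FileInputStream in = new FileInputStream(tun.getFileDescriptor());
            while (true) {
                int n = in.read(pkt);
                if (n <= 0) continue;
                // Each read() yields one raw IP datagram, headers and all:
                // the libpcap-like view we are after, no intermediate box needed.
                logHeaders(pkt, n);
                // NB: a real agent must also forward the packet back out
                // (via protect()ed sockets) or the phone loses connectivity.
            }
        } catch (IOException e) {
            // tun closed; service shutting down
        }
    }

    private void logHeaders(byte[] buf, int len) {
        // TODO: parse IP / TCP headers into the study's trace format
    }
}
```

The open question for us is whether the user-space forwarding overhead is tolerable on current handsets, which is exactly the sort of thing we need to measure.
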

We are quite interested in gathering handset-side data with regards to video browsing / performance and pair-wise packet reconstruction is high on the list. Will keep folks posted if we can find a way to do this without the intermediate capture box.

Wednesday, April 18, 2012

Back in the saddle again

Well, time to once again try to restart the blog posting bit. A few weeks ago, we had Enrico Bertini out for a faculty interview and he pointed to the wonderful things he has done on his blog. Needless to say, I was quite impressed and somewhat ashamed for having let this blog largely go without a post for, what, a year or more now. Ouch.

Related to this, there have been a few topics as of late that have been pushing me to get back into the spirit of blog posting. In particular, the issue of communication is one that has really stuck out: human communication and the art of the proposal and the faculty interview. The inspiration to comment on these things was largely drawn from our last round of faculty candidates. Although I am technically on sabbatical this spring (sort of; you would have a tough time telling from my actual schedule), the free time has given me the opportunity to pay a bit more attention to our faculty candidates.

Beyond the fact that the bar for faculty candidates is much, much higher than when I interviewed, one of the things that struck me was that there is not a lot of advice out there on how to give a good faculty interview / talk. It especially crystallized with one of my colleagues (not at ND), previously in industry, who is now out on the interview circuit. For that person, the academic job interview was a huge shift, as the quirks of academia can be very confusing.

Now, having sat on the faculty search committee for more years than I can remember, I wanted to offer bits of advice that I have gleaned from the process: where folks who seemed to be easy shoo-ins fell short, and how other folks came in and surprised us. My plan over the next month or two is to focus first on the job talk itself and then to walk through the nuances of the interview and interactions with the chair / dean.

Having recently come out of an NSF panel that was underwhelming, I wanted to also pepper in various bits about how to write a good proposal. Plus, it will give me a bit of inspiration to get off my duff and finally post a slew of proposals and their reviews (yes, really, their reviews) to provide a few more resources for starting PIs.

Above all, the topics give me a bit more of a push to get back into semi-reasonable blog posting intervals. Who knows, there will probably be some neat technical observations coming from our cell phone study and various comments tossed in beyond that. Until then, enjoy.