Research Projects

My main research interests are in algorithm design and analysis.  Specifically, I enjoy working on online and approximation algorithms (e.g., scheduling, matching, and other combinatorial optimization problems) as well as on algorithmic game theory (e.g., inefficiency of equilibria, auction mechanisms, evolutionary game theory).  I also advise undergraduate student research that sometimes takes on a more applied or experimental nature.

For more details, see my publications and my list of research students.

Energy-aware computing.  Companies like Google, Microsoft, and Amazon all sell cloud services and maintain giant server farms.  An important question for such companies is how to use these hundreds of machines most efficiently as the request load on them changes over time.  One way to save on energy costs is to power down servers that are not needed at any given time.  Various algorithms have recently been proposed for determining how many servers one should power up or down based on the request load at that time.  We experimentally analyze these algorithms to compare their performance on real-world data sets.  Collaborator:  George Sarkar ’17.
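
To give a feel for the kind of power-up/power-down decision these algorithms make, here is a minimal Python sketch of a hypothetical threshold rule with hysteresis; it is not one of the published algorithms we evaluate, and the capacity and headroom parameters are invented for the example.

```python
# Illustrative sketch only: a naive rule that powers servers up or down as the
# request load changes.  Each server is assumed to handle `capacity` requests
# per time step, and we keep a little headroom before powering machines down
# so the count does not flap up and down.

def provision(loads, capacity=100, headroom=1):
    """Given per-step request loads, return how many servers are on each step."""
    on = 1
    schedule = []
    for load in loads:
        needed = -(-load // capacity)          # ceiling division
        if needed > on:                        # scale up immediately
            on = needed
        elif needed + headroom < on:           # scale down only with slack
            on = needed + headroom
        schedule.append(on)
    return schedule

if __name__ == "__main__":
    demo_loads = [120, 340, 800, 760, 300, 90, 40]   # made-up request counts
    print(provision(demo_loads))                      # [2, 4, 8, 8, 4, 2, 2]
```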

The online transportation problem.  We explore a problem in which n server-vertices lie in a metric space and n request-vertices arrive over time, each of which must immediately and permanently be assigned to a server-vertex.  We focus on the “egalitarian” bottleneck objective, where the goal is to minimize the maximum distance between any request and its server.  It has been demonstrated that while there are effective algorithms for the utilitarian objective (minimizing total cost) in the resource-augmentation setting where the offline adversary has half the resources, these algorithms are not effective for the egalitarian objective.  We have also worked on an experimental analysis of various matching algorithms.  Collaborators:  Barbara Anthony, Max Bender ’15, Lillie Schachter ’15, Peter Glennon ’14.
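
As a toy illustration of the setting and the bottleneck objective (not one of the algorithms we analyze), the following sketch assigns each arriving request on a line metric to the nearest still-unmatched server and reports the resulting maximum request-to-server distance.

```python
# Minimal sketch of the online bottleneck-matching setting on a line metric.
# The greedy rule below (assign each arriving request to the nearest free
# server) is for illustration only.

def greedy_bottleneck(servers, requests):
    """servers, requests: lists of points on the real line; requests arrive in
    the given order and each is irrevocably matched on arrival.  Returns the
    egalitarian (bottleneck) cost: the largest request-to-server distance."""
    free = list(servers)
    worst = 0.0
    for r in requests:
        s = min(free, key=lambda x: abs(x - r))   # nearest still-unmatched server
        free.remove(s)
        worst = max(worst, abs(s - r))
    return worst

if __name__ == "__main__":
    print(greedy_bottleneck(servers=[0, 4, 10], requests=[9, 5, 1]))
```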

Auction mechanisms for single-minded bidders.  We consider various greedy algorithms (auction mechanisms) for auctions with single-minded bidders, and we experimentally analyze their performance for the objectives of maximizing profit and maximizing social welfare.  Collaborators: Rodrigo Rogel-Perez ’17, Tyler Wood ’17.
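
A minimal sketch of one such greedy allocation rule, assuming a simple ranking of bidders by bid per item in the desired bundle; the payment rules we actually compare (e.g., critical-value payments) are omitted, and the example instance is made up.

```python
# Illustrative greedy allocation for single-minded bidders: each bidder wants a
# fixed bundle of items and bids a single value.  We sort by bid divided by
# bundle size and allocate greedily to non-conflicting bidders.

def greedy_allocate(bidders):
    """bidders: list of (name, bundle_of_items, bid).  Returns winners and welfare."""
    order = sorted(bidders, key=lambda b: b[2] / len(b[1]), reverse=True)
    taken, winners, welfare = set(), [], 0.0
    for name, bundle, bid in order:
        if taken.isdisjoint(bundle):          # no item already sold
            taken |= set(bundle)
            winners.append(name)
            welfare += bid
    return winners, welfare

if __name__ == "__main__":
    demo = [("a", {1, 2}, 10), ("b", {2, 3}, 8), ("c", {3}, 5)]
    print(greedy_allocate(demo))              # (['a', 'c'], 15.0)
```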

CamelTours.  This is an interdisciplinary project in collaboration with the Department of Anthropology.  The project emerged from a marriage of the technical and design-based expertise of computer science with the goals of community-based anthropology at Connecticut College.  CamelTours.org hosts smartphone-based, self-guided tours and allows communities to easily and automatically create them, sharing natural and cultural heritage with local communities and tourists alike.  Collaborators:  Virginia Gresham ’17, Julia Proft ’15, Amit Kinha ’13, Dillon Kerr ’14, Evan Gray ’12, Jennifer Blagg ’12, Anthony Graesch (Anthropology).

Scheduling.  We consider the classical problem of scheduling preemptible jobs, which arrive over time, on identical parallel machines.  The goal is to minimize the total completion time of the jobs.  A popular algorithm called SRPT, which always schedules the unfinished jobs with the shortest remaining processing time, is known to be 2-competitive.  This is also the best known competitive ratio for any online algorithm.  However, it is conjectured that the competitive ratio of SRPT is significantly less than 2, and even breaking the barrier of 2 is considered a significant step toward the final answer for this classical online problem.  We make progress on this open problem by showing that SRPT is 1.86-competitive.  Furthermore, we show a lower bound of 21/19 for SRPT, improving on the previously best known 12/11.  We have also considered the online scheduling problem of minimizing the total weighted completion time on identical parallel machines with preemptible jobs.  We show a new general lower bound of 21/19 ≈ 1.105 on the competitive ratio of any deterministic online algorithm for the unweighted problem and (16 − √14)/11 ≈ 1.114 for the weighted problem.  We then analyze the performance of the natural online algorithm WSRPT (Weighted Shortest Remaining Processing Time).  We show that WSRPT is 2-competitive.  We also prove a lower bound of 1.215 on the competitive ratio of WSRPT for this problem.  Collaborators:  Bo Xiong ’13, Tim Nonner, Alexander Souza.
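
The following is a small discrete-time simulation of the SRPT rule described above: at each unit time step, the (at most) m released, unfinished jobs with the shortest remaining processing time are run.  It is only meant to make the rule concrete; the job data are invented, and the actual analyses allow arbitrary (non-integer) release and processing times.

```python
# Unit-time simulation of SRPT on m identical parallel machines.

def srpt_total_completion_time(jobs, m):
    """jobs: list of (release_time, processing_time) with integer values.
    Returns the total completion time under SRPT on m identical machines."""
    remaining = [p for _, p in jobs]
    completion = [None] * len(jobs)
    t = 0
    while any(r > 0 for r in remaining):
        # jobs that have arrived and still need work, shortest remaining first
        active = [i for i, (rel, _) in enumerate(jobs)
                  if rel <= t and remaining[i] > 0]
        active.sort(key=lambda i: remaining[i])
        for i in active[:m]:                  # run up to m jobs for one unit
            remaining[i] -= 1
            if remaining[i] == 0:
                completion[i] = t + 1
        t += 1
    return sum(completion)

if __name__ == "__main__":
    demo_jobs = [(0, 3), (0, 1), (1, 2)]      # (release time, size)
    print(srpt_total_completion_time(demo_jobs, m=1))   # 1 + 3 + 6 = 10
```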

Data plan throttling.  Although only a small portion of unlimited-data-plan users experience throttling each month, it is a prominent source of complaints from users and a significant concern for mobile network operators.  We propose a simple mechanism that allows users to choose when they want their data transmission "fast" and when they want it "slow."  Users still have the same cap on total high-speed transfer before being throttled, and hence may still be subject to throttling, but now they are given some control.  We propose a basic model of payoffs and demonstrate that the proposed mechanism would be preferable to the user over the throttling policies currently in place.  We then consider the impacts that extend beyond a single user, and provide a framework for determining the aggregate effects of such a mechanism.  Collaborator:  Barbara Anthony.
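
A minimal sketch of the accounting such a mechanism implies, assuming the user marks each transfer as "fast" (which counts against the same high-speed cap) or "slow" (which never does); all rates, sizes, and cap values are invented, and this is not the payoff model from the paper.

```python
# Toy accounting for user-controlled throttling.

FAST_RATE, SLOW_RATE = 10.0, 1.0             # megabytes per second (made up)

def time_to_transfer(transfers, cap):
    """transfers: list of (megabytes, wants_fast).  Returns total time and the
    amount of the high-speed cap left at the end."""
    total_time = 0.0
    for size, wants_fast in transfers:
        if wants_fast and cap >= size:        # user opts in and the cap allows it
            cap -= size
            total_time += size / FAST_RATE
        else:                                 # slow by choice, or cap exhausted
            total_time += size / SLOW_RATE
    return total_time, cap

if __name__ == "__main__":
    demo = [(50, True), (200, False), (100, True)]
    print(time_to_transfer(demo, cap=120))    # (305.0, 70)
```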

Data stream management systems.  In continuous data analytics, such as pay-per-click applications, and in monitoring applications, such as network, financial, health, and military monitoring, hundreds of similar Aggregate Continuous Queries (ACQs) are typically registered to continuously monitor unbounded input streams of data updates.  Optimizing the processing of these ACQs is crucial for the data stream management system (DSMS) to operate at the required scale.  Weave Share, an optimizer that computes query plans for multiple ACQs in a greedy fashion, was recently shown in experiments to achieve more than an order of magnitude improvement over the best existing alternatives.  We prove that Weave Share approximates the optimal cost savings to within a factor of 4 for a practical variant of the problem.  In related work, we also consider the setting of a for-profit business selling data stream monitoring/management services, and we investigate auction-based mechanisms for admission control of continuous queries.  When submitting a query, each user also submits a bid indicating how much she is willing to pay for that query to run.  The admission control auction mechanism then determines which queries to admit and how much to charge each user in a way that maximizes system revenue while being strategyproof and sybil-immune, incentivizing users to use the system honestly.  We design several payment mechanisms and experimentally evaluate them.  We describe the provable game-theoretic characteristics of each mechanism alongside its performance with respect to maximizing profit and total user payoff.  Collaborators:  Anastasia Kurdia, Shenoda Guirguis, Lori Al Moakar, Panickos Neophytou, Kirk Pruhs, Alex Labrinidis, Panos Chrysanthis.
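
As a rough illustration of admission control under a load budget (not one of the mechanisms from this work), the sketch below greedily admits queries by bid per unit of load and simply charges each admitted user her bid; pay-your-bid is not strategyproof, which is exactly why the payment rules we study are more involved.  All loads, bids, and the capacity are invented.

```python
# Toy admission control: admit continuous queries greedily by bid density
# until a capacity budget is exhausted.

def admit_queries(queries, capacity):
    """queries: list of (name, load, bid).  Returns the admitted names and the
    revenue if each admitted user paid her bid."""
    order = sorted(queries, key=lambda q: q[2] / q[1], reverse=True)
    admitted, revenue = [], 0.0
    for name, load, bid in order:
        if load <= capacity:
            capacity -= load
            admitted.append(name)
            revenue += bid
    return admitted, revenue

if __name__ == "__main__":
    demo = [("q1", 4, 12.0), ("q2", 2, 8.0), ("q3", 5, 9.0)]
    print(admit_queries(demo, capacity=7))    # (['q2', 'q1'], 20.0)
```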

Fair auctions.  We explore the revenue capabilities of truthful, monotone (“fair”) allocation and pricing functions for resource-constrained auction mechanisms within a general framework that encompasses unlimited supply auctions, knapsack auctions, and auctions with general non-decreasing convex production cost functions.  We compare the revenue obtainable under each fair pricing scheme to the profit obtained by the ideal omniscient multi-price auction.  We show that for capacitated knapsack auctions, no constant pricing scheme can achieve any approximation to the optimal profit, but proportional pricing is as powerful as general monotone pricing.  In addition, for auction settings with arbitrary bounded non-decreasing convex production cost functions, we present a proportional pricing mechanism that achieves a poly-logarithmic approximation.  Unlike existing approaches, all of our mechanisms have fair (monotone) prices, and all of our competitive analysis is with respect to the optimal profit extraction.  Collaborators:  Katrina Ligett, Aaron Roth, Kirk Pruhs.
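
A tiny worked example of the kind of gap being measured, in the simplest (unlimited supply) corner of the framework: a single posted price is a monotone pricing rule, and we can compare its best possible revenue to the omniscient benchmark that charges every bidder her full value.  The bid values below are chosen only to make the gap visible.

```python
# Best single posted price versus the omniscient multi-price benchmark.

def best_single_price_revenue(bids):
    """Best revenue from one posted price: everyone with bid >= price buys."""
    return max(p * sum(1 for b in bids if b >= p) for p in bids)

def omniscient_revenue(bids):
    """Ideal benchmark: charge each bidder exactly her value."""
    return sum(bids)

if __name__ == "__main__":
    demo_bids = [1, 2, 4, 8, 16]
    print(best_single_price_revenue(demo_bids), omniscient_revenue(demo_bids))  # 16 31
```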

Network design and congestion games.  We study the effects of selfish behavior in the Shapley network creation game.  We provide new bounds for the price of stability for undirected graphs.  We consider the most general case, for which the best known upper bound is the Harmonic number H_n, where n is the number of agents, and the best previously known lower bound is 12/7 ≈ 1.714.  We present a nontrivial lower bound of 42/23 ≈ 1.8261.  Furthermore, we show that for two players the price of stability is exactly 4/3, while for three players it is at least 74/48 ≈ 1.542 and at most 1.65.  These are the first improvements on the bound of H_n for general networks.  In particular, this demonstrates a separation between the price of stability on undirected graphs and that on directed graphs, where H_n is tight.  Recently we have also considered what happens when one (or more) of the players is altruistic rather than selfish.  Collaborators:  Talha Mohsin ’13, Junhee Lee ’13, Giorgos Christodoulou, Katrina Ligett, Evangelia Pyrga, and Rob van Stee.
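
A minimal sketch of the cost-sharing rule that defines the game: each player selects a set of edges connecting her terminals, and each edge's cost is split equally among the players whose strategies use it.  The two-player instance below is a toy example invented for illustration, not one from the paper.

```python
# Equal (Shapley) cost sharing given each player's chosen edge set.

def shapley_shares(edge_cost, strategies):
    """edge_cost: dict edge -> cost; strategies: dict player -> set of edges.
    Returns each player's cost share under equal cost sharing."""
    users = {e: [p for p, s in strategies.items() if e in s] for e in edge_cost}
    return {p: sum(edge_cost[e] / len(users[e]) for e in s)
            for p, s in strategies.items()}

if __name__ == "__main__":
    cost = {"direct_1": 3.0, "direct_2": 3.0, "shared": 4.0}
    # Each player either buys her own direct edge or shares the common edge.
    print(shapley_shares(cost, {"p1": {"shared"}, "p2": {"shared"}}))       # 2.0 each
    print(shapley_shares(cost, {"p1": {"direct_1"}, "p2": {"direct_2"}}))   # 3.0 each
```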

Adaptive queueing at internet routers.  Congestion control at bottleneck routers on the internet is a long-standing problem.  Many policies have been proposed for effective ways to drop packets from the queues of these routers so that network endpoints will be inclined to share router capacity fairly and minimize the overflow of packets trying to enter the queues.  We study just how effective some of these queueing policies are when each network endpoint is a self-interested player with no information about the other players’ actions or preferences.  By employing the adaptive learning model of evolutionary game theory, we study policies such as Droptail, RED, and a greedy-flow-punishing policy to find the stochastically stable states: the states of the system that will be reached in the long run.  We have also worked on an experimental analysis of the various queueing policies.  Collaborators:  Shiva Lingala ’14, Evangelia Pyrga.
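
To make two of these policies concrete, here is a sketch of the per-packet drop decision of Droptail and of RED; this RED version uses the instantaneous queue length rather than the exponentially averaged one used in practice, and the threshold values are illustrative rather than tuned.

```python
# Per-packet drop decisions: Droptail drops only when the queue is full, while
# RED drops early with a probability that grows with the queue length.

import random

def droptail_drop(queue_len, capacity):
    return queue_len >= capacity

def red_drop(queue_len, min_th, max_th, max_p=0.1):
    if queue_len < min_th:
        return False
    if queue_len >= max_th:
        return True
    # drop probability rises linearly from 0 to max_p between the thresholds
    prob = max_p * (queue_len - min_th) / (max_th - min_th)
    return random.random() < prob

if __name__ == "__main__":
    random.seed(0)
    print(droptail_drop(queue_len=49, capacity=50))                 # False
    print([red_drop(q, min_th=10, max_th=40) for q in (5, 25, 45)])
```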

Price of stochastic anarchy.  We consider the evolutionary game theory solution concept of stochastic stability, and propose the price of stochastic anarchy as an alternative to the price of (Nash) anarchy for quantifying the cost of selfishness and lack of coordination in games.  As a solution concept, the Nash equilibrium has disadvantages that the set of stochastically stable states of a game avoids: unlike Nash equilibria, stochastically stable states are the result of natural dynamics of computationally bounded and decentralized agents, and are resilient to small perturbations from ideal play.  The price of stochastic anarchy can be viewed as a smoothed analysis of the price of anarchy, distinguishing equilibria that are resilient to noise from those that are not.  Collaborators:  Katrina Ligett, Aaron Roth, and Kirk Pruhs.
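
A small simulation in the spirit of the adaptive learning dynamics behind stochastic stability: players repeatedly best-respond to each other's most recent actions, but with a small mistake probability they act at random, and as that probability shrinks the long-run time spent in a state concentrates on the stochastically stable states.  The 2x2 payoff matrix below is a toy example, not a game from the paper.

```python
# Best response with mistakes in a toy 2x2 coordination game; the empirical
# state frequencies approximate the long-run behavior for each noise level.

import random

PAYOFF = {("A", "A"): (10, 10), ("B", "B"): (7, 7),
          ("A", "B"): (0, 6), ("B", "A"): (6, 0)}

def best_response(opponent_action, player):
    """Action maximizing this player's payoff against the opponent's action."""
    def value(my_action):
        profile = ((my_action, opponent_action) if player == 0
                   else (opponent_action, my_action))
        return PAYOFF[profile][player]
    return max("AB", key=value)

def long_run_frequencies(eps, steps=200_000, seed=1):
    random.seed(seed)
    state, counts = ("A", "A"), {}
    for _ in range(steps):
        mover = random.randrange(2)                   # one player revises at a time
        if random.random() < eps:
            action = random.choice("AB")              # mistake / experimentation
        else:
            action = best_response(state[1 - mover], mover)
        state = (action, state[1]) if mover == 0 else (state[0], action)
        counts[state] = counts.get(state, 0) + 1
    return {s: round(c / steps, 3) for s, c in sorted(counts.items())}

if __name__ == "__main__":
    for eps in (0.2, 0.05, 0.01):
        print(eps, long_run_frequencies(eps))
```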