Is it time to take control of your university life? You need this informative article. Have you spent days trying to find the right information about the relationship between computer science and economics? That’s exactly what this blog is all about.
JEL classification
C7, D6
Keywords
Algorithmic game theory, Implementation, Learning in games, Mechanism design, Networks
1. Introduction
Computer scientists and economists share interests in several areas of economic theory, and individuals from both groups have been working together and in parallel for approximately three decades. Interest in the interaction of computer science and economics has intensified in the last 15 years, fueled in large part by the development of large computer networks such as the Web and the Internet. The emergence of these new systems has caused a profound expansion of the questions that computer science has had to address, since these networks operate through the cooperation and competition of many participants, leading inevitably to underlying social and economic issues. At the same time, the Internet has also made possible the development of more overtly economic structures, through the creation of new kinds of markets.
The purpose of this symposium is to introduce economists to recent work in these areas. The interaction of computer science and economics has had an impact on economic theory in three ways. It has introduced new problems: novel kinds of markets, including those arising in the search industry, and new applications, including network management and routing, on-line social systems, and platforms for the production and sharing of content. It has raised new issues in areas already popular in economics, including learning, decision theory, market design, network-structured interaction, the analysis of equilibrium quality, and the computational complexity of equilibria. And it has brought new methods to existing problems, including efficient algorithms, lower bounds based on computational hardness, and techniques from discrete mathematics and graph theory.
We focus in this introduction on the topic of market design because it provides compelling examples of new problems, methods, and techniques in a fundamental economic context, and because mechanism design is at the moment the most active area of joint interest. The next section discusses mechanism design. Subsequent sections provide brief introductions to other aspects of algorithmic game theory, learning in games, and networks.
2. Mechanisms and market design
Marshallian and Walrasian equilibrium analysis are not theories of how markets function. Their institution-free approach to predicting market outcomes precludes them from asking questions such as: When do market institutions fail? How do they behave when they fail? How should markets be designed to minimize failure, and what tradeoffs with market efficiency arise in doing so? Research in economics arising from general equilibrium and welfare economics has been concentrated on market imperfections. Computer scientists have paid relatively more attention to the nuts and bolts of market mechanisms and the robustness of market institutions.
These two distinct approaches to the challenges of modeling markets at a detailed level meet in the field of mechanism design. Leo Hurwicz’s research program was a response to the Lange-Lerner-Hayek debate about the virtues of markets versus central planning. To clarify Hayek’s claim that the virtue of the market is its ability to harness widely dispersed information to achieve social goals, Hurwicz [1] defined a mechanism to be a communication system in which participants send messages to a center, and a function (or correspondence) which assigns to each profile of messages an allocation of commodities (or a set thereof). Modern mechanism design began with Hurwicz’s [2] introduction of incentive compatibility, and with Gibbard’s [3] introduction of the revelation principle for dominant strategy equilibria and its subsequent extension to Bayes-Nash equilibria. Subsequent work has addressed two general questions. The Implementation problem is concerned with objectives: Can a particular social choice function be implemented over some rich class of environments? The Design problem is concerned with perhaps more practical institutional questions, including: How do particular mechanisms (e.g. first- and second-price auctions) behave? Which auction mechanisms are optimal from a welfare-maximizing point of view, from a revenue-maximizing point of view, or from some other point of view? These two questions are not distinct. Although the early implementation theorists (e.g. Hurwicz [4], Mount and Reiter [5]) concluded that markets perform well in "neoclassical" environments, their abstract description of markets did not enable the analysis of particular methods of market organization, and they had little to say about non-neoclassical environments. These questions are the locus of most mechanism design today. Outside of neoclassical environments, there is as yet no general theory of implementation failure, and so the theory develops one problem at a time.
In extending received implementation theory, computer scientists have raised the issue of complexity in mechanism design in two different ways: computational complexity (the difficulty of computing the function that maps message profiles to allocations and of finding an equilibrium) and communication complexity (the amount of information that must be communicated by the mechanism’s participants). In practical mechanism design both are clear concerns; one needs to have confidence that the algorithm employed in an online market will reach its conclusion in a reasonable amount of time, and market participants need to be able to participate without extensive communication with the mechanism.1
Computer science has developed a range of ways to measure the efficiency of a procedure or algorithm to solve a problem, based on the resources used by the algorithm; these include the running time, the space or memory required, and the amount of communication required. This also leads to natural measures for the inherent computational complexity of an underlying problem, by considering the minimum resources required by any algorithm to solve the problem. These requirements can be evaluated in either the worst case over all possible initial conditions, or in the average case in a Bayesian environment; for some well-known algorithms, such as the simplex method for linear programming, the difference between worst case and natural measures of average-case performance can be very large.
In a completely analogous way, one can study the amount of computational or communication resources required by a mechanism, and one can define the inherent computational complexity of implementing a social choice function or correspondence as the minimum resources required by any mechanism to implement that function or correspondence. Of particular interest is the question, first raised by Nisan and Ronen [9], of whether a social choice correspondence which is efficiently computable can always be implemented by a mechanism whose outcome is efficiently computable.2 Such a result, if true, would imply that the only obstacles to achieving computational efficiency of mechanisms stem from the inherent computational complexity of the problem being solved, not from the game-theoretic challenge of implementing the solution in equilibrium. Unfortunately, the result is false, as can be seen by considering the problem of implementing social welfare maximization with quasi-linear utility in dominant-strategy equilibrium. The Vickrey-Clarke-Groves (VCG) mechanism is the unique mechanism for implementing social welfare maximization, and it requires solving an NP-hard problem (a widely adopted criterion for computational intractability) in settings with an exponentially large discrete set of alternatives, as was observed by Kfir-Dahav, Monderer, and Tennenholtz [10].
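For reference, the standard VCG mechanism (with Clarke pivot payments; the specific payment rule is not spelled out above) chooses a welfare-maximizing alternative and charges each bidder the externality she imposes on the others:

```latex
x^{*} \in \arg\max_{x \in X} \sum_{j} v_j(x),
\qquad
p_i \;=\; \max_{x \in X} \sum_{j \neq i} v_j(x) \;-\; \sum_{j \neq i} v_j(x^{*}).
```

When the set of alternatives X is exponentially large, even computing x* can be NP-hard, which is the source of the intractability noted above.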
In light of this negative result, attention naturally turned to implementing outcomes that achieve close to maximal welfare. The answer to whether the computational complexity of implementing this social choice correspondence differs significantly from the complexity of merely computing it depends on the details of the problem. In some cases there exist computationally efficient algorithms implementing an outcome whose efficiency is within a small constant factor of the maximal social welfare. In other cases, computational constraints can be arbitrarily costly for the efficiency of the outcome. An example of this latter case is the work of Papadimitriou, Schapira and Singer [11] on "combinatorial public projects", that is, selecting k out of n potential public projects to maximize the combined utility of n bidders with submodular valuations over sets of projects. An outcome achieving a 1 − 1/e fraction of the maximal social welfare can be found quite easily by polynomial-time algorithms, but they show that mechanisms implementing this social choice correspondence in dominant-strategy equilibrium cannot be computationally efficient. The same phenomenon, whereby incentive-compatibility constraints lead to an exponential blow-up in the computational resources required to solve a problem, occurs even for the seemingly innocuous problem of allocating identical, indivisible goods to a set of bidders, as shown by Dobzinski and Nisan [12].
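For comparison, the 1 − 1/e guarantee cited above is what the standard greedy heuristic for monotone submodular maximization delivers in the absence of incentive constraints. The sketch below is illustrative only; `projects` is a collection of project identifiers and `value` is a hypothetical monotone submodular set function supplied by the caller.

```python
def greedy_projects(projects, k, value):
    """Greedily select k projects to (approximately) maximize a monotone
    submodular objective `value`, which maps a set of projects to total utility.
    The classical guarantee is a 1 - 1/e fraction of the optimum."""
    chosen = set()
    for _ in range(k):
        # Add the project with the largest marginal gain in value.
        best = max(
            (p for p in projects if p not in chosen),
            key=lambda p: value(chosen | {p}) - value(chosen),
            default=None,
        )
        if best is None:
            break
        chosen.add(best)
    return chosen
```

The contrast drawn by Papadimitriou, Schapira and Singer is that no computationally efficient dominant-strategy mechanism can match this purely algorithmic guarantee.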
This strand of research illustrates a methodological innovation of computer scientists. If utility is ordinal, or cardinal but not interpersonally comparable, then implementation results are of the form "mechanism x does (or does not) achieve a Pareto optimum". If utility is cardinal and interpersonally comparable, then aggregate welfare is cardinal, and it becomes meaningful to say things like "mechanism x achieves at least 1 − 1/e of maximal social welfare". This allows for a much finer gradation of mechanisms that are not socially optimal; and since most problems do not admit optimal mechanisms with a tractable structure, the ability to discuss aggregate welfare opens up many new and interesting questions.
Communication complexity asks the question, "How much information needs to be exchanged between a set of parties who are collectively carrying out a computation?" This question has its origins in computer science [13], [14]. The following example comes from Nisan and Segal [15], a collaboration between a computer scientist and an economist. Combinatorial auctions are mechanisms designed to allocate L heterogeneous items among N bidders whose valuations for the different objects are not necessarily additive across the items. The revelation principle reduces this problem to that of studying revelation games, but full revelation of a bidder’s preferences requires transmission of a willingness to pay for each of the 2^L − 1 bundles of items. That is over 1 billion numbers for L = 30. Nisan and Segal provide results both in terms of bits and, for those cases involving continuous information, in terms of dimension. Dobzinski and Nisan provide a useful illustration of the types of methods that enable analyzing the communication requirements of mechanism design problems.
This issue of communication complexity was at the heart of the Hayek-Lange-Lerner debate, and for that reason, among many others, it should be familiar to the economics community. Hayek [16, p. 519] argued that "…the 'data' from which the economic calculus starts are never for the whole society 'given' to a single mind which could work out the implications and can never be so given". The "economic problem of society" is "a problem of the utilization of knowledge which is not given to anyone in its totality". Subsequently, Hurwicz [4] and Mount and Reiter [5] demonstrated that Hayek was correct for neoclassical environments, that is, resource allocation problems with no externalities and diffuse market power. Markets are "informationally efficient", that is, they minimize the amount of communication needed to support an optimal allocation of resources, where the quantity of communication is measured by the dimensionality of the space of signals needed to convey all necessary information. Saari and Simon [17] showed that the amount of information needed to find Walrasian prices is nonetheless large. Global Newton methods will (generically) find zeros of excess demand functions. For an (n+1)-commodity economy they require at least n² + n pieces of information: the n independent values of excess demand and their n² derivatives with respect to normalized prices. No method (in the sense that the global Newton method is a method) that works on any excess demand function can do better when n ≥ 3.
Early on in the mechanisms literature it was recognized that dominant strategy implementability was too strong: little could be implemented. The economics mechanism-design community quickly took up Bayesian implementation, following [18], Harris and Townsend [19], Holmström [20], and Myerson [21]. Computer science has considered Bayesian implementations that approximately achieve some design criterion (such as maximizing welfare or revenue). But computer scientists have also been more aggressive in looking for other, non-Bayesian alternatives to dominant strategies. Lavi and Nisan [60] is an example of this work. They investigate a dynamic mechanism design problem using what they call "set-Nash equilibrium": each player is given a recommended set of strategies and may play an arbitrary element of the recommended set, but the n-tuple of recommended strategy sets has the stability property that if every other player chooses an element of their recommended set, then the remaining player has a best response within her own recommended set. Chen and Micali [61] propose an alternate approach to weakening underlying assumptions in auctions, using a purely set-theoretic model of beliefs.
More broadly, a perspective that guides a lot of work in computer science is to achieve approximately desired outcomes under minimal assumptions about the state of the world. This perspective leads naturally to questions about alternatives to Bayesian analysis, which makes strong assumptions about distributions of problem inputs (e.g. valuations in an auction) and looks to maximize average social welfare or average seller revenue. Robustness aside, Bayesian games are limited in application to the degree they rely on common knowledge assumptions. Robert Wilson [22] has famously written, "I foresee the progress of game theory as depending on successive reductions in the base of common knowledge required to conduct useful analysis of practical problems. Only by repeated weakening of common knowledge assumptions will the theory approximate reality". Computer science perspectives can thus be viewed by economists as working in alignment with what has come to be called Wilson’s Doctrine.
One approach to mechanism design with reduced assumptions is a "prior-independent mechanism", that is, a detail-free mechanism whose designer does not need to know things like type distributions. Such results are situated halfway between average-case and worst-case analysis: for any given distribution of types, the mechanism’s performance is evaluated in the average case (that is, expected performance with respect to the a priori type distribution), but the theorem that provides the performance guarantee for the mechanism holds in the worst case over all distributions. One sees this type of goal in early examples such as Bulow and Klemperer’s [23] result that a second-price auction with n+1 i.i.d. bidders will get at least as much revenue, in expectation, as an optimal auction with n bidders from the same distribution. (Note that running a second-price auction doesn’t require any knowledge of bid distributions, so this is an example of a detail-free result.) Results in this style have been given broad generalizations by work in computer science, including results of Hartline and Roughgarden [24], and new results of Dhangwatnotai, Roughgarden and Yan [25], who consider a class of environments, including single-unit auctions, k-unit auctions and more, with private and independent bidder valuations drawn from monotone hazard rate distributions which are common knowledge among the bidders but not necessarily the same. They construct a mechanism that, regardless of the prior, guarantees an expected revenue to the seller of at least 1/2 the expected revenue achievable by the optimal mechanism for that environment.
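A quick way to see the Bulow and Klemperer phenomenon is by simulation. The sketch below is a hedged illustration, assuming uniform [0,1] valuations purely for concreteness (for that distribution the optimal auction is a second-price auction with reserve price 1/2); it compares a reserve-free second-price auction with n+1 bidders against the optimal auction with n bidders.

```python
import random

def second_price_revenue(values, reserve=0.0):
    """Revenue of a second-price auction with an optional reserve price."""
    bids = sorted(values, reverse=True)
    if not bids or bids[0] < reserve:
        return 0.0                          # no sale
    runner_up = bids[1] if len(bids) > 1 else 0.0
    return max(runner_up, reserve)          # winner pays max(second bid, reserve)

def simulate(n=5, trials=200_000):
    rng = random.Random(0)
    spa_extra_bidder, optimal = 0.0, 0.0
    for _ in range(trials):
        vals = [rng.random() for _ in range(n + 1)]             # n+1 i.i.d. U[0,1] bidders
        spa_extra_bidder += second_price_revenue(vals)          # detail-free: no reserve, no prior
        optimal += second_price_revenue(vals[:n], reserve=0.5)  # Myerson-optimal for U[0,1]
    print(f"second-price auction, {n + 1} bidders: {spa_extra_bidder / trials:.4f}")
    print(f"optimal auction, {n} bidders: {optimal / trials:.4f}")

if __name__ == "__main__":
    simulate()
```

With five bidders the extra competitor already more than compensates for the seller's ignorance of the distribution, exactly as the theorem predicts.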
An even more aggressive weakening of Bayesian assumptions takes place in the "prior-free" approach to mechanism design, which aims to provide meaningful worst-case revenue guarantees for mechanisms, i.e. non-trivial lower bounds on a mechanism’s revenue that hold for every profile of bids and not just in expectation. This approach is exemplified by the paper of Devanur et al. [62], which develops a framework for prior-free mechanism design and analysis in single-dimensional settings. They provide meaningful and non-trivial revenue lower bounds for a natural random-sampling-based mechanism by comparing its revenue on every bid profile to a benchmark value defined by optimizing over outcomes that are "envy-free" with respect to the same bid profile.
Computer science research has also studied these weakenings of assumptions to ask questions about "what is" rather than about "what is optimal". Myerson’s optimal mechanism is beautiful, and it is indeed simple when agents have identical type distributions. For non-identical distributions, it differs from the simple auction formats seen in reality (which use anonymous reserve prices, for example). Hartline and Roughgarden [24] and subsequent work have proposed an appealing quantitative argument for the prevalence of simple auctions, showing that they achieve a bounded approximation factor with respect to the revenue of optimal auctions when bidders’ types are single-dimensional and drawn from distributions satisfying a natural regularity condition. Briest et al. [63] provide a contrasting result in a domain with multi-dimensional preferences. They focus on one aspect of simple auctions, namely that their outcome is a deterministic function of the bid profile, and show how the use of randomization in certain settings can achieve unbounded gains in revenue relative to the best deterministic mechanisms. This research program, and subsequent work of Haghpanah and Hartline [26], are in a sense the inverse of the classical implementation problem: whereas implementation theorists (e.g. Maskin [27]) ask for which social choice functions there exist mechanisms that can implement them,3 this program asks in which environments a given mechanism implements, or approximately implements, a given social choice function, in this case revenue maximization.
Although the analysis of market institutions is associated with the mechanism design of the 1960s and 1970s, there has been independent interest in institutional analysis almost since the invention of game theory.4 Neoclassical analysis studies market outcomes by looking at the consequences of small (marginal) deviations and making assumptions (such as convexity) that allow for the inference of global properties from local properties. Theorists such as Gale, Shapley, Shubik and Scarf, on the other hand, foresaw that indivisibilities are not just a technical problem. Distinct from the marginal tradition of neoclassical economics, they saw the problem of optimal resource allocation as a matching problem. Gale and Shapley [29], Shapley and Scarf [30], and Shapley and Shubik [31] introduced one- and two-sided matching. This distinction becomes particularly important in two-sided matching markets, such as those matching students to schools and workers to firms, and represents an area of mechanism design that has had a significant influence on empirical economics. Matching markets and related design problems such as kidney exchange are an active area of current research; a formulation of matching problems involving multiparty contracts is considered by the paper of Hatfield and Kominers [64], who show welfare and stability results for such multilateral interactions.
3. Computation in games
Moving beyond mechanism design, computational complexity is important more generally for game theory. Recent computational work [32], [33] has identified fundamental limits on the ability of efficient algorithms to compute equilibria. These results show how to take very hard instances of computational problems and embed them in a set of strategies and payoffs for players in a game, with the consequence that if the players are able to reach an equilibrium, they would collectively produce a solution to problems that we believe to be computationally infeasible.
This suggests some inherent limits on the power of equilibrium notions: for certain difficult games, we should not expect players to be able to find an equilibrium. However, the kinds of games used to establish these results have a fragile, pathological flavor to them, based on all-or-nothing situations in which one must essentially find the "secret combination" to solve a puzzle. In contrast, games at the heart of market behavior and other forms of real-world strategic interaction seem to be smoother and more robust: despite the abundance of sub-optimal outcomes, the system tends to contain cues that guide the behavior of the participants, and forms of feedback that can select for better behavior. A broad research question is to quantify the features of games in real settings that lead to more tractable behavior. In a related direction, one can ask which natural classes of games do not suffer from the difficulty of computing equilibria. For example, Daskalakis and Papadimitriou [65] identify the class of anonymous games, where they show that a minimal relaxation of equilibrium conditions eases the difficulty of computing equilibria.
Complexity also has implications for the modeling of boundedly rational agents. One source of bounded rationality is limited computational resources. Neyman [34] and Rubinstein [35] modeled bounded rationality in finitely repeated prisoners’ dilemmas by requiring agents to employ strategies that could be implemented by finite state automata. If the size of the automata (the number of states) is sufficiently constrained, Nash equilibria in automata strategies can support cooperation. With enough states, though, Papadimitriou and Yannakakis [36] showed that finite automata can compute best responses to every partial history, and so the only Nash equilibrium in such automata strategies is to always defect. Halpern and Pass [66] provide new results on the use of costly computation as a model of bounded rationality. Rather than limiting players to computations that can be implemented using finite automata, they model fully general computation using Turing machines, and this motivates alternative ways of quantifying the cost of a computation, such as the number of random bits consumed.
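To make the automaton model concrete, here is a minimal sketch, not taken from the papers cited above, of a two-state machine playing tit-for-tat in the repeated prisoners' dilemma; the states simply record the opponent's most recent action.

```python
# A finite-state automaton strategy for the repeated prisoners' dilemma.
COOPERATE, DEFECT = "C", "D"

class TitForTat:
    """Two-state automaton: start by cooperating, then mirror the opponent."""
    def __init__(self):
        self.state = COOPERATE              # initial state

    def play(self):
        return self.state                   # output function: action = current state

    def observe(self, opponent_action):
        self.state = opponent_action        # transition function

def run(machine_a, machine_b, rounds=10):
    history = []
    for _ in range(rounds):
        a, b = machine_a.play(), machine_b.play()
        machine_a.observe(b)
        machine_b.observe(a)
        history.append((a, b))
    return history

print(run(TitForTat(), TitForTat()))        # mutual cooperation in every round
```

Strategies of this kind require only a handful of states, the constrained regime in which, as noted above, equilibria in automata strategies can support cooperation.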
4. The price of anarchy and quality of learning outcomes
The Lange-Lerner-Hayek debate had to do with the relative advantages and disadvantages of decentralized mechanisms. Although the market proves to be efficient for neoclassical environments, this is not necessarily the case in other environments. But the issue is broader than just the consequences of decentralized information: for games in general one can ask how well an equilibrium outcome does relative to the social optimum. Koutsoupias and Papadimitriou (1999) first formalized this idea in the context of Nash equilibria by computing the price of anarchy, the ratio of the welfare of the socially optimal outcome of a game to the minimum welfare of any Nash equilibrium outcome. Note that this line of research again illustrates the types of questions that can be asked when one views utilities as cardinal and interpersonally comparable, allowing aggregate measures of welfare to be evaluated.
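In symbols, for a game with welfare function W, the definition just given reads

```latex
\mathrm{PoA} \;=\; \frac{W(s^{\mathrm{OPT}})}{\min_{s \in \mathrm{NE}} W(s)} \;\ge\; 1,
```

where NE denotes the set of Nash equilibria; a price of anarchy close to 1 certifies that every equilibrium is nearly as good as the social optimum.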
The price of anarchy has proven to be a useful framework for analyzing a number of settings where self-interested behavior leads to sub-optimal outcomes. An early influential result comes from the work of Roughgarden and Tardos [37] on network congestion. Suppose that travel time on a road increases with the degree of congestion. How would one allocate drivers to different routes on the network so as to minimize average travel time? A centralized solution would assign a particular route to each driver. A decentralized solution has each individual choosing the time-minimizing route given the route choices of others. Among other results, Roughgarden and Tardos show that the total welfare of a decentralized solution with strategic participants is no worse than that of a centralized optimal solution handling twice the level of traffic.
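The classic Pigou example, a standard illustration not described above, shows how this inefficiency arises: one unit of traffic chooses between a link with constant travel time 1 and a link whose travel time equals its load x. In equilibrium all traffic takes the variable link, while the optimum splits the traffic evenly:

```latex
\text{Nash cost} \;=\; 1 \cdot 1 \;=\; 1,
\qquad
\text{Optimal cost} \;=\; \tfrac{1}{2} \cdot 1 + \tfrac{1}{2} \cdot \tfrac{1}{2} \;=\; \tfrac{3}{4},
```

a ratio of 4/3, which is in fact the worst case for affine travel-time functions, one of the bounds established by Roughgarden and Tardos [37].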
An important limitation of the price of anarchy is that it assumes that the players have successfully coordinated on a Nash equilibrium, which can be a strong assumption. However, many results on the price of anarchy turn out to be rather robust in that they extend to a form of equilibrium that players reach via learning. Many situations give rise to repeated series of strategic interactions, that is, repeated games. Routing is such a game, as are many "small-stakes" auctions, such as sponsored search auctions. In these situations it is interesting to study not only the set of equilibria, but how they are achieved when players refine their play as they learn more about other participants and even the rules of the game. In contrast to "high-stakes" games, where it makes sense to invest significant resources in learning to play well up front, such investments are not worthwhile if the stakes in any single interaction are sufficiently low. Nonetheless we expect players to learn, and the long-run consequences of learning are significant when the volume of interactions is high. Learning in games has a long history, going back to early work on fictitious play [38].5 Much of this work models Bayesian learning, in which case the relevant solution concept, Markov perfect equilibrium, is notoriously complex. Adlakha et al. [67] explore the use of an alternative stationary equilibrium solution concept; they provide conditions guaranteeing that stationary equilibria exist and closely approximate Markov perfect equilibria, and they explore the consequences for the phenomenon of "learning by doing" in an oligopoly model.
An alternative to Bayesian learning that plays an important role in the computer science literature is "no-regret learning". In a repeated game we suppose that in each stage a player observes the consequences of all his possible actions given the actual play of his opponents. The regret of an action sequence a1,…,aT of length T is the excess of the payoff from the best fixed action in hindsight over the payoff realized by the sequence. A learning algorithm is just a strategy for the repeated game: a function mapping histories of play into actions. A learning algorithm is no-regret if regret is guaranteed to grow sublinearly in T, no matter how the opponents play. No-regret algorithms exist, and they must necessarily randomize. Furthermore, no-regret learning converges to an equilibrium in zero-sum games, and more generally to a weakening of correlated equilibrium called coarse correlated equilibrium: a correlated distribution of actions such that no player would prefer to switch to any fixed action rather than play the given correlated distribution. All correlated equilibria (and hence all Nash equilibria) are coarse correlated equilibria.
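The best-known family of no-regret algorithms is multiplicative weights (also called Hedge). The sketch below assumes, as in the paragraph above, that the player observes the full vector of payoffs her actions would have earned each round; it keeps a weight per action and samples actions in proportion to those weights.

```python
import math
import random

def hedge(payoff_rounds, eta=0.1, seed=0):
    """Multiplicative-weights (Hedge) learner for a repeated game.

    `payoff_rounds` is a list of payoff vectors in [0, 1]: entry a of round t is
    the payoff the player would have received from action a in round t.  With
    eta tuned to the horizon, regret against the best fixed action grows only
    like sqrt(T log n), i.e. sublinearly in T."""
    rng = random.Random(seed)
    num_actions = len(payoff_rounds[0])
    weights = [1.0] * num_actions
    realized = 0.0
    for payoffs in payoff_rounds:
        # Sample an action with probability proportional to its weight
        # (randomization is essential for any no-regret guarantee).
        action = rng.choices(range(num_actions), weights=weights)[0]
        realized += payoffs[action]
        # Exponentially up-weight actions that did well this round.
        weights = [w * math.exp(eta * p) for w, p in zip(weights, payoffs)]
    best_fixed = max(sum(r[a] for r in payoff_rounds) for a in range(num_actions))
    return realized, best_fixed - realized   # (realized payoff, regret)
```

When every player in a game runs such an algorithm, the empirical distribution of joint play approaches the set of coarse correlated equilibria described above.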
An interesting recent development using the idea of learning as a model of game outcomes is that in many games the known price of anarchy results extend to learning outcomes, i.e., the welfare guarantee provided by the price of anarchy result applies even when no-regret-learning players have not converged to a Nash equilibrium. A compelling result on this theme is due to Roughgarden [40], who demonstrates that if a game satisfies a particular payoff condition which he calls a "smoothness condition", then price-of-anarchy results for that game extend to coarse correlated equilibria, the limits of no-regret learning. The proposed smoothness condition is satisfied by most games with known price of anarchy bounds, including the routing game and games arising in sponsored search auctions. The paper of Roughgarden and Schoppmann [68] weakens the smoothness condition to extend it to a class of games with convex strategy sets; the weaker condition still implies that price-of-anarchy results for such games extend to correlated equilibria (though not to coarse correlated equilibria).
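For reference, one standard form of the smoothness condition, stated here for cost-minimization games as in Roughgarden's framework, asks for constants λ > 0 and μ < 1 such that for every pair of outcomes s and s*,

```latex
\sum_{i} C_i(s_i^{*}, s_{-i}) \;\le\; \lambda\, C(s^{*}) \;+\; \mu\, C(s),
```

in which case the expected cost of every coarse correlated equilibrium, and hence of every limit of no-regret play, is at most λ/(1 − μ) times the optimal cost.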
5. Applied market design
Economists and computer scientists have also collaborated productively in actually designing markets and in analyzing these designed markets. A leading example is the sponsored search market. Web search companies such as Google and Yahoo! earn billions of dollars annually by selling advertising space beside the results for particular search queries. Current on-line advertising markets look very different from the advertising markets of earlier eras; whereas sales in these earlier forms of the market were determined by individual negotiations, the on-line versions involve an extremely rich structure for the items being sold, based on the attention of individual users as they seek to accomplish specific search tasks. In analyzing and designing such markets, there are opportunities to exploit connections to some of the fundamental models of matching markets that have been the subject of extensive study in both computer science and economics.
The first papers to propose and analyze a model of sponsored search were Edelman et al. [41], Varian [42], and Mehta et al. [43]. Edelman et al. and Varian show that an efficient equilibrium for basic models of sponsored search always exists in the full information setting. The paper of Caragiannis et al. [69] considers this framework for a Bayesian model of sponsored search auctions (specifically, generalized second-price auctions with uncertain quality scores), and shows a small constant bound on the price of anarchy that is robust enough to extend to the incomplete information game even with correlated types (and correlated quality scores).
Prediction markets form another broad class of markets that have become much more prevalent with powerful computational resources and the growth of the Internet [44]. A prediction market is one designed for the purpose of inducing a collective prediction from the set of participants, for example by awarding a dollar if a particular candidate is elected or a particular team wins a sporting event (e.g. [45], [46], [47]). The notion draws on an idea with a long history in economics, that prices reveal beliefs; with greater computational power and lightweight access to large groups of potential participants through the Internet, prediction markets have become correspondingly more expressive, with award conditions that now extend to complex Boolean predicates on the basic outcomes. A central goal of a prediction market is to be able to sell and buy securities guaranteeing good prediction while limiting the cost to the market maker, and developing computationally efficient pricing mechanisms has been an active focus of research in the computer science community. Running prediction markets online gives rise to some other novel issues as well. For example, good online mechanisms need to be sybilproof, meaning that individuals should not be able to benefit from submitting several reports under different pseudonyms, as such sybils are easy to generate online. Lambert et al. [70] give an axiomatic characterization of mechanisms for prediction markets satisfying a number of desirable criteria.
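A widely studied example of such a pricing mechanism, though not one singled out in the text above, is Hanson's logarithmic market scoring rule: the market maker quotes prices from a simple cost function, and the liquidity parameter b caps the market maker's worst-case loss at b times the log of the number of outcomes. A minimal sketch:

```python
import math

def lmsr_cost(quantities, b=100.0):
    """Cost function C(q) = b * log(sum_i exp(q_i / b)) of the logarithmic
    market scoring rule; q_i is the number of outstanding shares on outcome i."""
    m = max(quantities)                     # subtract the max for numerical stability
    return m + b * math.log(sum(math.exp((q - m) / b) for q in quantities))

def lmsr_prices(quantities, b=100.0):
    """Instantaneous prices; they sum to 1 and can be read as the market's
    collective probability estimate for each outcome."""
    m = max(quantities)
    exps = [math.exp((q - m) / b) for q in quantities]
    z = sum(exps)
    return [e / z for e in exps]

def trade_cost(quantities, outcome, shares, b=100.0):
    """Amount a trader pays to buy `shares` shares on `outcome`."""
    new = list(quantities)
    new[outcome] += shares
    return lmsr_cost(new, b) - lmsr_cost(quantities, b)

# Example: a two-outcome market after 150 shares have been bought on outcome 0.
q = [150.0, 0.0]
print(lmsr_prices(q))                       # roughly [0.82, 0.18]
print(trade_cost(q, 0, 10.0))               # cost of 10 more shares on outcome 0
```

Pricing complex combinatorial securities over such cost functions quickly becomes computationally demanding, which is precisely the research focus mentioned above.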
6. Networks
Networks have been a basic object in computer science since its initial stages, as the formalism of graph theory has proved to be a powerful language for modeling many of the field’s core applications beginning with some of the earliest: wiring diagrams of circuits, control flow in programs, the structure of communication and transportation networks, and then more recently the topology of the World Wide Web and large social media and on-line commerce sites. These more recent networks have increasingly consisted of interacting strategic agents, rather than components that were centrally designed by a single entity, and so it became natural for economic and game-theoretic considerations to merge increasingly with the computational considerations. Economics also began to investigate graphs as the underlying models for some of its core questions; these include the interactions of participants in a market, and questions concerning the diffusion of information, innovations, and behavior in an underlying social network.
Both fields have considered how a network’s topology can arise as the result of strategic interactions among the nodes; Shenker [48] asked how such considerations would affect the structure of the growing Internet topology, and work beginning with Jackson and Wolinsky [49] in economic theory and Fabrikant et al. [50] and [51] in the theory of computing studied the consequences of self-interested behavior in the formation of network structure.
In addition to the structure of the network itself, there has been significant interest in both computer science and economics on the processes that take place on the underlying network. The routing of traffic on networks is one such process, which as noted earlier formed one of the core settings for analyzing the price of anarchy [37]. Other active lines of work include analyses of the diffusion of information, innovations, and behaviors on social networks (Rogers [52], Blume [53], and Ellison [54]); and the outcome of buyerโseller interaction and bargaining on network structures (Kranton and Minehart [55], Corominas-Bosch [56], and Kakade et al. [57]).
The perspective of computer science is evident in recent approaches to both of these two latter issues. In the diffusion of innovations, one is led to an interesting algorithmic variation on the underlying question by considering how to intervene in it, in particular, how a centralized agent could optimally "seed" an innovation through careful selection of the starting nodes. This question was first asked by Domingos and Richardson [58]; since the optimal choice of seeds is computationally difficult and lacks any apparent tractable structure, subsequent work identified rich structural properties of near-optimal solutions that can be found through greedy optimization, by establishing a submodularity property of the diffusion process (Kempe, Kleinberg, and Tardos [59]). In the context of trading and bargaining on networks, where participants in the network may repeatedly update their offers to others, there are deep connections to distributed updating algorithms related to belief propagation; Bayati et al. [71] develop this connection to provide new results on networked bargaining.
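A minimal sketch of this greedy seeding approach is given below; the submodularity established by Kempe, Kleinberg, and Tardos is what certifies its near-optimality. The cascade model, edge probability, and Monte Carlo parameters here are illustrative assumptions, not details taken from the papers above.

```python
import random

def simulate_cascade(graph, seeds, p=0.1, rng=random):
    """One run of an independent cascade: each newly activated node gets a
    single chance to activate each neighbor with probability p.
    `graph` maps each node to a list of its neighbors."""
    active, frontier = set(seeds), list(seeds)
    while frontier:
        newly_active = []
        for u in frontier:
            for v in graph.get(u, []):
                if v not in active and rng.random() < p:
                    active.add(v)
                    newly_active.append(v)
        frontier = newly_active
    return len(active)

def expected_spread(graph, seeds, p=0.1, trials=200, seed=0):
    rng = random.Random(seed)
    return sum(simulate_cascade(graph, seeds, p, rng) for _ in range(trials)) / trials

def greedy_seeds(graph, k, p=0.1):
    """Greedy seed selection: repeatedly add the node with the largest estimated
    marginal gain in expected spread.  Submodularity of the spread function is
    what makes this simple heuristic provably near-optimal (a 1 - 1/e guarantee)."""
    seeds = set()
    for _ in range(k):
        base = expected_spread(graph, seeds, p)
        gains = {v: expected_spread(graph, seeds | {v}, p) - base
                 for v in graph if v not in seeds}
        seeds.add(max(gains, key=gains.get))
    return seeds
```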
7. Conclusion
There is something natural in the broadening interface between computer science and economics: each can be viewed as a field focused on the design and analysis of extremely complex synthetic systems that are governed by phenomena we only partially understand and only partially know how to control. And each field increasingly needs knowledge that resides in the other, as computer scientists seek to design and understand systems containing ever-increasing numbers of strategic agents, and economists seek to model the complex networks of interactions that lie between small-group environments and population-scale economic activity.
This introduction has been designed to serve as a roadmap to some of the central themes in the papers that follow. The questions that these papers address in turn point to important issues that will continue to shape the interface between computer science and economics: establishing guarantees for economic processes under increasingly general assumptions; developing methods to evaluate the performance of algorithms and mechanisms even in cases when they may not be the optimal choice; delineating the ways in which computational efficiency serves both to make certain outcomes feasible and to provide evidence for the infeasibility of others; and identifying design principles for the complex environments that arise when strategic agents come together, make decisions, and interact.
YALE Computer Science and Economics
Director of undergraduate studies: Philipp Strack (Economics), Rm. 27, 30 HLH
Computer Science and Economics (CSEC) is an interdepartmental major for students interested in the theoretical and practical connections between computer science and economics. The B.S. degree in CSEC provides students with foundational knowledge of economics, computation, and data analysis, as well as hands-on experience with empirical analysis of economic data. It prepares students for professional careers that incorporate aspects of both economics and computer science and for academic careers conducting research in the overlap of the two fields. Topics in the overlap include market design, computational finance, economics of online platforms, machine learning, and social media.
Prerequisites
Prerequisite to this major is a basic understanding of computer programming, discrete math, calculus, microeconomics, and macroeconomics. Grades of 4 or 5 on high-school AP exams in computer science, statistics, calculus, microeconomics, and macroeconomics signal adequate preparation for required courses in the CSEC major. For students who have not taken these or equivalent courses in high school, the programming prerequisite may be satisfied with CPSC 100 or CPSC 112; the discrete mathematics prerequisite may be satisfied with CPSC 202 or MATH 244; the calculus prerequisite may be satisfied with MATH 112; the microeconomics prerequisite may be satisfied with ECON 110 or ECON 115; and the macroeconomics prerequisite may be satisfied with ECON 111 or ECON 116. Other courses may suffice, and students should consult the director of undergraduate studies (DUS) and their academic advisers if they are unsure whether they have the prerequisite knowledge for a particular required course.
Requirements of the Major
The B.S. degree program requires successful completion of fourteen term courses (not including courses taken to satisfy prerequisites) and the senior project. Nine of the fourteen courses are listed below; the remaining five courses are electives. With permission of the DUS and the academic adviser, a student may substitute a more advanced course in the same area as a required course. When a substitution is made, the advanced course counts toward the nine required courses and not toward the five electives.
The required courses include CPSC 201; CPSC 223; CPSC 323; CPSC 365 or 366; ECON 121 or 125; two courses in econometrics (ECON 117 and 123 or ECON 135 and 136); ECON 351; one course in the intersection of computer science and economics (e.g., CPSC 455, ECON 417, ECON 433 or CPSC 474). With permission of the DUS, S&DS 241 and S&DS 242 may be taken instead of ECON 135.
Elective courses are essentially those courses that count as electives in the Computer Science major, the Economics major, or both. Exceptions are courses in the intersection of computer science and economics, such as CPSC 455, ECON 417, and ECON 433, which count as electives in CSEC even if they do not count in Computer Science or Economics. At least one such course is required for CSEC, and a course used to satisfy that requirement may not also be counted as an elective. ECON 122 and S&DS 365 can count as electives; ECON 159 cannot. At least two electives must be taken in the Computer Science department, and at least one must be taken in the Economics department. With the permission of the academic adviser, a student may use one or two courses in related departments that do not usually serve as electives in Computer Science or Economics as the fourth and/or fifth elective.
Credit/D/Fail: Courses taken Credit/D/Fail may not be counted toward the major.
Senior Requirement
In the senior year, each student must complete CSEC 491, a one-term independent-project course that explicitly combines techniques and subject matter from both computer science and economics. A project proposal must be approved by the student’s academic adviser and project adviser, and it must be signed by the DUS by the end of the third week of the term.
Distinction in the Major: Computer Science and Economics majors may earn Distinction in the Major if they receive grades of A or A- in at least three-quarters of their courses in the major (not including courses taken to satisfy prerequisites), and their senior-project advisers determine that their senior projects are worthy of distinction.
Advising
Approval of course schedules: Students considering the major but not yet declared should arrange to meet with the DUS during the registration period to ensure that their proposed course schedules are appropriate. Similarly, declared majors should meet with their academic advisers to ensure that they are on track to satisfy all of the requirements of the major. Course schedules must be signed by the DUS each term, and they must be approved by an academic adviser before the DUS signs them.
Transfer credit: Students who take a term abroad or take summer courses outside of Yale may petition the DUS to count at most two courses from outside Yale toward the requirements of the major. Students who take a year abroad may petition to count at most three courses. Many courses taken outside Yale do not meet the standards of the CSEC major; therefore, students should consult with their academic advisers and the DUS before taking such courses. Courses taken outside Yale may not be counted toward the major requirements in intermediate microeconomics, econometrics, or the intersection of computer science and economics.
REQUIREMENTS OF THE MAJOR
Prerequisites: basic knowledge of programming, discrete math, calculus, microeconomics, and macroeconomics, as determined by the DUS and academic advisers, as indicated
Number of courses: 14 term courses (not including prerequisites or the senior requirement)
Specific courses required: CPSC 201, 223, and 323; CPSC 365 or 366; ECON 121 or 125; ECON 117 and 123, or ECON 135 and 136; ECON 351
Distribution of courses: 1 course in the intersection of CPSC and ECON, as specified; 5 electives, as specified
Substitutions permitted: S&DS 241 and 242 may substitute for ECON 135 with DUS permission; a more advanced course in the same area may substitute for a required course with DUS and academic adviser permission
Senior requirement: CSEC 491
MS in Economics & Computer Science Career Advancement and Job Placement
According to the National Association of Colleges and Employers, over two-thirds of students who studied computing and roughly 62 percent of economics graduates received a job offer before graduation.
Make sure you leave a good impression by working with our award-winning Graduate Career Services during your job selection process. From resume review to interview preparation, our career advisors are here to help you throughout your job search.
What Can You Do with a Master of Science in Economics & Computer Science?
Graduates with a Master of Science in Economics and Computer Science can build careers in several industries, from software engineering to economic research, and can grow into a variety of roles. Have more questions about career paths with an MS in Economics and Computer Science? Request more information and find out the possibilities with a master's degree.
Job Placement Industries
Professionals with this degree regularly find employment in many of the following industries:
- Management and Technical Consulting
- Market Platform Design
- Federal Government research divisions
Job Titles & Average Salary
MS in Economics and Computer Science graduates pursue careers in many of the following roles:
- Data Scientist, Average Salary: $96,072
- Economic Analyst, Average Salary: $61,139
- Industry Economist, Average Salary: $116,630
- Policy Analyst, Average Salary: $58,828
Job Opportunities
Companies and organizations whose employees work in jobs at the intersection of economics and computer science:
- Amazon
- Airbnb
- Apple
- Department of Defense
- FBI
- Uber
- Walmart
Skills Acquired with an MS in Economics & Computer Science
The Drexel LeBow Master of Science degree in Economics and Computer Science provides a deep curricular foundation that will foster expertise in the computational and economic theories and skills needed to excel in your career. Students will have the opportunity to develop skills in key areas including:
- Research Design
- Econometrics
- Marketplace Platform Design
- Big Data Management and Analysis