Dean — Nikolay Filinov
First Deputy Dean — Irina Volkova
Deputy Dean for Academic Affairs — Gerami Viktoriya
Deputy Dean for Admissions and Student and Alumni Relations — Irina Lesovskaya
33/5 Kirpichnaya Ulitsa
Public procurement procedures prescribed by legislation not only enhance transparency and competition but also entail certain transaction costs for both customers and suppliers. These costs are important to the efficiency of the procurement system. However, very few previous studies have focused on estimating procurement costs. This paper proposes a methodology for public procurement cost evaluation. We show how procurement costs can be calculated using a formalized survey of public customers. This methodology was tested with a representative group of public customers operating in one region of the Russian Federation. We formulate the policy implications of our study as they relate to the improvement of public procurement regulations and argue that this methodological approach can be applied in other developing and transitioning economies.
For multiproduct EOQ models, the feasibility of deliveries performed by several vehicles is analyzed, factoring in vehicle capacity. It is proved that increasing the number of vehicles used simultaneously for a single delivery cannot be effective unless a discount is given on the cost of such a delivery. A necessary and sufficient condition is established that sets the threshold discount level at which deliveries by several vehicles can compete with traditional solutions. Numerical examples are presented in the article.
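The threshold behavior can be illustrated with a small numeric sketch. All numbers, and the simplified cost structure in `multi_vehicle_cost`, are assumptions for illustration, not the model derived in the paper:

```python
from math import sqrt

def eoq(demand, order_cost, holding_cost):
    """Classic EOQ lot size Q* = sqrt(2*D*K/h)."""
    return sqrt(2 * demand * order_cost / holding_cost)

def annual_cost(q, demand, order_cost, holding_cost):
    """Total annual ordering-plus-holding cost for lot size q."""
    return demand / q * order_cost + holding_cost * q / 2

def multi_vehicle_cost(n, discount, demand, order_cost, holding_cost, capacity):
    """Cost when each delivery uses n full vehicles; the per-delivery
    cost n*K is reduced by the given discount fraction."""
    q = n * capacity                       # lot size = n full vehicle loads
    k_n = n * order_cost * (1 - discount)  # discounted delivery cost
    return annual_cost(q, demand, k_n, holding_cost)

# Illustrative numbers (assumed, not from the paper)
D, K, h, V = 12000.0, 150.0, 4.0, 300.0
q_star = min(eoq(D, K, h), V)              # single-vehicle benchmark
base = annual_cost(q_star, D, K, h)
for disc in (0.0, 0.2, 0.4):
    two = multi_vehicle_cost(2, disc, D, K, h, V)
    print(f"discount={disc:.0%}: 1 vehicle {base:.0f}, 2 vehicles {two:.0f}")
```

With these numbers, two vehicles without a discount are strictly worse than the capacity-constrained single-vehicle solution, while a 20% discount already makes them cheaper, which mirrors the threshold-condition result of the abstract.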
This article presents a modification of the EOQ inventory management model that factors in the time value of money (TVM) and an advance payment for the order financed by credit. Its purpose is to give managers appropriate modifications of the EOQ formulas for such models (both for single-item and multi-item deliveries). Using these modifications increases delivery efficiency by accounting for the specifics of the corresponding cash flows when optimization takes TVM into account (under a simple-interest scheme). It is assumed that all payments are made from the revenue of the previous delivery, and a necessary and sufficient condition for the feasibility of such payments is established. The interest rate used to account for TVM reflects the return on working capital of the modeled supply chain and is determined within the model. It is noted that optimizing models of this type is associated with a synergetic effect of increasing the return on working capital.
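A minimal sketch of how a simple-interest TVM adjustment changes the EOQ recommendation; the cost structure, parameter names, and numbers here are illustrative simplifications, not the exact formulas derived in the article:

```python
from math import sqrt

def eoq_tvm(demand, order_cost, holding_cost, unit_price, rate, prepay_time=0.0):
    """EOQ adjusted for the time value of money under simple interest.
    Capital tied up in stock forgoes a return, so an opportunity cost
    rate*unit_price is added to the physical holding cost; an advance
    payment made prepay_time (years) before delivery adds interest on
    the whole annual purchase volume. Illustrative simplification only."""
    h_eff = holding_cost + rate * unit_price
    q = sqrt(2 * demand * order_cost / h_eff)
    cost = (demand / q * order_cost + h_eff * q / 2
            + rate * unit_price * demand * prepay_time)
    return q, cost

q0, c0 = eoq_tvm(5000, 100, 2.0, 20.0, rate=0.0)
q1, c1 = eoq_tvm(5000, 100, 2.0, 20.0, rate=0.15, prepay_time=10 / 365)
print(f"no TVM: Q*={q0:.0f}; with TVM and prepayment: Q*={q1:.0f}")
```

Accounting for the opportunity cost of capital shrinks the optimal lot size, which is the qualitative effect such modifications capture.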
The development of the Russian electric power industry in recent years has been characterized by a multitude of problems and a decline in a number of performance indicators. This dissatisfies consumers and encourages them to implement various measures to reduce the risks and costs of energy supply, creating preconditions for the emergence of "active" consumers in the domestic electric power industry. Given this trend, it would be appropriate to switch from Supply Side Management to Demand Side Management. This will require a wide range of measures addressing strategic issues of industry development, the legal framework, and the transition to a customer-centric market model.
The generality of the synergetic principles governing autowave self-organization in active media makes it possible to apply a model based on a modified system of FitzHugh-Nagumo equations, originally developed for evolving physicochemical and biophysical systems, to describe the spatiotemporal behavior of the stock market, including one of its most widely used patterns, propagating Elliott waves.
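For reference, the point (non-spatial) FitzHugh-Nagumo system can be integrated in a few lines; the textbook parameter values below are assumptions, not those of the modified model in the abstract, and the spatially extended version that produces autowaves would add a diffusion term:

```python
def fitzhugh_nagumo(v0=-1.0, w0=1.0, I=0.5, a=0.7, b=0.8, eps=0.08,
                    dt=0.01, steps=50000):
    """Euler integration of the FitzHugh-Nagumo system:
        dv/dt = v - v**3/3 - w + I
        dw/dt = eps * (v + a - b*w)
    Returns the trace of the fast variable v."""
    v, w = v0, w0
    trace = []
    for _ in range(steps):
        dv = v - v ** 3 / 3 - w + I
        dw = eps * (v + a - b * w)
        v, w = v + dt * dv, w + dt * dw
        trace.append(v)
    return trace

trace = fitzhugh_nagumo()
print(f"v oscillates between {min(trace):.2f} and {max(trace):.2f}")
```

With a constant drive in the oscillatory regime the fast variable settles onto relaxation oscillations, the elementary building block of the wave patterns the abstract maps onto market behavior.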
Significant levels of total atmospheric emissions make the ecological modernisation of enterprises a very important and complex issue for the Republic of Armenia (RA). An agent-based model was developed to determine the best trade-offs for the ecological modernisation of enterprises. The aim is to solve a bi-objective optimisation problem whose objectives are the ‘Integrated Volume of Total Emissions’ and the ‘Integrated Index of Industrial Production’. The results indicate that total atmospheric emissions can be reduced by more than 20% over a ten-year period while maintaining the positive dynamics of industrial production, by choosing trade-offs within the frame of the ‘Pareto-optimal ecological modernisation’ scenario. The scenario was obtained with the help of the suggested genetic algorithm, modified for the problem of binary control of the transitions of each enterprise from its initial non-ecological state towards the target state of ecologically pure manufacturing.
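A toy sketch of a genetic algorithm over binary modernisation plans, with a scalarized (weighted-sum) trade-off in place of the paper's full bi-objective Pareto search; the enterprise data, fitness weighting, and all parameters are hypothetical:

```python
import random

random.seed(1)

# Hypothetical per-enterprise data: (emissions cut, production loss)
# if the enterprise transitions to the ecologically pure state.
enterprises = [(random.uniform(1, 10), random.uniform(0, 5)) for _ in range(20)]

def objectives(bits):
    """(total emissions cut, total production loss) for a binary plan."""
    cut = sum(e for (e, _), b in zip(enterprises, bits) if b)
    loss = sum(p for (_, p), b in zip(enterprises, bits) if b)
    return cut, loss

def fitness(bits, w=0.5):
    cut, loss = objectives(bits)
    return w * cut - (1 - w) * loss  # scalarized trade-off

def evolve(pop_size=40, gens=100, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in enterprises] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]           # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut_pt = random.randrange(1, len(a))  # one-point crossover
            child = [bit ^ (random.random() < p_mut)   # bit-flip mutation
                     for bit in a[:cut_pt] + b[cut_pt:]]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
cut, loss = objectives(best)
print(f"emissions cut {cut:.1f} at production loss {loss:.1f}")
```

Sweeping the weight `w` over (0, 1) would trace an approximation of the Pareto front between the two objectives.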
This work deals with investment decisions in the downstream sector. As is well known, no two refineries are exactly alike, even if they are owned by the same company (Cheremisinoff 2001). Each was designed with a combination of several technologies to meet its requirements (market opportunities, availabilities, financial capability, environmental realities) (Energy 2009). One of the most commonly used methods for comparing refineries is comparison by a single technical and economic indicator, the so-called "Nelson Index" (Johnston 1996) (hereinafter the NCI), which shows the complexity of the installed equipment in relation to the primary distillation process. The NCI indicates not only the intensity of the investment in the plant but also its potential for added value. Thus, the higher the NCI, the higher the cost of the oil refinery, and the higher the quality and level of its products.
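The NCI is a capacity-weighted sum of unit complexity factors relative to crude distillation capacity. A minimal sketch, using approximate textbook factors (actual factors vary by source and unit configuration) and a hypothetical refinery:

```python
# Indicative Nelson complexity factors (approximate; vary by source).
NELSON_FACTORS = {
    "crude_distillation": 1.0,
    "vacuum_distillation": 2.0,
    "catalytic_reforming": 5.0,
    "catalytic_cracking": 6.0,
    "hydrocracking": 6.0,
    "alkylation": 10.0,
}

def nelson_complexity_index(capacities):
    """NCI = sum(factor_i * capacity_i) / crude distillation capacity,
    where capacities are unit throughputs in barrels per day."""
    cdu = capacities["crude_distillation"]
    return sum(NELSON_FACTORS[u] * c for u, c in capacities.items()) / cdu

# Hypothetical refinery (capacities in bpd)
refinery = {
    "crude_distillation": 200_000,
    "vacuum_distillation": 90_000,
    "catalytic_cracking": 60_000,
    "catalytic_reforming": 40_000,
}
print(f"NCI = {nelson_complexity_index(refinery):.2f}")
```

A plant consisting only of a crude distillation unit has an NCI of 1.0; each conversion unit added raises the index in proportion to its relative capacity and complexity factor.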
Astronomical observations generate vast amounts of data. The BSA (Big Scanning Antenna) of LPI, used in the study of impulse phenomena, logs 87.5 GB of data daily (32 TB per year). Experts classified 83,096 individual observations (covering the study period July 2012 to October 2013). Over 75% of the sample corresponds to pulsars, scintillating sources and fast radio transients, while the remaining classes of observations correspond to hardware failures, interference, and passages of Earth satellites and aircraft. In total, 15 classes of observations were identified.
Such a sample, divided into classes, makes it possible to apply machine learning algorithms. It has become possible to develop an automated service for short-term and long-term monitoring of various classes of radio sources (including radio transients of different nature), monitoring of the Earth's ionosphere and of the interplanetary and interstellar plasma, and the search for and monitoring of different classes of radio sources. Monitoring here refers to the automatic filtering and detection of previously unclassified impulse phenomena.
Currently, statistical analysis methods are used for automatic filtering. This report examines an alternative method based on a neural network machine learning algorithm that takes raw data as input and, after processing by the hidden layer, determines the class of the impulse phenomenon through the output layer.
The neural network model, trained on the sample and classifying previously unclassified impulse phenomena, is created using the Microsoft Azure Machine Learning Studio cloud service. A web service built on the model allows classifying single impulse phenomena in real time (Request/Reply) as well as data samples for a certain period (batch processing).
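The input-hidden-output classification pass described above can be sketched as follows. The weights here are random stand-ins (in the described service they are learned in Azure ML Studio), and the feature count is a placeholder; only the 15-class output matches the abstract:

```python
from math import exp
import random

random.seed(0)

def relu(x):
    return x if x > 0 else 0.0

def softmax(zs):
    m = max(zs)
    exps = [exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def forward(x, w_hidden, w_out):
    """Raw feature vector -> hidden layer (ReLU) -> output layer
    (softmax) -> predicted class index."""
    hidden = [relu(sum(wi * xi for wi, xi in zip(row, x))) for row in w_hidden]
    scores = softmax([sum(wi * hi for wi, hi in zip(row, hidden)) for row in w_out])
    return scores.index(max(scores))

# Hypothetical network shape: 8 input features, 4 hidden units,
# 15 output classes of impulse phenomena.
n_features, n_hidden, n_classes = 8, 4, 15
w_hidden = [[random.gauss(0, 1) for _ in range(n_features)] for _ in range(n_hidden)]
w_out = [[random.gauss(0, 1) for _ in range(n_hidden)] for _ in range(n_classes)]

pulse = [random.random() for _ in range(n_features)]  # stand-in for raw data
print("predicted class:", forward(pulse, w_hidden, w_out))
```

In the Request/Reply mode the service runs this pass for one observation at a time; batch processing applies the same pass to every observation in a period.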
The paper presents an approach to the multi-dimensional analysis of the marketing tactics of companies employing network tools. The research represents a company's omni-channel distribution tactic as a node in an eight-dimensional space. The dimensions of a node's location are defined by the frequency of usage of eight communication channels (friends, acquaintances, telephone, home presentations, printed advertisements, internet, e-mail, door-to-door). The comparison is grounded on measuring the pairwise distance between nodes in this eight-dimensional space. The pairwise distance, measured by the Euclidean norm, is used as the weight of the edge between companies: the smaller the Euclidean distance, the higher the similarity. We then employ this network representation of multidimensional statistics to analyze performance and company characteristics such as product category, market share, and the education level and average age of distributors. The empirical application is validated on a sample of 5,694 distributors from sixteen fast-moving consumer goods (FMCG) companies in the direct selling industry.
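The node-and-edge construction can be sketched directly; the channel-usage frequencies below are hypothetical, not taken from the survey data:

```python
from math import sqrt
from itertools import combinations

# Hypothetical usage frequencies for three companies across the eight
# communication channels (friends, acquaintances, telephone, home
# presentations, printed ads, internet, e-mail, door-to-door).
companies = {
    "A": [0.9, 0.8, 0.5, 0.2, 0.1, 0.7, 0.6, 0.0],
    "B": [0.8, 0.7, 0.6, 0.1, 0.2, 0.8, 0.5, 0.1],
    "C": [0.1, 0.2, 0.3, 0.9, 0.8, 0.1, 0.2, 0.7],
}

def euclidean(u, v):
    """Euclidean norm of the difference of two channel-usage vectors."""
    return sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

# Weighted edges of the company network: a smaller distance means
# more similar omni-channel tactics.
edges = {(a, b): euclidean(companies[a], companies[b])
         for a, b in combinations(companies, 2)}
for (a, b), d in sorted(edges.items(), key=lambda kv: kv[1]):
    print(f"{a}-{b}: {d:.3f}")
```

Here companies A and B, with similar channel profiles, are joined by a much shorter (hence stronger-similarity) edge than either is to C.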
In universities and technical colleges offering IT qualifications, multiple streams, courses and specializations may use software products for training purposes within a single semester. University IT services face the challenge of creating an infrastructure of educational applications that can support the educational process. We note that the number of specializations that study information technology grows every year (for example, HSE offers minor disciplines in which students from any field can enroll). In recent years, online courses have also become popular. If the load is not planned ahead with future trends taken into account, the capacity of even the most high-tech infrastructure will be insufficient. The corresponding infrastructure load must be calculated when planning the disciplines, so that appropriate facilities can be reserved and an effective learning process organized.
Software developers use a variety of benchmarking tools, but these are complex and do not provide the information needed by participants in educational process planning.
This article discusses the construction of a simulation model that supports educational process planning. The simulation is carried out using the capabilities of the AnyLogic 7 tool. The aim of this work is to develop a simulation model for estimating the load on the information system used in the educational process. Besides describing the model, the article presents the results of calculations for various deployment options of the information system (a private cloud or a server at the university). The simulation results were confirmed by data obtained during practical classes at the university. The model makes it possible to plan the educational process so as to achieve a uniform load on the services and, if necessary, to decide where to locate the educational information system: on university servers or in a private cloud.
High-performance querying and ad-hoc querying are commonly viewed as mutually exclusive goals in massively parallel processing databases. Furthermore, there is a contradiction between the ease of extending the data model and the ease of analysis. The modern 'Data Lake' approach promises extreme ease of adding new data to a data model; however, it is prone to eventually becoming a Data Swamp: an unstructured, ungoverned, out-of-control Data Lake where, due to a lack of processes, standards and governance, data is hard to find, hard to use, and is consumed out of context. This paper introduces a novel technique, highly normalized Big Data using Anchor modeling, that provides a very efficient way to store information and utilize resources, thereby providing ad-hoc querying with high performance for the first time in massively parallel processing databases. The technique is almost as convenient for expanding the data model as a Data Lake, while being internally protected from turning into a Data Swamp. A case study of how this approach is used for the Data Warehouse at Avito over a three-year period is also presented, with estimates for and results of real-data experiments carried out in HP Vertica, an MPP RDBMS. This paper extends the theses presented at the 34th International Conference on Conceptual Modeling (ER 2015) (Golov and Rönnbäck 2015), complemented with numerical results on key operating areas of a highly normalized big data warehouse, collected over several (1-3) years of commercial operation. The limitations imposed by using a single MPP database cluster are also described, and a cluster fragmentation approach is proposed.
In 2016, a survey was conducted among Russian companies to discover the most common problems associated with the flexibility of business process management. A gap was identified between strict process formalization demands and the unpredictable nature of many knowledge-intensive operations. The article suggests an approach to facilitate process management via a combined context-aware set of methods. First, key terms are selected to serve as special-cause indicators of variation in a process instance, based on risk profiles. Afterward, a cloud service is called, which automatically analyzes the semantic annotation of the concrete process instance. The risk detection service identifies potential operational risks and notifies users of unexpected process execution complexities. Finally, the expert search service automatically calls for experts in the organization to create an expert community. This novel approach could be used in knowledge-intensive business sectors (such as Research and Development) or in any organization interested in increasing its agility in a changing business environment.
Purpose The paper identifies the factors that shape the intensity and perceived effectiveness of communications between heads of manufacturing units of multinational corporations (MNCs).
Design/methodology/approach The paper is based on a survey of heads of MNCs’ manufacturing subsidiaries in Russia.
Findings We found that the intensity of most inter-unit communication channels depends on the speed and magnitude of the changes in products and production technologies experienced by manufacturing subsidiaries. The assessment of the efficiency of a communication channel with high media richness strongly correlates with the intensity of its use.
Practical implications Subsidiary managers quickly master the easiest-to-use channels (e.g., e-mail exchange, talking on the phone, reading corporate magazines) by themselves, but minimize their participation in time-consuming activities (e.g., corporate-wide and special conferences, arranging informal meetings with foreign peers) unless they have to manage rapid changes in products and production technologies. Thus, to intensify the voluntary use of inter-unit channels with high media richness, headquarters should instill in subsidiary managers the value of cooperation between manufacturing units. Moreover, the effectiveness of inter-unit channels with high media richness should be properly demonstrated to subsidiary managers to assuage their initial reluctance.
Originality/value This paper presents communications between manufacturing units of multinational corporations not as the transfer of abstract knowledge but as routine processes of exchanging detailed information on valuable improvements to existing practices and on solutions to technical and organizational problems commonly encountered in developing facilities and mastering new products.
This paper provides a critical analysis of the current strategic actions of Russian manufacturing subsidiaries of Western multinational corporations (MNCs). We retraced the content of strategic actions in various aspects of subsidiary management implemented during 2015–2016 and the activities of strategists of different ranks. We found that some actions implemented during 2014–2016 by MNCs in Russia represent standard strategic practices during downturns. In contrast, other strategic practices (expanding facilities against negative market dynamics and a reluctance to change the system of permanent job contracts and abundant employee social benefits) generally contradict the textbook solutions for company strategies during downturns.
There are two directions of research in cloud computing: the first examines the economic benefits of the cloud, while the second studies the risks that arise when information resources are transferred to the cloud. Very few studies, however, consider the economic benefits and risks together. The purpose of this paper is to overcome this gap and offer a model for a joint assessment of the benefits and risks arising from the use of cloud computing. Three simple criteria that help to evaluate different sourcing alternatives are proposed (costs, intangible benefits, and risks), and simple rules for obtaining quantitative values of these criteria are described.
The article considers the problem of designing a rational distribution network for the retail industry. The rationality of a distribution network is understood as the alignment of the distribution network design with the goals and requirements of the corporate strategy, broken down by formats, sales regions and product categories.
The article describes the author's designs aimed at improving the practice of using Lean Production (LP) in Russian companies. The proposed conceptual model of the current state of LP serves as the basis for the model of LP implementation, application and development; the conditions that lead to the use of LP are identified; and the results of testing the new methodology are briefly presented.