THE CONTINUOUS-TIME BERNOULLI PROCESS
Abstract
A random Bernoulli process with continuous time and a finite number of states (random events) is proposed. The process is obtained by two mutually complementary methods: directly from a Poisson process with a time-dependent intensity parameter, and by the methods of queuing theory, from a queuing system with two parameters. In the first case, the process was formalized on a probability space with a measure, as a measurable function of time. The intensity of the Poisson process was considered as a measure. The Bernoulli process for each fixed time was obtained as a conditional distribution from a suitable Poisson distribution. The parameter of the Poisson distribution was determined from a differential equation, in the formulation of which the approximation of the Bernoulli formula by the Poisson formula was essentially used. In the second method, standard methods of queuing theory were used. A two-parameter queuing model was formulated in which, for all customer flows, the time between the occurrences of neighboring customers was a random variable following the exponential law. The model was formalized by a system of differential equations whose analytical solution represents the continuous-time Bernoulli process. In finding the solutions, the method of generating functions was used. It is of interest to derive the Bernoulli process both from the probability space constructed for the Poisson process and from the queuing theory model. The authors believe that the proposed process can be generalized to a wider class of functions than that used in the work, up to measurable ones. The possibilities for the practical application of the continuous-time Bernoulli process will undoubtedly expand, since its discrete analog is well known in many fields of science and technology.

Keywords:
Poisson, Bernoulli processes, systems of stochastic differential equations, queueing theory, generating function, probability distribution, Markov process

INTRODUCTION

Mathematical modeling based on the methods of the theory of random processes is very actively used in the study and analysis of the functioning of real objects of varying complexity. Two groups of models can be distinguished: some characterize objects from general positions, in a first approximation, and reflect the common, most important trends in the development and analysis of real processes; others reveal their hidden essence and help to explore the internal properties of objects. Models of the first group, usually prognostic, involve few parameters and have good statistics, and analytical solutions, representable by formulas that are convenient in engineering calculations or easily computable, are preferable for them [1-8]. Models of the second group are multiparameter and heavy, and their statistics are not always reliable; algorithmic methods of investigation, usually approximate, and simulation modeling [9-14] are typically effective for them. Stochastic modeling is applied to objects that are usually associated with randomness, but it is also used, by means of randomization, in the study of even deterministic real objects. This often leads to more effective results than traditional mathematical models of the exact sciences can provide.

Many random phenomena and objects representing dynamical systems of a special type are effectively described by the methods of queuing theory (QT) [2-8]. In QT, a queuing system (model) (QS) is first formulated, which, in the continuous case, is formalized by a system of differential equations. The unknown time functions in such systems of equations are the probability distributions usually associated with a Markov random process [2-7, 13-16]. Analytical solutions, although they exist, are found mostly by approximate methods (in real time, given high-speed computing tools). This is often due to the lack of a special need to look for exact solutions, since particular, applied problems that are not directly related to mathematics are being studied. In recent years, significant results have been achieved in the development of methods for calculating such systems, but they amount to the improvement of numerical algorithms [10].

The model given in this paper, formalized by a system of differential equations, has an analytic solution, which is a random Bernoulli process. In finding the solution, the method of generating functions was used [9, 16]. The model is formulated in the terminology of QT. The solution obtained has direct practical applications, for example, to the analysis of the efficiency of the functioning of computer systems [2-5, 11, 12], of processes arising in chemistry and the chemical industry, and in biotechnology [5-8, 17], etc.
MATERIALS AND METHODS

Necessary definitions and notations

A probability space is a triple (Ω, ℱ, P), where Ω = {ω} is the space of elementary random events ω, ℱ is the σ-algebra of random events, and P is the probability (probability measure, probability distribution) [19-22]. To the fundamental properties of a probability space we add a random variable ξ = ξ(ω) and its mean value, the mathematical expectation Mξ [23].

A random process is a measurable function ξ(t) = ξ(t, ω) of two variables, one of which is time. As a rule, time fixes the momentary state of the process [23]. For a fixed time t, a random process is a random variable [21-23].

The random Bernoulli process in the literature is usually associated with random walks [16, 19, 25, 26]. There are n independent trials, in each of which an event A appears with a constant probability p and the complementary event Ā appears with probability q = 1 − p; the probability that the event A appears exactly k times is then calculated by the Bernoulli formula

P_n(k) = C_n^k p^k q^{n−k},  k = 0, 1, 2, ..., n.  (1)

Let ξ(t) be a random variable that, for any fixed time t, characterizes the number of occurrences of the event A in n independent trials; then the probability P = {P_k(t)} is a probability distribution and is defined by the formula

P{ξ = k} = C_n^k p(t)^k q(t)^{n−k},  Σ_{k=0}^{n} C_n^k p(t)^k q(t)^{n−k} = 1,  t ∈ [0, ∞).  (2)

For each value of the fixed time parameter, let us denote by (Ω_t, ℱ_t, P_t), t ∈ [0, ∞), the probability space of the random Poisson process ξ(t), where Ω_t ⊇ {C_k}, ℱ_t is the σ-algebra of events, and P_t = P{ξ(t) = k}. Rejection of the stationarity condition [15, 16, 19, 22] leads to a model of a Poisson process with a variable intensity measure λ(t):

P{ξ(t) = m} = (λ(t)^m / m!) e^{−λ(t)},  t ∈ [0, ∞),  m = 0, 1, 2, ... .

Let us note that if the intensity of the event stream is given by a nonrandom integrable function λ = λ(t), and the stream satisfies the conditions of absence of aftereffect and of ordinariness, it loses the properties of the stationary Poisson process. If the intensity itself is a random function λ(t), the process remains Poisson, provided that, as t → ∞, there exists a limit λ(t) → λ > 0.
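To make the distribution with a variable intensity measure concrete, the following sketch (the intensity function and the time point are illustrative assumptions, not taken from the paper) evaluates P{ξ(t) = m} = λ(t)^m e^{−λ(t)}/m! at a fixed time and checks numerically that the probabilities sum to one and that the mean equals λ(t).

```python
# Poisson distribution with a time-dependent intensity measure lambda(t).
# The intensity function below is an illustrative assumption, not taken from the paper.
import math

def intensity(t: float) -> float:
    """Assumed intensity measure lambda(t); any nonnegative, nondecreasing choice will do."""
    return 1.5 * t + 0.5 * t ** 2

def poisson_pmf(m: int, lam: float) -> float:
    """P{xi(t) = m} = lam^m * exp(-lam) / m!"""
    return lam ** m * math.exp(-lam) / math.factorial(m)

t = 2.0
lam_t = intensity(t)
pmf = [poisson_pmf(m, lam_t) for m in range(200)]

print("sum of probabilities ~ 1 :", sum(pmf))
print("mean vs lambda(t)        :", sum(m * p for m, p in enumerate(pmf)), lam_t)
```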
Markov process

In the terminology of QT, queuing systems (QS) are considered that are described by a random process of Markov type with a countable number n of states (starting with zero), by the postulates of the Poisson process [15, 16, 21, 25, 26], and by a set of unknown time functions that form the probability distributions. Suppose that there is a queuing system (QS) consisting of a finite set of states {C_k}, k = 0, 1, 2, ..., n. The time of residence of the system in each of the states is random and is described by the exponential distribution law. The transition of the system from state to state over time t ≥ 0 is carried out abruptly. The assumption of the exponential law for the distribution of the time of residence of the system in each of the states is equivalent to the fulfillment of the conditions of absence of aftereffect and of ordinariness and, if the parameter of the exponential law is constant, of stationarity [2, 4, 21, 24]. The fulfillment of these conditions makes it possible to construct a system of linear differential equations for the transitions of the QS from state to state, determined by a marked state graph whose structure depends on the object under study [2-9, 14]. Let us denote by P_k(t, i) the probability that at the time t ∈ [0, ∞) the QS is in the state k, provided that at the initial time t = 0 it was in the state C_i. The random process under consideration is formalized by a system of differential equations that describe the connection between the probabilities that, at time t, there are k customers in the QS.

Method of generating functions

The method of generating functions makes it possible to convolute a system consisting of an arbitrary number of differential equations for the probability distribution of the residence of the system in each of the states into a single partial differential equation for the generating function, and then to solve it. Suppose we have a probability distribution {P_m(t, i)}, Σ_{m=0}^{∞} P_m(t, i) = 1, or some other sequence of functions. Let us mark each member P_m(t, i) of this sequence by multiplying it by the complex variable z raised to the power equal to the serial number of the member of the sequence. From the products obtained we compose the functional series

F(z, t) = Σ_{m=0}^{∞} z^m P_m(t, i).  (3)

Since the |P_m(t, i)| ≤ 1 are bounded, the series (3) converges and, consequently, F(z, t) is an analytic function in the circle |z| ≤ 1. The function (3), F(z, t), is called the generating function for the distribution of the originals {P_m(t, i)}. The original for any m is uniquely reconstructed by the formula

P_m(t, i) = (1/m!) ∂^m F(0, t)/∂z^m.  (4)

Next, we need the linearity property and the Laplace image of the derivative of the generating function [4, 5, 22, 24-26]. The linearity property of the generating function makes it possible to reduce the system of differential equations for the unknown distribution {P_m(t, i)} to a single partial differential equation for the function F(z, t) under the initial conditions F(z, 0) = z^i and F(0, t) = P_0(t, i).
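As an illustration of formula (4), the probabilities of the Bernoulli formula (1) can be recovered from the binomial generating function F(z) = (q + pz)^n by symbolic differentiation. The sketch below uses the sympy library; the values of n and p are illustrative.

```python
# Illustration of formula (4): recovering a distribution from its generating function.
# The binomial generating function F(z) = (q + p*z)^n is used; n and p are illustrative.
import sympy as sp

z = sp.symbols('z')
n = 5
p = sp.Rational(1, 3)
q = 1 - p

F = (q + p * z) ** n  # generating function of the distribution (1)

for m in range(n + 1):
    # P_m = (1/m!) * d^m F / dz^m evaluated at z = 0, as in formula (4)
    P_m = sp.diff(F, z, m).subs(z, 0) / sp.factorial(m)
    expected = sp.binomial(n, m) * p ** m * q ** (n - m)  # Bernoulli formula (1)
    assert sp.simplify(P_m - expected) == 0
    print(m, P_m)
```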
RESULTS AND DISCUSSION

Formulation of the model

Let (Ω, ℱ, P) be a probability space on which a Poisson process ξ = ξ(t) is given for each fixed time t ∈ [0, ∞), where Ω ⊆ R^+, I = [0, a), 0 < a < ∞, a ∈ R; ℱ = B are the Borel subsets of I; P is the Poisson distribution. A Poisson process with a state space defined on a probability space is a measurable function defined on the set of all countable subsets of the Borel set. Suppose that for each fixed t a measurable space of states I = I(t) with a finite measure and with the Borel subsets of its states is defined; then

P{ξ(t) = m} = V_m(t) = (μ(t)^m / m!) exp(−μ(t)),  (5)

where μ(t) is the intensity parameter of the Poisson process, Mξ(t) is the mathematical expectation, and μ(t) = Mξ(t). According to the definition of density, the intensity μ(t) can be regarded as a measure of the random process ξ(t) [25, 26]; the measure of the space Ω is finite, μ(t) = μ(I(t)) < ∞.

It is required to find P_n(t, k), the probability that, for each fixed point of time t ∈ [0, ∞), among the first n points of the space Ω exactly k belong to the event A(t), while the remaining (n − k) points belong to the complementary event Ā(t). Taking the additivity of the measure into account, we have μ(A(t)) + μ(Ā(t)) = μ(I(t)). Using the conditional probability formula, we obtain

P{A(t) ∩ Ā(t) | I(t)} = V_k(t) V_{n−k}(t) / V_n(t) = [μ(A(t))^k exp(−μ(A(t))) μ(Ā(t))^{n−k} exp(−μ(Ā(t))) n!] / [k! (n − k)! μ(I(t))^n exp(−μ(I(t)))],

or

P{A(t) ∩ Ā(t) | I(t)} = C_n^k [μ(A(t)) / (μ(A(t)) + μ(Ā(t)))]^k [μ(Ā(t)) / (μ(A(t)) + μ(Ā(t)))]^{n−k}.  (6)

The probability distribution has the form

P_n(t, k) = C_n^k [μ(A(t)) / (μ(A(t)) + μ(Ā(t)))]^k [μ(Ā(t)) / (μ(A(t)) + μ(Ā(t)))]^{n−k},  k = 0, 1, ..., n.  (7)

It is natural to call formula (7) the Bernoulli process generated by the Poisson process [15, 25]. Let us introduce the notations

p(t) = μ(A(t)) / (μ(A(t)) + μ(Ā(t))),  q(t) = μ(Ā(t)) / (μ(A(t)) + μ(Ā(t))),  p(t) + q(t) = 1,  t ∈ [0, ∞).  (8)

If the time interval between the occurrences of neighboring events satisfies the exponential law, then μ(A(t)) = n p(t), μ(Ā(t)) = n q(t), and μ(I(t)) = n. Taking the initial conditions P_n(0, n) = 1, P_n(0, k) = 0, k ≠ n, into account in (8), we find p(0) = 1, q(0) = 0, and

p(t) = λ/(λ + μ) + (μ/(λ + μ)) e^{−(λ+μ)t},  q(t) = (μ/(λ + μ)) (1 − e^{−(λ+μ)t}).  (9)

Formulas (9) are solutions of the differential equation [2]

dp(t)/dt = −(λ + μ) p(t) + λ.  (10)

The Bernoulli process can be written in the form

P_n(t, k) = C_n^k p(t)^k q(t)^{n−k},  t ∈ [0, ∞),  (11)

where p(t), q(t) satisfy formulas (9) and the initial conditions are P_n(0, n) = 1, P_n(0, k) = 0, k ≠ n. Under arbitrary initial conditions, formulas (11) become significantly more complicated.

Comment

If we go to the limit t → ∞ in formulas (9), putting P_n(k) = lim_{t→∞} P_n(t, k), p = lim_{t→∞} p(t) = λ/(λ + μ), q = lim_{t→∞} q(t) = μ/(λ + μ), then we obtain the usual Bernoulli formula, expressed in terms of the intensity parameters λ, μ:

P_n(k) = C_n^k (λ/(λ + μ))^k (μ/(λ + μ))^{n−k}.  (12)

Here the sum λ + μ of the parameters is considered as a measure of the state space of the Poisson process. Let us note that formula (12) is, in fact, the Bernoulli process in the stationary regime, if we use the terminology of queuing theory (QT) [21]. The Bernoulli process can be obtained in another way, using the methods of queuing theory, within which the appearance of the differential equation (10), from which formulas (9) are derived, becomes clear. For convenience, let us write P_k(t, i) ≡ P_k(t); the initial state i, denoting the QS state at the initial point of time t = 0, will be discussed separately when necessary.

Formulation of the model

There is a QS containing n customers. Each of them can be in one of two incompatible states. The transitions of a customer from one state to the other and back satisfy the Poisson postulates with intensities λ and μ, respectively. It is required to find the probability P_k(t) that, at the time t ∈ [0, ∞), k of the customers are in one state and n − k customers are in the other, provided that at the initial moment of time the QS was in the state C_i.

The model is formalized by the system of differential equations [15]

dP_0(t)/dt = −nλ P_0(t) + μ P_1(t),
dP_k(t)/dt = −((n − k)λ + kμ) P_k(t) + (n − k + 1)λ P_{k−1}(t) + (k + 1)μ P_{k+1}(t),  1 ≤ k ≤ n − 1,
dP_n(t)/dt = −nμ P_n(t) + λ P_{n−1}(t),  (13)

with the initial conditions

P_i(0) = 1,  P_k(0) = 0,  k ≠ i,  i = 0, 1, ..., n.  (14)

The system (13) with the initial conditions (14) is called the system of equations of the Bernoulli process. For the probability distribution {P_k(t)} the normalization condition [15] holds, which is a consequence of the formulation of the model:

Σ_{k=0}^{n} P_k(t) = 1.  (15)

Formula (15) is also valid for a countable number of QS states, if n → ∞.
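A quick numerical illustration of formulas (9) and (11): the sketch below simulates n independent two-state customers and compares the empirical distribution of the number of customers in the counted state at a fixed time with the binomial formula (11). The rate values, the time point, and the convention that the counted state is entered with intensity λ and left with intensity μ (so that p(0) = 1 when all customers start in it) are assumptions made for the example.

```python
# Monte Carlo check of the continuous-time Bernoulli process, formulas (9) and (11).
# Assumed convention: each customer leaves the counted state with rate mu and
# enters it with rate lam, so that p(0) = 1 when all n customers start in it.
import math
import numpy as np

rng = np.random.default_rng(0)
lam, mu = 1.2, 0.7              # illustrative intensities
n, t_star, n_rep = 10, 0.8, 50_000

def state_at(t_end: float) -> int:
    """Simulate one customer that starts in the counted state (state 1)."""
    t, state = 0.0, 1
    while True:
        rate = mu if state == 1 else lam   # exit rate of the current state
        t += rng.exponential(1.0 / rate)
        if t > t_end:
            return state
        state = 1 - state

# empirical distribution of the number of customers in the counted state at t_star
counts = np.zeros(n + 1)
for _ in range(n_rep):
    k = sum(state_at(t_star) for _ in range(n))
    counts[k] += 1
counts /= n_rep

# formula (9) and the binomial formula (11)
p_t = lam / (lam + mu) + mu / (lam + mu) * math.exp(-(lam + mu) * t_star)
q_t = 1.0 - p_t
for k in range(n + 1):
    theory = math.comb(n, k) * p_t ** k * q_t ** (n - k)
    print(f"k={k:2d}  simulated={counts[k]:.4f}  formula (11)={theory:.4f}")
```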
Let us solve the system (13) under the initial conditions (14). Differentiating

F(z, t) = Σ_{k=0}^{n} z^k P_k(t)

with respect to the variables t and z, we obtain, respectively,

∂F(z, t)/∂t = Σ_{k=0}^{n} z^k dP_k(t)/dt,  ∂F(z, t)/∂z = Σ_{k=1}^{n} k z^{k−1} P_k(t).

Summing the first equation of the system (13) with the intermediate equations multiplied by z^k and with the last equation multiplied by z^n, and collecting the terms containing F(z, t) and ∂F(z, t)/∂z, we find that, under the assumptions made, the system of equations (13) reduces to the following partial differential equation [15-21]:

∂F(z, t)/∂t = −(z − 1)(λz + μ) ∂F(z, t)/∂z + nλ(z − 1) F(z, t).  (16)

The general solution of equation (16) can be represented as an arbitrary differentiable function v = g(u), where u(F, z, t) = c_1 and v(F, z, t) = c_2 are the equations of the characteristics. Let us solve equation (16). For the characteristics we have the system

dt/1 = dz/[(z − 1)(λz + μ)] = dF/[nλ(z − 1)F],

where F = F(z, t), from which we choose the equations

dt/1 = dz/[(z − 1)(λz + μ)],  dz/[(z − 1)(λz + μ)] = dF/[nλ(z − 1)F].

The solutions are found by the standard method, separating the variables:

[(λz + μ)/(1 − z)] e^{(λ+μ)t} = c_1,  F (λz + μ)^{−n} = c_2.

Let us write the solution of equation (16) in the form

F = (λz + μ)^n g( [(λz + μ)/(1 − z)] e^{(λ+μ)t} ),  (17)

where g is an arbitrary continuously differentiable function. Noting that for t = 0 we have F(z, 0) = z^i, which follows from (14), we obtain

z^i = (λz + μ)^n g( (λz + μ)/(1 − z) ).

Let us find the function g. Let y = (λz + μ)/(1 − z); then z = (y − μ)/(y + λ). In the new notation we obtain

g(y) = y^{−n} (y − μ)^i (λ + μ)^{−n} (y + λ)^{n−i}.
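The first of these integrals can be checked directly: the sketch below (a sympy-based verification aid with symbolic λ, μ; not part of the original derivation) confirms that u(z, t) = (λz + μ)e^{(λ+μ)t}/(1 − z) is constant along the characteristics dz/dt = (z − 1)(λz + μ), i.e. that ∂u/∂t + (z − 1)(λz + μ) ∂u/∂z = 0.

```python
# Symbolic check that u(z, t) = (lam*z + mu)*exp((lam+mu)*t)/(1 - z) is a first
# integral of the characteristic equation dz/dt = (z - 1)*(lam*z + mu) of PDE (16).
import sympy as sp

z, t = sp.symbols('z t')
lam, mu = sp.symbols('lambda mu', positive=True)

u = (lam * z + mu) * sp.exp((lam + mu) * t) / (1 - z)

# Along a characteristic, du/dt = u_t + u_z * dz/dt must vanish identically.
residual = sp.diff(u, t) + (z - 1) * (lam * z + mu) * sp.diff(u, z)
print(sp.simplify(residual))   # expected output: 0
```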
For arbitrary t, the argument of g in (17) is (λz + μ)e^{(λ+μ)t}/(1 − z); therefore, substituting this value for y, we obtain

g( (λz + μ)e^{(λ+μ)t}/(1 − z) ) = [(λz + μ)e^{(λ+μ)t}/(1 − z)]^{−n} [ (λz + μ)e^{(λ+μ)t}/(1 − z) − μ ]^i (λ + μ)^{−n} [ (λz + μ)e^{(λ+μ)t}/(1 − z) + λ ]^{n−i}
= (λz + μ)^{−n} e^{−n(λ+μ)t} [ (λz + μ)e^{(λ+μ)t} − μ(1 − z) ]^i (λ + μ)^{−n} [ (λz + μ)e^{(λ+μ)t} + λ(1 − z) ]^{n−i}.

Taking (17) into account, we have

F(z, t) = (λ + μ)^{−n} [ μ(1 − e^{−(λ+μ)t}) + z(λ + μ e^{−(λ+μ)t}) ]^i [ μ + λ e^{−(λ+μ)t} + zλ(1 − e^{−(λ+μ)t}) ]^{n−i}.  (18)

Let us introduce the notations

r(t) = (λ/(λ + μ)) (1 − e^{−(λ+μ)t}),
s(t) = (1/(λ + μ)) (μ + λ e^{−(λ+μ)t}),  (19)
ρ(t) = (1/(λ + μ)) (λ + μ e^{−(λ+μ)t}),
σ(t) = (μ/(λ + μ)) (1 − e^{−(λ+μ)t}),

where s(t) = 1 − r(t), ρ(t) = 1 − σ(t).  (20)

Then (18), taking (19) and (20) into account, can be written in the form

F(z, t) = [s(t) + z r(t)]^{n−i} [σ(t) + z ρ(t)]^i.  (21)

Let us apply formula (4). Note that the function F(z, t) is a product of two functions. Setting

A(z) = [s(t) + z r(t)]^{n−i},  B(z) = [σ(t) + z ρ(t)]^i,

we can, in order to find P_k(t) from formula (4), apply the Leibniz formula [8] to the product of the functions in (21):

d^k [A(z) B(z)] / dz^k = Σ_{j=0}^{k} C_k^j A^{(k−j)}(z) B^{(j)}(z).

Substituting into it the values of the corresponding derivatives of the functions A(z), B(z), we obtain

P_k(t) = Σ_{j=0}^{k} C_i^j C_{n−i}^{k−j} r^{k−j}(t) s^{n−i−k+j}(t) ρ^j(t) σ^{i−j}(t).  (22)

Taking (19) and (20) into account, we finally write down

P_0(t) = [ (μ/(λ + μ)) (1 − e^{−(λ+μ)t}) ]^i [ μ/(λ + μ) + (λ/(λ + μ)) e^{−(λ+μ)t} ]^{n−i}  for k = 0,  (23)

P_k(t) = Σ_{j=0}^{k} C_i^j C_{n−i}^{k−j} [ λ/(λ + μ) + (μ/(λ + μ)) e^{−(λ+μ)t} ]^j [ (λ/(λ + μ)) (1 − e^{−(λ+μ)t}) ]^{k−j} [ (μ/(λ + μ)) (1 − e^{−(λ+μ)t}) ]^{i−j} [ μ/(λ + μ) + (λ/(λ + μ)) e^{−(λ+μ)t} ]^{n−k−i+j}  for k = 1, ..., n.  (24)

For zero initial conditions, i = 0, for each fixed time t we obtain formula (11), taking (9) into account. In the stationary mode of the QS functioning, taking into account the equality

Σ_{j=0}^{k} C_i^j C_{n−i}^{k−j} = C_n^k,  (25)

we obtain, regardless of the initial conditions, the formula (12) already known to us.

The system of equations (13) can be regarded as a formalization of the Bernoulli process with two parameters, considered as intensity measures, with a discrete number of states and continuous time [22]. The introduction of a measure admits an extension of the class of functions up to measurable ones [25, 26].
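As a sanity check on formula (24), the sketch below (numerical; the values of λ, μ, n, i and the time points are illustrative assumptions) evaluates P_k(t), verifies the normalization (15), and compares the large-time values with the stationary formula (12).

```python
# Numerical check of formula (24): normalization (15) and the stationary limit (12).
# The values of lam, mu, n, i and the time points are illustrative assumptions.
import math

lam, mu, n, i = 1.3, 0.6, 8, 3

def P_k(t: float, k: int) -> float:
    """Distribution (24) of the continuous-time Bernoulli process, initial state i."""
    e = math.exp(-(lam + mu) * t)
    rho = (lam + mu * e) / (lam + mu)     # rho(t) in (19): counted -> counted
    sig = mu * (1.0 - e) / (lam + mu)     # sigma(t) in (19): counted -> other
    r = lam * (1.0 - e) / (lam + mu)      # r(t) in (19): other -> counted
    s = (mu + lam * e) / (lam + mu)       # s(t) in (19): other -> other
    total = 0.0
    for j in range(0, k + 1):
        if j > i or (k - j) > (n - i):
            continue                      # binomial coefficients vanish outside range
        total += (math.comb(i, j) * math.comb(n - i, k - j)
                  * rho ** j * sig ** (i - j) * r ** (k - j) * s ** (n - i - k + j))
    return total

for t in (0.0, 0.5, 2.0, 50.0):
    dist = [P_k(t, k) for k in range(n + 1)]
    print(f"t={t:5.1f}  sum={sum(dist):.6f}  P_{i}(t)={dist[i]:.4f}")

# large-time values versus the stationary formula (12)
p_inf, q_inf = lam / (lam + mu), mu / (lam + mu)
stationary = [math.comb(n, k) * p_inf ** k * q_inf ** (n - k) for k in range(n + 1)]
diff_max = max(abs(P_k(50.0, k) - stationary[k]) for k in range(n + 1))
print("max |P_k(50) - formula (12)| =", diff_max)
```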
Corollary 1

Let us obtain from the model a useful result that has important practical applications [2-8]. We will assume that the customers come from an inexhaustible source with intensity λ and are served with intensity μ, under the same queue discipline and the Poisson postulates. Then, letting n → ∞, we obtain from the system (13) the system of equations

dP_0(t)/dt = −λ P_0(t) + μ P_1(t),
dP_k(t)/dt = −(λ + kμ) P_k(t) + λ P_{k−1}(t) + (k + 1)μ P_{k+1}(t),  k ≥ 1,  (26)

with zero initial conditions P_0(0) = 1, P_k(0) = 0, k ≠ 0, so that F(z, 0) = 1. Let us introduce the generating function (3),

F(z, t) = Σ_{k=0}^{∞} z^k P_k(t).

Applying it to the system (26), we obtain the partial differential equation¹

∂F(z, t)/∂t = −μ(z − 1) ∂F(z, t)/∂z + λ(z − 1) F(z, t).  (27)

By analogy with the solution of the partial differential equation (16), the solution of this equation under the initial conditions i = 0 has the form

F(z, t) = exp( (λ/μ)(z − 1)(1 − exp(−μt)) ).

For the probability distribution we obtain

P_k(t) = (M(t)^k / k!) exp(−M(t)),  M(t) = (λ/μ)(1 − exp(−μt)).

This is the Poisson distribution with the variable parameter λ(t) = M(t).

Comment

It was shown in [7, 8] that, for relatively arbitrary initial conditions, one can obtain from equation (27) the following system of ordinary differential equations for the mathematical expectation and the variance:

dM_i(t)/dt = λ − μ M_i(t),
dD_i(t)/dt = λ + μ M_i(t) − 2μ D_i(t),  (28)

under the initial conditions

M_i(0) = i,  D_i(0) = 0.  (29)

The solution of (28), taking (29) into account, has the form

M_i(t) = λ/μ + (i − λ/μ) e^{−μt},
D_i(t) = (1 − e^{−μt}) (λ/μ + i e^{−μt}).  (30)

In the stationary mode, all the solutions (the probability distribution, the mathematical expectation, and the variance) are the known characteristics of the Poisson process:

p_k = ((λ/μ)^k / k!) exp(−λ/μ),  M = λ/μ,  D = λ/μ,

while the probability distribution and the moments (30), for each fixed value of time and zero initial conditions, are those of a Poisson process with two parameters, formalized by the system of equations of Corollary 1.

¹ The first equation (27) and its solution were obtained by C. Palm in 1947. Generating functions were introduced by L. Euler in the middle of the 18th century in the study of numerical series.

CONCLUSION

The paper presents a random Bernoulli process with continuous time. A probability space is defined on which a real random variable is given that is a measurable function, associated with measure theory, having a Poisson distribution. A space with a countable set of states of a Poisson process is defined for each fixed point of the time axis. As a measure on the countable collection of subsets of the σ-algebra, a finite intensity measure depending on time is adopted. Convergence is guaranteed by the Poisson process existence theorem [18]. Consideration of a finite number of the first n states of the Poisson process led to the construction of the Bernoulli process with two parameters; here n is the measure of the state space, and the measure of the intensity is the parameter μ(t), where μ(t) = n p(t). The methods of queuing theory led to a QS formalized by a system of differential equations whose solution is the continuous-time Bernoulli process.
References

1. Khinchin A.Ya. Matematicheskie metody teorii massovogo obsluzhivaniya [Mathematical methods of queuing theory]. Moscow: USSR AS Publ., 1955. pp. 3-122.

2. Pavsky V.A., Pavsky K.V., and Khoroshevsky V.G. Vychislenie pokazateley zhivuchesti raspredelennykh vychislitel'nykh sistem i osushchestvimosti resheniya zadach [Calculation of the survivability of distributed computing systems and the feasibility of solving problems]. Iskusstvennyy intellekt [Artificial Intelligence], 2006, no. 4, pp. 28-34.

3. Khoroshevsky V.G. and Pavsky V.A. Calculating the efficiency indices of distributed computer system functioning. Optoelectronics, Instrumentation and Data Processing, 2008, vol. 44, no. 2, pp. 95-104.

4. Pavsky V.A. and Pavsky K.V. Stokhasticheskoe modelirovanie i otsenki razmera strukturnoy izbytochnosti masshtabiruemykh raspredelennykh vychislitel'nykh sistem [Stochastic modeling and estimation of the structural redundancy size of scalable distributed computing systems.] Izvestiya YuFU. Tekhnicheskie nauki [Izvestiya SFedU. Engineering Sciences], 2014, no. 12 (161), pp. 66-73.

5. Pavsky V.A. and Pavsky K.V. Stochastic simulation and analysis of the operation of computing systems with structural redundancy. Optoelectronics, instrumentation and data processing, 2014, vol. 50, no. 4, pp. 363-369.

6. Yustratov V.P., Pavsky V.A., Krasnova T.A., and Ivanova S.A. Mathematical modeling of electrodialysis demineralization using a stochastic model. Theoretical foundations of chemical engineering, 2005, vol. 39, no. 3, pp. 259-262.

7. Ivanova S.A. Studying the foaming of protein solutions by stochastic methods. Food and Raw Materials, 2014, vol. 2, no. 2, pp. 140-146.

8. Ivanova S.A. and Pavsky V.A. Stochastic modeling of protein solution foaming process. Theoretical foundations of chemical engineering, 2014, vol. 48, no. 6, pp. 848-854.

9. Kleinrock L. Queueing systems, volume 1. Theory. New York: Wiley Interscience, 1975. 417 p. [Russ. ed.: Kleinrock L. Teoriya massovogo obsluzhivaniya. Moscow: Mashinostroenie Publ., 1979. 432 p.]

10. Marchuk G.I. Metody vychislitel'noy matematiki [Methods of computational mathematics]. Moscow: Nauka Publ., 1977. 456 p.

11. Ivnitskiy V.A. Teoriya setey massovogo obsluzhivaniya [The theory of queuing networks]. Moscow: Physico-Mathematical Literature Publ., 2004. 772 p.

12. Nazarov A.A. and Lyubina T.V. Nemarkovskaya dinamicheskaya RQ-sistema s vkhodyashchim MMP-potokom zayavok [Non-Markov Dynamic RQ-system with incoming MMP-flow of customers]. Avtomatika i telemekhanika [Automation and telemechanics], 2013, no. 7, pp. 89-101.

13. Prabkhu N. Metody teorii massovogo obsluzhivaniya i upravleniya zapasami [Methods of the queueing and inventory management theory]. Moscow: Editorial URSS, 1984. 499 p.

14. Pavsky V.A. and Pavsky K.V. Vychislenie momentov sluchaynykh velichin pri erlangovskom vremeni obsluzhivaniya [Calculation of the moments of random variables at Erlang service time]. Materialy 13 mezhdunarodnoy nauchno-prakticheskoy konferentsii «Informatsionnye tekhnologii i matematicheskoe modelirovanie» [Proc. of the 13th International Scientific and Practical Conference "Information Technologies and Mathematical Modeling"]. Tomsk, 2014, part 2. pp. 198-202.

15. Saaty T. Elements of queueing theory with application. New York: Dover publications, 1961. 436 p.

16. Feller W. An introduction to probability theory and its applications, volume 1. New York: Wiley interscience, 1968. 528 p. [Russ. ed.: Feller V. Vvedenie v teoriyu veroyatnostey i ee prilozheniya. Tom 1. Moscow: LIBROKOM Publ., 2010. 528 pp.].

17. Zhang D. and Nastac L. Numerical modeling of the dispersion of ceramic nanoparticles during ultrasonic processing of aluminumbased nanocomposites. Journal of Materials Research and Technology, 2014, no. 3, pp. 296-302.

18. Bernshteyn S.N. Rasprostranenie predel'noy teoremy teorii veroyatnostey na summy zavisimykh velichin [Extension of the limit theorem of probability theory to sums of dependent quantities.] Uspekhi matematicheskikh nauk [Russian Mathematical Surveys], 1944, no. 10, pp. 65-114.

19. Borovkov A.A. Asimptoticheskie metody v teorii massovogo obsluzhivaniya [Asymptotic methods in queuing theory]. Moscow: Nauka Publ., GRF-ML, 1980. 381 p.

20. Gnedenko V.E. Kurs teorii veroyatnostey [Course of the theory of probability]. Moscow: LKI Publ., 2007. 448 p.

21. Gnedenko B.V. and Kovalenko I.N. Vvedenie v teoriyu massovogo obsluzhivaniya [Introduction to queuing theory]. Moscow: Editorial URSS Publ., 2005. 400 p.

22. Venttsel' E.S. and Ovcharov L.A. Teoriya sluchaynykh protsessov i ee inzhenernye prilozheniya [The theory of random processes and its engineering applications]. Moscow: High School Publ., 2000. 480 p.

23. Tutubalin V.N. Teoriya veroyatnostey i sluchaynykh protsessov [Theory of Probabilities and Random Processes]. Moscow: MSU Publ., 1992. 400 p.

24. Sveshchnikov A.G. and Tikhonov A.N. Teoriya funktsii kompleksnogo peremennogo [Theory of a function of a complex variable]. Moscow: Fizmatlit Publ., 2010. 336 p.

25. Borovkov A.A. Teoriya veroyatnostey [Probability Theory]. Moscow: Editorial URSS Publ., 2009. 652 p.

26. Shiryaev A.N. Veroyatnost' [Probability]. Moscow: Nauka Publ., GRF-ML, 1989. 640 p.

