Page 1 of results: 620 digital items found in 0.013 seconds

Chaotic encryption method based on life-like cellular automata

Machicao, Jeaneth; Marco, Anderson G.; Bruno, Odemir Martinez
Source/Publisher: PERGAMON-ELSEVIER SCIENCE LTD; OXFORD
Type: Journal article
Language: Portuguese
Search relevance: 27.11%
A chaotic encryption algorithm is proposed based on "Life-like" cellular automata (CA), which act as a pseudo-random number generator (PRNG). The paper's main focus is the application of chaos theory to cryptography, so the CA are examined for this "chaos" property. Accordingly, the manuscript concentrates on tests that measure chaos in CA, such as the Lyapunov exponent, entropy and Hamming distance, as well as statistical analyses with the DIEHARD and ENT suites. Our results achieved higher randomness quality than other ciphers in the literature, reinforcing the supposition of a strong relationship between chaos and randomness quality. The "chaos" property of CA is therefore a good reason to employ them in cryptography, in addition to their simplicity, low implementation cost and respectable encryption power. (C) 2012 Elsevier Ltd. All rights reserved.; FAPESP (The State of Sao Paulo Research Foundation, Brazil) [2011/05461-0, 2011/01523-1]; CNPq (National Council for Scientific and Technological Development, Brazil) [308449/2010-0, 473893/2010-0]
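The abstract does not spell out the authors' construction; as a rough illustration of the general idea only (a Life-like CA evolved from a key-derived seed, with keystream bits harvested from its trajectory), the minimal Python sketch below uses the B3/S23 rule. The grid size, seeding via SHA-256, and parity-based bit extraction are assumptions for readability, not the cipher from the paper.

    # Minimal sketch: a Life-like cellular automaton used as a keystream
    # generator. Illustrative only; grid size, rule, and bit-extraction
    # strategy are assumptions, not the construction from the paper.
    import hashlib

    def seed_grid(key: bytes, n: int = 32):
        """Derive an n x n binary grid from a key by hashing counter blocks."""
        bits = []
        counter = 0
        while len(bits) < n * n:
            digest = hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
            for byte in digest:
                bits.extend((byte >> i) & 1 for i in range(8))
            counter += 1
        return [[bits[r * n + c] for c in range(n)] for r in range(n)]

    def step(grid):
        """One synchronous update with the Life-like rule B3/S23 (toroidal grid)."""
        n = len(grid)
        new = [[0] * n for _ in range(n)]
        for r in range(n):
            for c in range(n):
                neighbours = sum(
                    grid[(r + dr) % n][(c + dc) % n]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0)
                )
                alive = grid[r][c]
                new[r][c] = 1 if (neighbours == 3 or (alive and neighbours == 2)) else 0
        return new

    def keystream(key: bytes, nbytes: int) -> bytes:
        """Collect one bit per CA step (parity of the live-cell count)."""
        grid = seed_grid(key)
        out, bit_buffer = bytearray(), 0
        for i in range(nbytes * 8):
            grid = step(grid)
            bit = sum(map(sum, grid)) & 1
            bit_buffer = (bit_buffer << 1) | bit
            if i % 8 == 7:
                out.append(bit_buffer)
                bit_buffer = 0
        return bytes(out)

    if __name__ == "__main__":
        ks = keystream(b"example key", 16)
        print(ks.hex())  # XOR such a keystream with plaintext to encrypt

In a real cipher the extraction step would be assessed exactly as the abstract describes: by chaos measures (Lyapunov exponent, entropy, Hamming distance) and statistical batteries such as DIEHARD and ENT.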

O ensino de estatística e a busca do equilíbrio entre os aspectos determinísticos e aleatórios da realidade; The teaching of statistics and the search for the equilibrium between deterministic and random aspects of reality

Ara, Amilton Braio
Source/Publisher: Biblioteca Digitais de Teses e Dissertações da USP
Type: Doctoral thesis; Format: application/pdf
Published: 26/10/2006; Language: Portuguese
Search relevance: 27.11%
In our experience teaching statistics to engineering students, we have observed the students' difficulty in understanding the concepts involved in statistical methods, with a consequent lack of motivation to learn and, in general, a high failure rate. We therefore began to reflect on the causes of this difficulty and on ways of eliminating them. We found that the problematic character of statistics teaching stems from a mistaken view of reality, a consequence of the students' limited familiarity with random phenomena which, although present in their everyday lives, are generally not studied in primary and secondary education because of the excessively deterministic character of school curricula. The objectives of the present work are: (1) to make explicit a conception of reality in which the deterministic/random equilibrium is restored; (2) to rethink the teaching of probability and statistics at the various levels in view of this equilibrium; and, building on it, (3) to propose a new organization of the statistics course in undergraduate engineering programs. We turned to philosophical thought and to the evolution of ideas in physical science to trace the prevailing conception of the deterministic and random aspects of natural phenomena...

Teorias da Aleatoriedade

Campani, Carlos Antonio Pereira; Menezes, Paulo Fernando Blauth
Type: Journal article; Format: application/pdf
Language: Portuguese
Search relevance: 27.11%
This work is a survey of definitions of "random sequence". We emphasize the definition of Martin-Löf and the definition based on incompressibility (Kolmogorov complexity). Kolmogorov complexity is a profound and sophisticated theory of information and randomness based on Turing machines. These two definitions solve the problems of the other approaches and satisfy our intuitive concept of randomness, while being mathematically sound. In addition, we present Schnorr's approach, which includes a requirement of effectiveness (computability) in its definition. The relations between all of these definitions are discussed critically.
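For reference, the incompressibility notion the survey builds on can be stated in the usual way; the constant c and the choice of a (prefix-free) universal machine U below follow standard conventions rather than notation taken from the article:

    \[ K_U(x) = \min\{\, |p| : U(p) = x \,\}, \qquad x \ \text{is } c\text{-incompressible} \iff K_U(x) \ge |x| - c. \]
    \[ \omega \ \text{is Martin-L\"of random} \iff \exists c\ \forall n:\ K(\omega_{1:n}) \ge n - c \quad \text{(Levin--Schnorr characterization, with prefix complexity $K$).} \]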

Randomness reuse : extensions and improvements; Coding and Cryptography 2007

Barbosa, Manuel Bernardo; Farshim, P.
Source/Publisher: Springer-Verlag
Type: Journal article
Published: 2007; Language: Portuguese
Search relevance: 37.27%
We extend the generic framework of reproducibility for reuse of randomness in multi-recipient encryption schemes as proposed by Bellare et al. (PKC 2003). A new notion of weak reproducibility captures not only encryption schemes which are (fully) reproducible under the criteria given in the previous work, but also a class of efficient schemes which can only be used in the single message setting. In particular, we are able to capture the single message schemes suggested by Kurosawa (PKC 2002), which are more efficient than the direct adaptation of the multiple message schemes studied by Bellare et al. Our study of randomness reuse in key encapsulation mechanisms provides an additional argument for the relevance of these results: by taking advantage of our weak reproducibility notion, we are able to generalise and improve multi-recipient KEM constructions found in the literature. We also propose an efficient multi-recipient KEM provably secure in the standard model and conclude the paper by proposing a notion of direct reproducibility which enables tighter security reductions.
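As a toy illustration of the kind of randomness reuse the framework covers (the single-message, multi-recipient setting associated with Kurosawa's schemes), the sketch below encrypts one message to several ElGamal recipients while reusing a single exponent r. The tiny parameters and plain ElGamal are assumptions chosen for readability, not the schemes analysed in the paper.

    # Toy multi-recipient ElGamal with a shared random exponent r.
    # Parameters are insecure toy values chosen for readability only.
    import secrets

    P = 2_147_483_647          # small prime (2^31 - 1); NOT a secure group
    G = 7                      # fixed base element, for illustration

    def keygen():
        x = secrets.randbelow(P - 2) + 1        # secret key
        return x, pow(G, x, P)                  # (sk, pk)

    def encrypt_to_all(message: int, public_keys):
        """Encrypt one message to many recipients, reusing the same r."""
        r = secrets.randbelow(P - 2) + 1
        c1 = pow(G, r, P)                       # computed once for everyone
        return c1, [(message * pow(pk, r, P)) % P for pk in public_keys]

    def decrypt(c1: int, c2: int, sk: int) -> int:
        # m = c2 * (c1^sk)^(-1) mod P
        return (c2 * pow(c1, P - 1 - sk, P)) % P

    if __name__ == "__main__":
        keys = [keygen() for _ in range(3)]
        c1, c2s = encrypt_to_all(42, [pk for _, pk in keys])
        assert all(decrypt(c1, c2, sk) == 42 for (sk, _), c2 in zip(keys, c2s))
        print("all recipients recovered the message")

The saving is exactly the point of randomness reuse: the expensive exponentiation g^r is computed once rather than once per recipient; whether such reuse is safe is what reproducibility notions characterise.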

A combined test for randomness of spatial distribution of composite microstructures

Scalon, João Domingos
Source/Publisher: Rede Latino-Americana de Materiais
Type: Journal article; Format: text/html
Published: 01/12/2007; Language: Portuguese
Search relevance: 37.27%
A new methodology is presented for characterizing the spatial distribution of second-phase particles in planar sections of multi-phase materials. It is based on statistically summarizing the results of independent tests against the hypothesis of randomness of the particle positions. The methodology was applied to multiple planar sections of an aluminium alloy reinforced with silicon carbide particles and led to a rejection of the hypothesis of randomness even when the tests from single planar sections were ambiguous.
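The abstract does not say which combination statistic is used; as one standard way to "statistically summarize" independent tests of spatial randomness across sections, the hedged sketch below applies Fisher's method to a set of per-section p-values. The p-values themselves are placeholders, and the paper may well use a different combined test.

    # Hedged sketch: combining independent per-section randomness tests
    # with Fisher's method. The p-values below are placeholders, and the
    # cited methodology may use a different combination statistic.
    import math
    from scipy.stats import chi2

    def fisher_combined(p_values):
        """Combined statistic X = -2 * sum(log p_i), chi-square with 2k
        degrees of freedom under the null hypothesis of randomness."""
        k = len(p_values)
        statistic = -2.0 * sum(math.log(p) for p in p_values)
        return statistic, chi2.sf(statistic, df=2 * k)

    if __name__ == "__main__":
        section_p_values = [0.08, 0.11, 0.06, 0.20]   # each alone is ambiguous
        stat, p = fisher_combined(section_p_values)
        print(f"combined statistic = {stat:.2f}, combined p-value = {p:.4f}")

Note how sections that are individually inconclusive (all p > 0.05) can still yield a clearly significant combined result, which is precisely the behaviour the abstract reports.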

Randomness and degrees of irregularity.

Pincus, S; Singer, B H
Type: Text
Published: 05/03/1996; Language: Portuguese
Search relevance: 27.38%
The fundamental question "Are sequential data random?" arises in myriad contexts, often with severe data length constraints. Furthermore, there is frequently a critical need to delineate nonrandom sequences in terms of closeness to randomness, e.g., to evaluate the efficacy of therapy in medicine. We address both these issues from a computable framework via a quantification of regularity, ApEn (approximate entropy), defining maximal randomness for sequences of arbitrary length and indicating applicability to sequences as short as N = 5 points. An infinite sequence formulation of randomness is introduced that retains the operational (and computable) features of the finite case. In the infinite sequence setting, we indicate how the "foundational" definition of independence in probability theory, and the definition of normality in number theory, reduce to limit theorems without rates of convergence, from which we utilize ApEn to address rates of convergence (of a deficit from maximal randomness), refining the aforementioned concepts in a computationally essential manner. Representative applications among many are indicated to assess (i) random number generation output; (ii) well-shuffled arrangements; and (iii) (the quality of) bootstrap replicates.
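A compact reference implementation of ApEn(m, r) as it is usually defined (embedding dimension m, tolerance r, Chebyshev distance between templates); this follows the standard formula rather than any code associated with the paper.

    # Standard ApEn(m, r): the difference of two log-average correlation sums.
    import math

    def apen(series, m=2, r=0.2):
        """Approximate entropy of a 1-D sequence (standard definition)."""
        n = len(series)

        def phi(m):
            templates = [series[i:i + m] for i in range(n - m + 1)]
            count = len(templates)
            total = 0.0
            for t1 in templates:
                matches = sum(
                    1 for t2 in templates
                    if max(abs(a - b) for a, b in zip(t1, t2)) <= r
                )
                total += math.log(matches / count)   # self-match keeps this > 0
            return total / count

        return phi(m) - phi(m + 1)

    if __name__ == "__main__":
        import random
        random.seed(0)
        regular = [0, 1] * 30                          # perfectly alternating
        noisy = [random.random() for _ in range(60)]   # irregular
        print("ApEn(regular) =", round(apen(regular), 3))   # close to 0
        print("ApEn(random)  =", round(apen(noisy), 3))     # noticeably larger

Low ApEn flags regularity; values near the maximum for a given length indicate closeness to randomness, which is the grading the abstract emphasises.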

Fluctuations and randomness of movement of the bead powered by a single kinesin molecule in a force-clamped motility assay: Monte Carlo simulations.

Chen, Yi-der; Yan, Bo; Rubin, Robert J
Type: Text
Published: 11/2002; Language: Portuguese
Search relevance: 27.11%
The motility assay of K. Visscher, M. J. Schnitzer, and S. M. Block (Nature, 400:184-189, 1999), in which the movement of a bead powered by a single kinesin motor can be measured, is a very useful tool for characterizing the force-dependent steps of the mechanochemical cycle of kinesin motors, because in this assay the external force applied to the bead can be controlled (clamped) arbitrarily. However, because the bead is elastically attached to the motor and the response of the clamp is not fast enough to compensate for the Brownian motion of the bead, interpretation or analysis of the data obtained from the assay is not trivial. In a recent paper (Y. Chen and B. Yan, Biophys. Chem. 91:79-91, 2001), we showed how to evaluate the mean velocity of the bead and the motor in the motility assay for a given mechanochemical cycle. In this paper we extend the study to the evaluation of the fluctuation or the randomness of the velocity using a Monte Carlo simulation method. Similar to the mean, we found that the randomness of the velocity of the motor is also influenced by the parameters that affect the dynamic behavior of the bead, such as the viscosity of the medium, the size of the bead, the stiffness of the elastic element connecting the bead and the motor...
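In the motility literature the randomness parameter referred to here is conventionally defined from the growth of the variance of the motor's displacement x(t), with d the step size; this is the standard definition, not an expression quoted from the paper:

    \[ r \;=\; \lim_{t\to\infty} \frac{\langle x^2(t)\rangle - \langle x(t)\rangle^2}{d\,\langle x(t)\rangle}, \]

so that r = 1 for a Poisson stepper and r = 1/N for a cycle of N identical rate-limiting transitions.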

Quantitative Interpretation of the Randomness in Single Enzyme Turnover Times

Yang, Seongeun; Cao, Jianshu; Silbey, Robert J.; Sung, Jaeyoung
Source/Publisher: The Biophysical Society
Type: Text
Published: 03/08/2011; Language: Portuguese
Search relevance: 27.38%
Fluctuating turnover times of a single enzyme become observable with the advent of modern cutting-edge, single enzyme experimental techniques. Although the conventional chemical kinetics and its modern generalizations could provide a good quantitative description for the mean of the enzymatic turnover times, to our knowledge there has not yet been a successful quantitative interpretation for the variance or the randomness of the enzymatic turnover times. In this review, we briefly review several theories in this field, and compare predictions of these theories to the randomness parameter data reported for β-galactosidase enzyme. We find the recently proposed kinetics for renewal reaction processes could provide an excellent quantitative interpretation of the randomness parameter data. From the analysis of the randomness parameter data of the single enzyme reaction, one can extract quantitative information about the mean lifetime of enzyme-substrate complex; the success or the failure probability of the catalytic reaction per each formation of ES complex; and the non-Poisson character of the reaction dynamics of the ES complex (which is beyond reach of the long-standing paradigm of the conventional chemical kinetics).
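For single-enzyme turnover times, the randomness parameter analysed in such work is conventionally the squared coefficient of variation of the waiting time t between successive product releases; the formula below is this standard definition (1/R is often read as a lower bound on the number of rate-limiting steps in the catalytic cycle):

    \[ R \;=\; \frac{\langle t^2\rangle - \langle t\rangle^2}{\langle t\rangle^2}, \qquad R = \frac{1}{N} \ \text{for $N$ sequential exponential steps.} \]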

Quantitative comparison of randomization designs in sequential clinical trials based on treatment balance and allocation randomness

Zhao, Wenle; Weng, Yanqiu; Wu, Qi; Palesch, Yuko
Type: Text
Language: Portuguese
Search relevance: 27.38%
To evaluate the performance of randomization designs under various parameter settings and trial sample sizes, and to identify optimal designs with respect to both treatment imbalance and allocation randomness, we evaluate 260 design scenarios from 14 randomization designs under 15 sample sizes ranging from 10 to 300, using three measures of imbalance and three measures of randomness. The maximum absolute imbalance and the correct guess (CG) probability are selected to assess the trade-off performance of each randomization design. As measured by the maximum absolute imbalance and the CG probability, the performances of the 14 randomization designs are located in a closed region with the upper boundary (worst case) given by Efron's biased coin design (BCD) and the lower boundary (best case) by Soares and Wu's big stick design (BSD). Designs close to the lower boundary provide a smaller imbalance and a higher randomness than designs close to the upper boundary. Our research suggests that optimization of randomization design is possible based on quantified evaluation of imbalance and randomness. Based on the maximum imbalance and CG probability, the BSD, Chen's biased coin design with imbalance tolerance method, and Chen's Ehrenfest urn design perform better than the popularly used permuted block design...
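A small Monte Carlo sketch of the trade-off being measured: it simulates Efron's biased coin design and the big stick design and reports the largest absolute imbalance observed and the correct-guess probability under the convergence guessing strategy. The bias p = 2/3, the imbalance boundary b = 3, the sample size and the trial count are illustrative choices, not the settings of the cited study.

    # Sketch: empirical max absolute imbalance and correct-guess probability
    # for Efron's BCD(2/3) and the big stick design (boundary b = 3).
    import random

    def simulate(design, n, trials=2000):
        max_imb, correct = 0, 0
        for _ in range(trials):
            diff = 0                        # (# on A) - (# on B)
            for _ in range(n):
                # convergence strategy: guess the currently under-allocated arm
                guess = "A" if diff < 0 else "B" if diff > 0 else random.choice("AB")
                arm = design(diff)
                correct += (guess == arm)
                diff += 1 if arm == "A" else -1
                max_imb = max(max_imb, abs(diff))
        return max_imb, correct / (trials * n)

    def efron_bcd(diff, p=2/3):
        if diff == 0:
            return random.choice("AB")
        under = "A" if diff < 0 else "B"
        return under if random.random() < p else ("B" if under == "A" else "A")

    def big_stick(diff, b=3):
        if abs(diff) >= b:
            return "A" if diff < 0 else "B"   # forced assignment at the boundary
        return random.choice("AB")

    if __name__ == "__main__":
        for name, design in [("Efron BCD(2/3)", efron_bcd), ("Big stick (b=3)", big_stick)]:
            imb, cg = simulate(design, n=100)
            print(f"{name}: max |imbalance| = {imb}, correct-guess prob ~ {cg:.3f}")

The big stick design keeps the imbalance bounded by b while staying close to a fair coin (CG probability near 0.5), whereas the biased coin design trades a higher CG probability for tighter balance control, which is the boundary behaviour described in the abstract.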

Pattern randomness aftereffect

Yamada, Yuki; Kawabe, Takahiro; Miyazaki, Makoto
Source/Publisher: Nature Publishing Group
Type: Text
Published: 11/10/2013; Language: Portuguese
Search relevance: 27.61%
Humans can easily discriminate a randomly spaced from a regularly spaced visual pattern. Here, we demonstrate that observers can adapt to pattern randomness. Following adaptation to prolonged exposure to two-dimensional patterns with varying levels of physical randomness, observers judged the randomness of the pattern. Perceived randomness decreased (increased) following adaptation to high (low) physical randomness (Experiment 1). Adaptation to 22.5°-rotated adaptor stimuli did not cause a randomness aftereffect (Experiment 2), suggesting that positional variation is unlikely to be responsible for the pattern randomness perception. Moreover, the aftereffect was not selective to contrast polarity (Experiment 3) and was not affected by spatial jitter (Experiment 4). Last, the aftereffect was not affected by adaptor configuration (Experiment 5). Our data were consistent with a model assuming filter-rectify-filter processing for orientation inputs. Thus, we infer that neural processing for orientation grouping/segregation underlies the perception of pattern randomness.

Higher-order dangers and precisely constructed taxa in models of randomness

Pincus, Steve; Singer, Burton H.
Source/Publisher: National Academy of Sciences
Type: Text
Language: Portuguese
Search relevance: 27.11%
The validation and construction of individual, putatively “random” infinite sequences have been longstanding problems within mathematics. We address this topic via the study of binary normal numbers, which often have been viewed as models for randomness. We show that normality exhibits a rich, multifactorial taxonomy and is hardly a single monochromatic category. Furthermore, we present a toolkit of algorithmic techniques to explicitly construct normal sequences to achieve diverse yet precisely controlled specifications, many of which (e.g., bias) display unexpected and somewhat pathologic subordinate dynamics. Moreover, we construct a normal number that exhibits pairwise bias toward repeated values and, accordingly, deduce that the evaluation of higher-order block behavior, even beyond equidistribution, is imperative in proper evaluations of “randomness.”
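The notion under study is classical normality of a binary sequence: every block of every length occurs with its fair limiting frequency. In the textbook formulation (not a formula taken from the article), a sequence x_1 x_2 x_3 ... is normal in base 2 when

    \[ \lim_{n\to\infty} \frac{\#\{\, i \le n : x_i x_{i+1}\cdots x_{i+k-1} = w \,\}}{n} \;=\; 2^{-k} \quad \text{for every } k \ge 1 \text{ and every block } w \in \{0,1\}^k. \]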

The meaning of spikes from the neuron’s point of view: predictive homeostasis generates the appearance of randomness

Fiorillo, Christopher D.; Kim, Jaekyung K.; Hong, Su Z.
Source/Publisher: Frontiers Media S.A.
Type: Text
Published: 29/04/2014; Language: Portuguese
Search relevance: 27.11%
The conventional interpretation of spikes is from the perspective of an external observer with knowledge of a neuron’s inputs and outputs who is ignorant of the contents of the “black box” that is the neuron. Here we consider a neuron to be an observer and we interpret spikes from the neuron’s perspective. We propose both a descriptive hypothesis based on physics and logic, and a prescriptive hypothesis based on biological optimality. Our descriptive hypothesis is that a neuron’s membrane excitability is “known” and the amplitude of a future excitatory postsynaptic conductance (EPSG) is “unknown”. Therefore excitability is an expectation of EPSG amplitude and a spike is generated only when EPSG amplitude exceeds its expectation (“prediction error”). Our prescriptive hypothesis is that a diversity of synaptic inputs and voltage-regulated ion channels implement “predictive homeostasis”, working to insure that the expectation is accurate. The homeostatic ideal and optimal expectation would be achieved when an EPSP reaches precisely to spike threshold, so that spike output is exquisitely sensitive to small variations in EPSG input. To an external observer who knows neither EPSG amplitude nor membrane excitability...

Temporal Changes in Randomness of Bird Communities across Central Europe

Renner, Swen C.; Gossner, Martin M.; Kahl, Tiemo; Kalko, Elisabeth K. V.; Weisser, Wolfgang W.; Fischer, Markus; Allan, Eric
Source/Publisher: Public Library of Science
Type: Text
Published: 11/11/2014; Language: Portuguese
Search relevance: 27.57%
Many studies have examined whether communities are structured by random or deterministic processes, and both are likely to play a role, but relatively few studies have attempted to quantify the degree of randomness in species composition. We quantified, for the first time, the degree of randomness in forest bird communities based on an analysis of spatial autocorrelation in three regions of Germany. The compositional dissimilarity between pairs of forest patches was regressed against the distance between them. We then calculated the y-intercept of the curve, i.e. the ‘nugget’, which represents the compositional dissimilarity at zero spatial distance. We therefore assume, following similar work on plant communities, that this represents the degree of randomness in species composition. We then analysed how the degree of randomness in community composition varied over time and with forest management intensity, which we expected to reduce the importance of random processes by increasing the strength of environmental drivers. We found that a large proportion of the bird community composition could be explained by chance (overall mean of 0.63), implying that most of the variation in local bird community composition is driven by stochastic processes. Forest management intensity did not consistently affect the mean degree of randomness in community composition...
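A bare-bones sketch of the quantity being estimated: pairwise compositional dissimilarity regressed on pairwise distance, with the intercept (the 'nugget') read off as the dissimilarity at zero distance. Bray-Curtis dissimilarity, a straight-line fit, and the toy data are assumptions for illustration; the study fits distance-decay relationships to real survey data.

    # Sketch: estimate the 'nugget' (expected dissimilarity at zero distance)
    # from pairwise site comparisons. Toy data and a simple linear fit only.
    import numpy as np

    def bray_curtis(a, b):
        a, b = np.asarray(a, float), np.asarray(b, float)
        return np.abs(a - b).sum() / (a + b).sum()

    def nugget(abundances, coords):
        """Regress pairwise Bray-Curtis dissimilarity on pairwise distance
        and return the intercept of the straight-line fit."""
        dis, dist = [], []
        n = len(abundances)
        for i in range(n):
            for j in range(i + 1, n):
                dis.append(bray_curtis(abundances[i], abundances[j]))
                dist.append(np.linalg.norm(coords[i] - coords[j]))
        slope, intercept = np.polyfit(dist, dis, 1)
        return intercept

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        coords = rng.uniform(0, 10, size=(20, 2))     # 20 forest patches
        abundances = rng.poisson(5, size=(20, 30))    # 30 bird species
        print("estimated nugget:", round(nugget(abundances, coords), 3))

A nugget close to 1 would mean nearby patches share little of their composition, i.e. a large stochastic component, which is how the abstract interprets its overall mean of 0.63.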

Extracting Randomness from Samplable Distributions

Trevisan, Luca; Vadhan, Salil P.
Source/Publisher: IEEE Computer Society Press
Type: Conference paper
Language: Portuguese
Search relevance: 27.38%
The standard notion of a randomness extractor is a procedure which converts any weak source of randomness into an almost uniform distribution. The conversion necessarily uses a small amount of pure randomness, which can be eliminated by complete enumeration in some, but not all, applications. Here, we consider the problem of deterministically converting a weak source of randomness into an almost uniform distribution. Previously, deterministic extraction procedures were known only for sources satisfying strong independence requirements. In this paper, we look at sources which are samplable, i.e., can be generated by an efficient sampling algorithm. We seek an efficient deterministic procedure that, given a sample from any samplable distribution of sufficiently large min-entropy, gives an almost uniformly distributed output. We explore the conditions under which such deterministic extractors exist. We observe that no deterministic extractor exists if the sampler is allowed to use more computational resources than the extractor. On the other hand, if the extractor is allowed (polynomially) more resources than the sampler, we show that deterministic extraction becomes possible. This is true unconditionally in the nonuniform setting (i.e....
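For orientation, the object being sought can be written as a deterministic extractor for a class C of samplable sources; the parameters below are generic placeholders rather than the paper's exact statement:

    \[ \mathrm{Ext} : \{0,1\}^n \to \{0,1\}^m \ \text{is a deterministic } (k,\varepsilon)\text{-extractor for } \mathcal{C} \iff \Delta\big(\mathrm{Ext}(X),\, U_m\big) \le \varepsilon \ \text{for every } X \in \mathcal{C} \text{ with } H_\infty(X) \ge k, \]

where \Delta denotes statistical distance and U_m the uniform distribution on \{0,1\}^m.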

Mistaking Randomness for Free Will

Ebert, Jeffrey Paul; Wegner, Daniel M.
Source/Publisher: Elsevier
Type: Journal article
Language: Portuguese
Search relevance: 37.38%
Belief in free will is widespread. The present research considered one reason why people may believe that actions are freely chosen rather than determined: they attribute randomness in behavior to free will. Experiment 1 found that participants who were prompted to perform a random sequence of actions experienced their behavior as more freely chosen than those who were prompted to perform a deterministic sequence. Likewise, Experiment 2 found that, all else equal, the behavior of animated agents was perceived to be more freely chosen if it consisted of a random sequence of actions than if it consisted of a deterministic sequence; this was true even when the degree of randomness in agents’ behavior was largely a product of their environments. Together, these findings suggest that randomness in behavior—one’s own or another’s—can be mistaken for free will.; Psychology

Differential Privacy with Imperfect Randomness

Dodis, Yevgeniy; López-Alt, Adriana; Mironov, Ilya; Vadhan, Salil P.
Source/Publisher: Springer Verlag
Type: Conference paper
Language: Portuguese
Search relevance: 27.57%
In this work we revisit the question of basing cryptography on imperfect randomness. Bosley and Dodis (TCC’07) showed that if a source of randomness R is “good enough” to generate a secret key capable of encrypting k bits, then one can deterministically extract nearly k almost uniform bits from R, suggesting that traditional privacy notions (namely, indistinguishability of encryption) require an “extractable” source of randomness. Other, even stronger impossibility results are known for achieving privacy under specific “non-extractable” sources of randomness, such as the γ-Santha-Vazirani (SV) source, where each next bit has fresh entropy but is allowed to have a small bias γ < 1 (possibly depending on prior bits). We ask whether similar negative results also hold for a more recent notion of privacy called differential privacy (Dwork et al., TCC’06), concentrating, in particular, on achieving differential privacy with the Santha-Vazirani source. We show that the answer is no. Specifically, we give a differentially private mechanism for approximating arbitrary “low sensitivity” functions that works even with randomness coming from a γ-Santha-Vazirani source, for any γ < 1. This provides a somewhat surprising “separation” between traditional privacy and differential privacy with respect to imperfect randomness. Interestingly...
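The two definitions being contrasted can be recalled as follows; these are the standard statements, with symbols chosen for readability rather than copied from the paper:

    \[ \gamma\text{-SV source:}\quad \frac{1-\gamma}{2} \;\le\; \Pr[X_i = 1 \mid X_1,\dots,X_{i-1}] \;\le\; \frac{1+\gamma}{2} \quad \text{for every } i, \]
    \[ \varepsilon\text{-differential privacy:}\quad \Pr[M(D)\in S] \;\le\; e^{\varepsilon}\,\Pr[M(D')\in S] \quad \text{for all neighbouring databases } D, D' \text{ and all events } S. \]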

Randomness Condensers for Efficiently Samplable, Seed-Dependent Sources

Dodis, Yevgeniy; Ristenpart, Thomas; Vadhan, Salil P.
Source/Publisher: Springer Verlag
Type: Conference paper
Language: Portuguese
Search relevance: 27.46%
We initiate a study of randomness condensers for sources that are efficiently samplable but may depend on the seed of the condenser. That is, we seek functions Cond : {0,1}^n × {0,1}^d → {0,1}^m such that if we choose a random seed S ∈ {0,1}^d, and a source X = A(S) is generated by a randomized circuit A of size t such that X has min-entropy at least k given S, then Cond(X; S) should have min-entropy at least some k′ given S. The distinction from the standard notion of randomness condensers is that the source X may be correlated with the seed S (but is restricted to be efficiently samplable). Randomness extractors of this type (corresponding to the special case where k′ = m) have been implicitly studied in the past (by Trevisan and Vadhan, FOCS ‘00). We show that: – Unlike extractors, we can have randomness condensers for samplable, seed-dependent sources whose computational complexity is smaller than the size t of the adversarial sampling algorithm A. Indeed, we show that sufficiently strong collision-resistant hash functions are seed-dependent condensers that produce outputs with min-entropy k′ = m − O(log t), i.e. logarithmic entropy deficiency. – Randomness condensers suffice for key derivation in many cryptographic applications: when an adversary has negligible success probability (or negligible “squared advantage” [3]) for a uniformly random key...

Unbalanced expanders and randomness extractors from Parvaresh-Vardy codes

Guruswami, Venkatesan; Umans, Christopher; Vadhan, Salil P.
Source/Publisher: Association for Computing Machinery (ACM)
Type: Journal article
Language: Portuguese
Search relevance: 37.46%
We give an improved explicit construction of highly unbalanced bipartite expander graphs with expansion arbitrarily close to the degree (which is polylogarithmic in the number of vertices). Both the degree and the number of right-hand vertices are polynomially close to optimal, whereas the previous constructions of Ta-Shma et al. [2007] required at least one of these to be quasipolynomial in the optimal. Our expanders have a short and self-contained description and analysis, based on the ideas underlying the recent list-decodable error-correcting codes of Parvaresh and Vardy [2005]. Our expanders can be interpreted as near-optimal “randomness condensers,” that reduce the task of extracting randomness from sources of arbitrary min-entropy rate to extracting randomness from sources of min-entropy rate arbitrarily close to 1, which is a much easier task. Using this connection, we obtain a new, self-contained construction of randomness extractors that is optimal up to constant factors, while being much simpler than the previous construction of Lu et al. [2003] and improving upon it when the error parameter is small (e.g., 1/poly(n)).; Engineering and Applied Sciences

Distributed computing with imperfect randomness

Vaikuntanathan, Vinod
Source/Publisher: Massachusetts Institute of Technology
Type: Thesis; Format: 43 p.; application/pdf
Language: Portuguese
Search relevance: 27.61%
Randomness is a critical resource in many computational scenarios, enabling solutions where deterministic ones are elusive or even provably impossible. However, the randomized solutions to these tasks assume access to a pure source of unbiased, independent coins. Physical sources of randomness, on the other hand, are rarely unbiased and independent although they do seem to exhibit somewhat imperfect randomness. This gap in modeling questions the relevance of current randomized solutions to computational tasks. Indeed, there has been substantial investigation of this issue in complexity theory in the context of the applications to efficient algorithms and cryptography. This work seeks to determine whether imperfect randomness, modeled appropriately, is "good enough" for distributed algorithms. Namely, can we do with imperfect randomness all that we can do with perfect randomness, and with comparable efficiency? We answer this question in the affirmative, for the problem of Byzantine agreement. We construct protocols for Byzantine agreement in a variety of scenarios (synchronous or asynchronous networks, with or without private channels), in which the players have imperfect randomness. Our solutions are essentially as efficient as the best known randomized Byzantine agreement protocols...