Research Publications
FEATURED
Enhancements to Language Modeling Techniques for Adaptable Log Message Classification
Fundamental length scale and the bending of light in a gravitational field
Structural Network Metrics and Incident Generation
July, 2022
ABSTRACT
Minimizing the resolution time of service-impacting incidents is a fundamental objective of Information Technology (IT) operations. Efficient root cause analysis, adaptable to diverse service environments, is key to meeting this objective. One method that provides additional insight into an incident, and hence allows enhanced root cause analysis, is categorization of the events and log messages that characterize an incident into pre-defined operational groups. Well-established natural language processing techniques that utilize pre-trained language models and word embeddings can be leveraged for this task. The adaptability of pre-trained models to classify log messages containing large quantities of domain-specific language remains unknown. The current contribution investigates multiple ways of addressing this deficiency. We demonstrate increased granularity of word embeddings by using character decompositions and sub-word level representations, and also explore the augmentation of word embeddings using features derived from convolutional operations. After observing that the performance of high-specificity models decreases as the number of previously unseen words increases, we explore the circumstances in which we can use a model trained with a low-specificity corpus to correctly classify log messages. Through the application of fine-tuning techniques, we can adapt our pre-trained classifier to classify log messages from service environments not encountered during pre-training in a time- and memory-efficient manner. We conclude that we can effectively adapt pre-trained classifiers for impromptu service environments.
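The fine-tuning idea can be made concrete with a short sketch. This is an illustration only, not the authors' implementation: it assumes scikit-learn, uses character n-gram features as a stand-in for sub-word representations, and invents a handful of hypothetical log messages and operational groups.

# Illustrative sketch: incrementally adapt a pre-trained log-message classifier
# to a new service environment. Messages, labels, and groups are hypothetical.
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

# Character n-grams give sub-word granularity, so unseen tokens still map onto familiar features.
vectorizer = HashingVectorizer(analyzer="char_wb", ngram_range=(3, 5), n_features=2**18)

classes = ["network", "storage", "application"]
pretrain_msgs = ["eth0 link down on switch-12", "disk /dev/sda1 90% full", "app server returned 500"]
pretrain_labels = ["network", "storage", "application"]

clf = SGDClassifier(loss="log_loss", random_state=0)
clf.partial_fit(vectorizer.transform(pretrain_msgs), pretrain_labels, classes=classes)

# "Fine-tuning": a few labelled messages from a previously unseen environment update
# the same model incrementally, without retraining from scratch.
clf.partial_fit(vectorizer.transform(["bgp session to peer 10.0.0.1 flapping"]), ["network"])

print(clf.predict(vectorizer.transform(["ospf adjacency lost on core-router"])))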
June, 2022
ABSTRACT
The canonical approach to quantizing gravity is understood to suffer from pathological non-renormalizability. Nevertheless, in the context of effective field theory, a viable perturbative approach to calculating elementary processes is possible. Some non-perturbative approaches, most notably loop quantum gravity and combinatorial quantum gravity, imply the existence of a minimal length. To circumvent the seeming contradiction between the existence of a minimum length and the principle of special relativity, Doubly Special Relativity introduces modified dispersion relationships that reconcile the conflict. In this work, we combine these dispersion relationships with an effective field theory approach to compute the first post-Newtonian correction to the bending of light by a massive object. The calculation offers the prospect of a directly measurable effect that rests upon both the existence of a quantized gravitational field and a minimal length. Experimental verification would provide evidence of the existence of a quantum theory of gravity, and the fundamental quantization of spacetime with a bound on the minimal distance.
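For orientation, a typical leading-order modified dispersion relation of the kind used in Doubly Special Relativity can be written (schematically, in natural units; the precise form and coefficients used in the paper may differ) as

E^{2} = p^{2} + m^{2} + \alpha\,\ell\,E\,p^{2} + \mathcal{O}(\ell^{2}),

where \ell is the fundamental length scale and \alpha a dimensionless coefficient of order one. For photons (m = 0) such a relation makes the propagation speed weakly energy dependent, which is the kind of effect that can feed into a corrected light-bending angle.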
June, 2022
ABSTRACT
November, 2021
ABSTRACT
The structure of complex networks has long been understood to play a role in transmission and spreading phenomena on a graph. This behavior is difficult to model analytically and is most often modeled numerically. Such networks form an important part of the structure of society, including transportation networks. As society fights to control the COVID-19 pandemic, an important question is to choose the optimum balance between the full opening of transport networks and the control of epidemic spread. In this paper we investigate how recent advances in analyzing network structure using information theory could inform decisions regarding the opening of such networks. By virtue of the richness of data available we focus upon the worldwide airline network, but these methods are in principle applicable to any transport network. We are able to demonstrate that it is possible to substantially open the airline network and have some degree of control on the spread of the virus.
ABSTRACT
Ising models of emergent geometry are well known to possess ground states with many of the desired features of a low-dimensional, Ricci-flat vacuum. Further, excitations of these ground states can be shown to replicate the quantum dynamics of a free particle in the continuum limit. It would be a significant next step in the development of emergent Ising models to link them to an underlying physical theory that has General Relativity as its continuum limit. In this work we investigate how the canonical formulation of General Relativity can be used to construct such a discrete Hamiltonian using recent results in discrete differential geometry. We are able to demonstrate that the Ising models of emergent geometry are closely related to the model we propose, which we term the Canonical Ising Model, and may be interpreted as an approximation of discretized canonical general relativity.
May, 2021
ABSTRACT
Recent advances in emergent geometry have identified a new class of models that represent spacetime as the graph obtained as the ground state of interacting Ising spins. These models have many desirable features, including stable excitations possessing many of the characteristics of a quantum particle. We analyze the dynamics of such excitations, including a detailed treatment of the edge states not previously addressed. Using a minimal prescription for the interaction of defects we numerically investigate approximate bounds to the speed of propagation of such a 'particle'. We discover, using numerical simulations, that there may be a Lieb-Robinson bound to propagation that could point the way to how a causal structure could be accommodated in this class of emergent geometry models.
May, 2021
ABSTRACT
Minimizing the resolution time of service-impacting incidents is a fundamental objective of IT operations. Enriching the meta-data of the events and logs ingested by such systems using AI-based classifiers greatly increases the efficacy of features such as root cause analysis and workflow automation, and hence reduces incident remediation time.
The use of word embeddings in text classification tasks is well-established, however, the general English corpora used to generate off-the-shelf embeddings lack the domain-specific lexicon required for accurate classification of event and log data. In the current contribution, we investigate multiple ways in which this deficiency can be addressed. In addition to augmenting the training-corpus with a domain-specific lexicon, we increase the granularity of our embedding using character ngram decompositions and sub-word level representations.
All implementations improved classification accuracy over the base case. Further, we explore the performance of a sequence classifier with embeddings of varying domain specificity. We observe that the performance of high-specificity models reduces as the volume of previously unseen words in the test data increases.
We conclude that for a multi-input use case, and by leveraging sub-word level information, a high-specificity model can be outperformed by a model trained on a low-specificity corpus.
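A minimal sketch of the sub-word idea, assuming the gensim library and a toy corpus of tokenised log messages (not the corpora used in this work):

# Illustrative only: character n-gram (sub-word) embeddings let a model produce
# vectors for domain-specific tokens it has never seen in full.
from gensim.models import FastText

corpus = [
    ["disk", "sda1", "failure", "on", "host42"],
    ["link", "eth0", "down", "on", "switch12"],
    ["cpu", "usage", "high", "on", "host42"],
]

model = FastText(vector_size=32, window=3, min_count=1, min_n=3, max_n=5)
model.build_vocab(corpus)
model.train(corpus, total_examples=len(corpus), epochs=20)

# "host77" never appears in the corpus, but its character n-grams overlap with
# "host42", so FastText still returns a usable vector for it.
print(model.wv["host77"][:5])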
May, 2021
ABSTRACT
Recent advances in emergent geometry and discretized approaches to quantum gravity have relied upon the notion of a discrete measure of graph curvature. We focus on the two main measures that have been studied, the so-called Ollivier-Ricci and Forman-Ricci curvatures. These two approaches have a very different origin, and both have advantages and disadvantages.
In this work we study the relationship between the two measures for a class of graphs that are important in quantum gravity applications. We discover that under a specific set of circumstances they are equivalent, opening up the possibility of replacing the more fundamental Ollivier-Ricci curvature by the computationally more accessible Forman-Ricci curvature in certain applications to models of emergent spacetime and quantum gravity.
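For an unweighted graph with no higher-dimensional cells, the Forman-Ricci curvature of an edge reduces to the simple combinatorial expression 4 - deg(u) - deg(v). A small sketch using networkx (a generic random graph, not one of the graph ensembles studied here):

# Illustrative sketch: combinatorial Forman-Ricci curvature of each edge of an
# unweighted graph, F(u, v) = 4 - deg(u) - deg(v), ignoring face contributions.
import networkx as nx

def forman_curvature(g: nx.Graph, u, v) -> int:
    return 4 - g.degree(u) - g.degree(v)

g = nx.random_regular_graph(d=4, n=20, seed=1)
curvatures = {(u, v): forman_curvature(g, u, v) for u, v in g.edges()}
print(sum(curvatures.values()) / g.number_of_edges())  # average edge curvature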
September, 2020
ABSTRACT
On May 28th and 29th, a two-day workshop was held virtually, facilitated by the Beyond Center at ASU and Moogsoft Inc. The aim was to bring together leading scientists with an interest in Network Science and Epidemiology to attempt to inform public policy in response to the COVID-19 pandemic. Epidemics are at their core a process that progresses dynamically upon a network, and are a key area of study in Network Science. In the course of the workshop a wide survey of the state of the subject was conducted. We summarize in this paper a series of perspectives of the subject, and where the authors believe fruitful areas for future research are to be found.
Participants included James Bell, Ginestra Bianconi, David Butler, Jon Crowcroft, Paul C. W. Davies, Chris Hicks, Hyunju Kim, Istvan Z. Kiss, Francesco Di Lauro, Carsten Maple, Ayan Paul, Mikhail Prokopenko, Philip Tee, Sara I. Walker.
August, 2020
ABSTRACT
The idea of a graph-theoretical approach to modeling the emergence of a quantized geometry, and consequently spacetime, has been proposed previously, but not well studied. In most approaches the focus has been upon how to generate a spacetime that possesses properties that would be desirable at the continuum limit, and the question of how to model matter and its dynamics has not been directly addressed.
Recent advances in network science have yielded new approaches to the mechanism by which spacetime can emerge as the ground state of a simple Hamiltonian, based upon a multi-dimensional Ising model with one dimensionless coupling constant. Extensions to this model have been proposed that improve the ground state geometry, but they require additional coupling constants.
In this paper we conduct an extensive exploration of the graph properties of the ground states of these models, and a simplification requiring only one coupling constant. We demonstrate that the simplification is effective at producing an acceptable ground state. Moreover we propose a scheme for the inclusion of matter and dynamics as excitations above the ground state of the simplified Hamiltonian. Intriguingly, enforcing locality has the consequence of reproducing the free non-relativistic dynamics of a quantum particle.
April, 2020
ABSTRACT
Word embeddings provide efficient vector representations of words that capture the syntactic and semantic relationships in a corpus. Within an IT infrastructure, event data can be interpreted as sequences of tokens and can be represented in a continuous vector space. Similar tokens cluster together in the vector space, which can provide insight into patterns of failures and enable detection of actionable incidents.
Fault localization techniques need to be adaptable without the requirement of building knowledge bases from scratch to account for new services or hardware deployed on existing infrastructures, or semantically equivalent incidents described by different lexicons across different infrastructures. Using the paradigm of transfer learning, word embeddings can be built and incrementally updated to introduce new vocabulary and alter the relationships of existing tokens, whilst persisting the general contextual information of the initial embedding.
Features in event data procured from IT infrastructures are typically sparsely distributed, with many events being duplicated with minor character mutations. We use unsupervised clustering techniques to analyse the vector representation of the event data. Our analysis shows that clustering vector representations of event data based on semantic similarity produces interpretable categories, which can be used to improve fault localization and identification of root cause.
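As a hedged sketch of the clustering step (assuming gensim and scikit-learn, with a toy event corpus rather than real infrastructure data):

# Illustrative only: embed event tokens, average them per event, then cluster the
# event vectors so that semantically similar events fall into the same category.
import numpy as np
from gensim.models import Word2Vec
from sklearn.cluster import KMeans

events = [
    ["disk", "full", "on", "db01"],
    ["disk", "nearly", "full", "on", "db02"],
    ["interface", "eth0", "down"],
    ["interface", "eth1", "down"],
]

w2v = Word2Vec(events, vector_size=16, window=3, min_count=1, epochs=50, seed=7)
event_vectors = np.array([np.mean([w2v.wv[t] for t in ev], axis=0) for ev in events])

labels = KMeans(n_clusters=2, n_init=10, random_state=7).fit_predict(event_vectors)
print(labels)  # disk events and interface events should land in separate clusters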
September, 2019
ABSTRACT
To respond rapidly and accurately to network and service outages, network operators must deal with a large number of events resulting from the interaction of various services operating on complex, heterogeneous and evolving networks. In this paper, we introduce the concept of functional connectivity as an alternative approach to monitoring those events. Commonly used in the study of brain dynamics, functional connectivity is defined in terms of the presence of statistical dependencies between nodes. Although a number of techniques exist to infer functional connectivity in brain networks, their straightforward application to commercial network deployments is severely challenged by: (a) non-stationarity of the functional connectivity, (b) sparsity of the time-series of events, and (c) absence of an explicit model describing how events propagate through the network or indeed whether they propagate. Thus, in this paper, we present a novel inference approach whereby two nodes are defined as forming a functional edge if they emit substantially more coincident or short-lagged events than would be expected if they were statistically independent. The output of the method is an undirected weighted graph, where the weight of an edge between two nodes denotes the strength of the statistical dependence between them. We develop a model of time-varying functional connectivity whose parameters are determined by maximising the model's predictive power from one time window to the next. We assess the accuracy, efficiency and scalability of our method on two real datasets of network events spanning multiple months and on synthetic data for which ground truth is available. We compare our method against both a general-purpose time-varying network inference method and a network-management-specific causal inference technique, and discuss its merits in terms of sensitivity, accuracy and, importantly, scalability.
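The core of the inference idea, scoring a candidate functional edge by how many coincident or short-lagged events two nodes emit relative to an independence baseline, can be sketched as follows. This is a deliberately simplified illustration with synthetic event times, not the estimator defined in the paper:

# Simplified illustration: compare observed near-coincident events between two nodes
# with the number expected if their event streams were statistically independent.
import numpy as np

rng = np.random.default_rng(0)
T, lag = 10_000, 5  # observation window (time bins) and maximum lag considered

events_a = np.sort(rng.choice(T, size=200, replace=False))
events_b = np.sort(np.clip(events_a[:120] + rng.integers(0, lag, 120), 0, T - 1))  # partly driven by A

def coincidences(a, b, max_lag):
    return sum(bool(np.any(np.abs(b - t) <= max_lag)) for t in a)

observed = coincidences(events_a, events_b, lag)
# Rough independence baseline: each event of A finds a nearby event of B with
# probability about (2 * lag + 1) * len(events_b) / T.
expected = len(events_a) * (2 * lag + 1) * len(events_b) / T
print(observed, round(expected, 1))  # observed >> expected suggests a functional edge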
July, 2019
May, 2019
April, 2019
March, 2019
April, 2019
April, 2019
ABSTRACT
In this paper we present a novel approach for inferring functional connectivity within a large-scale network from time series of emitted node events. We do so under the following constraints: (a) non-stationarity of the underlying connectivity, (b) sparsity of the time-series of events, and (c) absence of an explicit model describing how events propagate through the network. We develop an inference method whose output is an undirected weighted network, where the weight of an edge between two nodes denotes the probability of these nodes being functionally connected. Two nodes are assumed to be functionally connected if they show significantly more coincident or short-lagged events than randomly picked pairs of nodes with similar levels of activity. We develop a model of time-varying connectivity whose parameters are determined by maximising the model’s predictive power from one time window to the next. We assess the accuracy, efficiency and scalability of our method on a real dataset of network events spanning multiple months.
ABSTRACT
A consistent quantum theory of gravity has remained elusive ever since the emergence of General Relativity and Quantum Field Theory. Attempts to date have not yielded a candidate that is either free from problematic theoretical inconsistencies, falsifiable by experiment, or both. At the heart of all approaches, though, the difficult question of what it means for spacetime itself to be quantized, and how that can affect physics, has not been addressed. In recent years a number of proposals have been made to address the quantum structure of spacetime, and in particular how geometry and locality can emerge as the Universe cools.
Quantum Graphity is perhaps the best known of these, but still does not connect the emerged quantized spacetime to dynamics or gravity. In this paper we start from a quantized mesh as the pre-geometry of spacetime and identify that, informationally and in a very natural sense, the laws of gravity and Newtonian dynamics emerge. The resultant equations of gravity have a Yukawa term that operates at cosmic scale (10^18 meters), and we use data from the Spitzer space telescope to investigate experimental agreement of the galactic rotation curves with encouraging results. We conclude by discussing how this pre-geometry could result in the classical covariant constructs of General Relativity in the low energy continuum limit.
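The Yukawa term referred to above corresponds, in the usual parametrisation of Yukawa-modified gravity (shown here schematically; the coefficients used in the paper may differ), to a potential of the form

\Phi(r) = -\frac{GM}{r}\left(1 + \alpha\, e^{-r/\lambda}\right),

where \lambda is the characteristic range of the correction (of order 10^18 meters here) and \alpha its dimensionless strength; it is a fit of parameters of this kind against observed rotation curves that the Spitzer comparison tests.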
April, 2019
ABSTRACT
In recent years, the use of data-driven algorithms has gained significant traction as a method of localizing faults in commercial networks through the analysis of network events. While the number of different failure scenarios in any IT infrastructure is potentially unbounded, the number of fundamental fault propagation mechanisms and related failure modes is much smaller. A significant component of many failure modes is a statistically predictable time pattern of event production.
In this paper we describe a novel approach for identifying faults that produce such a sequence of events, based only upon their temporal arrival pattern. The approach uses state-of-the-art optimization techniques to establish a temporal similarity graph for a given group of alerts. Community detection algorithms are then applied to the event similarity graph in order to determine groups of faults with similar arrival patterns.
We demonstrate the efficacy of the new approach by applying it to simulated event streams in realistic failure scenarios, reproducing results that are consistent with experience of deploying the technique in real world networks.
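A hedged sketch of the grouping step, using a generic similarity measure (cosine similarity of per-alert arrival-time histograms) and off-the-shelf community detection rather than the optimization-based similarity construction described here:

# Illustrative sketch: build a similarity graph over alerts from their arrival-time
# histograms, then find groups of alerts with similar temporal patterns.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(1)
# Hypothetical data: 6 alerts x 24 hourly bins of event counts; alerts 0-2 share one
# burst pattern, alerts 3-5 another.
hist = rng.poisson(1.0, size=(6, 24)).astype(float)
hist[:3, 8:11] += 20
hist[3:, 18:21] += 20

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

g = nx.Graph()
for i in range(6):
    for j in range(i + 1, 6):
        s = cosine(hist[i], hist[j])
        if s > 0.6:  # arbitrary threshold for this sketch
            g.add_edge(i, j, weight=s)

print([sorted(c) for c in greedy_modularity_communities(g, weight="weight")])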
July, 2019
ABSTRACT
Recent work on the study of cell populations in mouse tumors has revealed much about the clonal evolution of cancers from the initiated cell to metastasis. Although most cancers are clonal in origin, genetic instability leads to the emergence of new cell clones, some of which show cooperative behavior during progression to metastasis. The nature of these cell-cell interactions is unclear, and in particular it is possible that their spatial distribution could influence the emergence of fully malignant behavior. Such a spatial distribution would indicate a subtle dependence between the distribution of these cells and the emergence of malignancy.
In this paper we model tumor evolution using dynamically evolving spatially embedded random graphs. The dynamic evolution of spatio-temporal graphs is not widely studied analytically, particularly when distance-based preferences are included. We present analysis and simulations of such graphs, and demonstrate that the distance function and the mixing of the nodes can combine to create phase transitions in connectivity. This result supports the hypothesis that cell-to-cell interaction is a critical feature of malignancy in tumors.
December, 2018
ABSTRACT
In this paper, we present a detailed framework to analyze the evolution of the random topology of a time-varying wireless network via the information theoretic notion of entropy rate. We consider a propagation channel varying over time with random node positions in a closed space and Rayleigh fading affecting the connections between nodes. The existence of an edge between two nodes at given locations is modeled by a Markov chain, enabling memory effects in network dynamics. We then derive a lower and an upper bound on the entropy rate of the spatiotemporal network. The entropy rate measures the shortest per-step description of the stationary stochastic process defining the state of the wireless system and depends both on the maximum Doppler shift and the path loss exponent. It characterizes the topological uncertainty of the wireless network and quantifies how quickly the underlying topology is varying with time.
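As a point of orientation on the entropy-rate idea: for a single link modelled as a two-state (off/on) Markov chain with transition probabilities p = P(off -> on) and q = P(on -> off), the entropy rate is the textbook expression

H = \frac{q}{p+q}\,H_b(p) + \frac{p}{p+q}\,H_b(q), \qquad H_b(x) = -x\log_2 x - (1-x)\log_2(1-x).

The network-level bounds discussed above build on per-link terms of this kind, with the transition probabilities tied to the maximum Doppler shift and the path loss exponent.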
November, 2018
ABSTRACT
The performance of mobile ad hoc networks in general and that of the routing algorithm, in particular, can be heavily affected by the intrinsic dynamic nature of the underlying topology. In this paper, we build a new analytical/numerical framework that characterizes nodes’ mobility and the evolution of links between them. This formulation is based on a stationary Markov chain representation of link connectivity. The existence of a link between two nodes depends on their distance, which is governed by the mobility model. In our analysis, nodes move randomly according to an Ornstein-Uhlenbeck process using one tuning parameter to obtain different levels of randomness in the mobility pattern. Finally, we propose an entropy-rate-based metric that quantifies link uncertainty and evaluates its stability. Numerical results show that the proposed approach can accurately reflect the random mobility in the network and fully captures the link dynamics. It may thus be considered a valuable performance metric for the evaluation of the link stability and connectivity in these networks.
September, 2018
ABSTRACT
Many complex systems show spatiotemporal characteristics in the real world. These networks are composed of a large number of nodes embedded in space and a set of edges linking nodes together, dynamically evolving over time. In this work, we present a detailed framework to analyze the evolution of the random topology of a time-varying wireless network via the information theoretic notion of entropy rate. We derive a lower and an upper bound on the entropy rate of the spatiotemporal network. The entropy rate measures the shortest per-step description of the stationary stochastic process defining the state of the wireless system and depends both on the maximum Doppler shift and the path loss exponent. It characterizes the topological uncertainty of the wireless network and quantifies how quickly the underlying topology is varying with time.
September, 2018
December, 2018
ABSTRACT
Random Geometric or Spatial Graphs are well-studied models of networks where spatial embedding is an important consideration. However, the dynamic evolution of such spatial graphs is less well studied, at least analytically. Indeed, when distance preference is included, the principal studies have largely been simulations. An important class of spatial networks has application in the modeling of cell symbiosis in certain tumors and, when modeled as a graph, naturally introduces a distance preference characteristic of the range of cell-to-cell interaction.
In this paper we present theoretical analysis and experimental simulations of such graphs, demonstrating that distance functions that model the mixing of the cells can create phase transitions in connectivity, and thus in cellular interactions. This is an important result that could provide analytical tools to model the transition of tumors from benign to malignant states, as well as a novel class of spatial network evolution.
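The connectivity transition can be illustrated with a small simulation, assuming an exponential distance-preference function (the specific functional forms analysed in the paper may differ):

# Illustrative sketch: a spatially embedded random graph in which the probability of
# an edge decays with distance; sweeping the interaction range shows an abrupt jump
# in the size of the giant component, i.e. a connectivity phase transition.
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
pos = rng.random((300, 2))  # 300 "cells" scattered in the unit square

def spatial_graph(positions, r0):
    g = nx.Graph()
    g.add_nodes_from(range(len(positions)))
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            d = np.linalg.norm(positions[i] - positions[j])
            if rng.random() < np.exp(-d / r0):  # distance-preference function
                g.add_edge(i, j)
    return g

for r0 in (0.01, 0.03, 0.05, 0.10):
    g = spatial_graph(pos, r0)
    giant = max(nx.connected_components(g), key=len)
    print(r0, len(giant))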
May, 2018
ABSTRACT
Combinatoric measures of entropy capture the complexity of a graph but rely upon the calculation of its independent sets, or collections of non-adjacent vertices. This decomposition of the vertex set is a known NP-Complete problem and for most real world graphs is an inaccessible calculation. Recent work by Dehmer et al. and Tee et al. identified a number of vertex level measures that do not suffer from this pathological computational complexity, but that can be shown to be effective at quantifying graph complexity. In this paper, we consider whether these local measures are fundamentally equivalent to global entropy measures.
Specifically, we investigate the existence of a correlation between vertex level and global measures of entropy for a narrow subset of random graphs. We use the greedy algorithm approximation for calculating the chromatic information and therefore Körner entropy. We are able to demonstrate strong correlation for this subset of graphs and outline how this may arise theoretically.
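A hedged sketch of the greedy approximation mentioned above, using networkx: the chromatic information is approximated by the Shannon entropy of the colour-class size distribution obtained from a greedy colouring (an upper-bound style approximation, not an exact Körner entropy calculation):

# Illustrative sketch: approximate chromatic entropy from a greedy colouring.
import math
import networkx as nx

g = nx.erdos_renyi_graph(n=200, p=0.05, seed=42)

colouring = nx.coloring.greedy_color(g, strategy="largest_first")
class_sizes = {}
for _, colour in colouring.items():
    class_sizes[colour] = class_sizes.get(colour, 0) + 1

n = g.number_of_nodes()
chromatic_entropy = -sum((s / n) * math.log2(s / n) for s in class_sizes.values())
print(len(class_sizes), round(chromatic_entropy, 3))  # number of colours used, entropy in bits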
September, 2018
May, 2018
October, 2018
April, 2018
ABSTRACT
ISP and commercial networks are complex and thus difficult to characterise and manage. Network operators rely on a continuous flow of event log messages to identify and handle service outages. However, there is little published information about such events and how they are typically exploited. In this paper, we describe in as much detail as possible the event logs and network topology of a major commercial network. Through analysing the network topology, textual information of events and time of events, we highlight opportunities and challenges brought by such data. In particular, we suggest that the development of methods for inferring functional connectivity could unlock more of the informational value of event log messages and assist network management operators.
April, 2018
ABSTRACT
Commercial applications of fault localization typically utilize static models of the underlying system to identify root causes amongst the many monitored events. Fundamentally, the limitation of this approach arises from the practical challenges in building and maintaining this model. More recently, much attention has been paid to the use of data-driven algorithms as an alternative for identifying anomalous clusters of events and deducing the existence of a localized fault, as described by these events.
In this paper we describe the characteristics of one approach that relies upon clustering of alerts by the similarity vector of configurable attributes of the alert. Using a simulation of a real world commercial application, we investigate the stability of this approach in the dynamic environments that characterize modern infrastructure. We are able to demonstrate that it is far superior to rules-based approaches.
June, 2017
July, 2017
ABSTRACT
Understanding which node failures in a network have more impact is an important problem. Current understanding, motivated by the scale free models of network growth, places emphasis on the degree of the node. This is not a satisfactory measure; the number of connections a node has does not capture how redundantly it is connected into the whole network. Conversely, the structural entropy of a graph captures the resilience of a network well, but is expensive to compute, and, being a global measure, does not attribute any specific value to a given node. This lack of locality prevents the use of global measures as a way of identifying critical nodes.
In this paper, we introduce local vertex measures of entropy which do not suffer from such drawbacks. In our theoretical analysis, we establish the possibility that our local vertex measures approximate global entropy, with the advantage of locality and ease of computation. We establish properties that vertex entropy must have in order to be useful for identifying critical nodes. We have access to a proprietary event, topology, and incident dataset from a large commercial network. Using this dataset, we demonstrate a strong correlation between vertex entropy and incident generation over events.
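The exact vertex-entropy measures are defined in the paper itself; purely as an illustrative stand-in, one can rank vertices by the Shannon entropy of the normalised degrees within their closed neighbourhood, which is local and cheap to compute:

# Purely illustrative stand-in for a local vertex entropy (not the measure defined in
# the paper): entropy of normalised degrees over the closed neighbourhood of a vertex.
import math
import networkx as nx

def local_vertex_entropy(g: nx.Graph, v) -> float:
    neighbourhood = [v] + list(g.neighbors(v))
    degs = [g.degree(u) for u in neighbourhood]
    total = sum(degs)
    return -sum((d / total) * math.log2(d / total) for d in degs)

g = nx.barabasi_albert_graph(n=500, m=2, seed=0)
ranked = sorted(g.nodes(), key=lambda v: local_vertex_entropy(g, v), reverse=True)
print(ranked[:5])  # candidate "critical" vertices under this toy measure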
May, 2017
ABSTRACT
The principal objective when monitoring compute and communications infrastructure is to minimize the Mean Time To Resolution of service-impacting incidents. Key to achieving that goal is determining which of the many alerts that are presented to an operator are likely to be the root cause of an incident. In turn this is critical in identifying which alerts should be investigated with the highest priority.
Noise reduction techniques can be employed to reduce the quantity of alerts a network operator needs to examine but even in favorable scenarios there may be multiple candidate alerts that need to be investigated before the root cause of the incident can be accurately identified, resolved and full service resumed. The current contribution describes a novel technique, Probable Root Cause, that applies supervised machine learning in the form of Neural Networks to determine the alerts most likely to be responsible for a service-impacting incident. An evaluation of different models and model parameters is presented. The effectiveness of the approach is demonstrated against sample data from a large commercial environment.
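A toy sketch of the supervised approach, assuming scikit-learn and entirely synthetic alert features and labels (the production feature set and network architecture are not reproduced here):

# Toy sketch only: a small neural network scoring alerts by the probability of being
# the root cause of an incident. Features and labels are synthetic and hypothetical.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(5)
# Hypothetical features per alert: [severity, vertex_entropy, first_in_burst, event_rate]
X = rng.random((400, 4))
y = (0.5 * X[:, 0] + 0.3 * X[:, 2] + 0.2 * rng.random(400) > 0.55).astype(int)

model = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=5)
model.fit(X[:300], y[:300])

probs = model.predict_proba(X[300:])[:, 1]  # P(root cause) for held-out alerts
print(probs[:5].round(2))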
May, 2017
ABSTRACT
We study the effect of anisotropic radiation on wireless network complexity. To this end, we model a wireless network as a random geometric graph where nodes have random antenna orientations as well as random positions, and communication is affected by Rayleigh fading. Complexity is quantified by computing the Shannon entropy of the underlying graph model. We use this formalism to develop analytic scaling results that describe how complexity can be controlled by varying key system parameters such as the transmit power and the directivity of transmissions in large-scale networks. Our results point to striking contrasts between power scaling and directivity scaling in the large connection range regime.
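For context on the connection model: with Rayleigh fading and path loss exponent \eta, the pair connection probability in soft random geometric graph models of this kind typically takes the form (schematically; the paper's version also accounts for the random antenna orientations)

H(r) = \exp\left[-\left(\frac{r}{r_0}\right)^{\eta}\right],

where r is the node separation and r_0 is a typical connection range set by the transmit power and receiver sensitivity; anisotropic radiation can be thought of as making r_0 direction dependent.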
September, 2017
ABSTRACT
Barabási–Albert's "Scale Free" model is the starting point for much of the accepted theory of the evolution of real-world communication networks. Careful comparison of the theory with a wide range of real-world networks, however, indicates that the model is, in some cases, only a rough approximation to the dynamical evolution of real networks. In particular, the exponent γ of the power-law distribution of degree is predicted by the model to be exactly 3, whereas in a number of real-world networks it has values between 1.2 and 2.9.
In addition, the degree distributions of real networks exhibit cut-offs at high node degree, which indicates the existence of maximal node degrees for these networks. In this paper we propose a simple extension to the "Scale Free" model, which offers better agreement with the experimental data. This improvement is satisfying, but the model still does not explain why the attachment probabilities should favor high-degree nodes, or indeed how constraints arise in non-physical networks.
Using recent advances in the analysis of the entropy of graphs at the node level we propose a first-principles derivation of the "Scale Free" and "constraints" models from thermodynamic principles, and demonstrate that both preferential attachment and constraints could arise as a natural consequence of the second law of thermodynamics.
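To make the exponent discussion concrete, a small sketch using the standard Barabási–Albert generator from networkx (the constrained extension proposed in the paper is not reproduced here):

# Illustrative sketch: degree distribution of a plain Barabási-Albert graph. The pure
# model predicts a power-law tail with exponent close to 3; the extension discussed in
# the paper modifies attachment to match the flatter exponents seen in real networks.
from collections import Counter
import networkx as nx

g = nx.barabasi_albert_graph(n=20_000, m=3, seed=0)
counts = Counter(dict(g.degree()).values())

for k in (3, 6, 12, 24, 48):
    print(k, counts.get(k, 0))  # counts fall off roughly as k**-3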
April, 2016
ABSTRACT
A key objective of monitoring networks is to identify potential service threatening outages from events within the network before service is interrupted. Identifying causal events, Root Cause Analysis (RCA), is an active area of research, but current approaches are vulnerable to scaling issues with high event rates. Elimination of noisy events that are not causal is key to ensuring the scalability of RCA. In this paper, we introduce vertex-level measures inspired by Graph Entropy and propose their suitability as a categorization metric to identify nodes that are a priori of more interest as a source of events.
We consider a class of measures based on Structural, Chromatic and Von Neumann Entropy. These measures require NP-Hard calculations over the whole graph, an approach which obviously does not scale for large dynamic graphs that characterise modern networks. In this work we identify and justify a local measure of vertex graph entropy, which behaves in a similar fashion to global measures of entropy when summed across the whole graph. We show that such measures are correlated with nodes that generate incidents across a network from a real data set.
Enhancements to Language Modeling Techniques for Adaptable Log Message Classification
Robert Harper, Yusufu Shehu
ABSTRACT
Minimizing the resolution time of service-impacting incidents is a fundamental objective of Information Technology (IT) operations. Efficient root cause analysis, adaptable to diverse service environments, is key to meeting this objective. One method that provides additional insight into an incident, and hence allows enhanced root cause analysis, is categorization of the events and log messages that characterize an incident into pre-defined operational groups. Well-established natural language processing techniques that utilize pre-trained language models and word embeddings can be leveraged for this task. The adaptability of pre-trained models to classify log messages containing large quantities of domain-specific language remains unknown. The current contribution investigates multiple ways of addressing this deficiency. We demonstrate increased granularity of word embeddings by using character decompositions and sub-word level representations, and also explore the augmentation of word embeddings using features derived from convolutional operations. After observing that the performance of high-specificity models decreases as the number of previously unseen words increases, we explore the circumstances in which we can use a model trained with a low-specificity corpus to correctly classify log messages. Through the application of fine-tuning techniques, we can adapt our pre-trained classifier to classify log messages from service environments not encountered during pre-training in a time- and memory-efficient manner. We conclude that we can effectively adapt pre-trained classifiers for impromptu service environments.
Fundamental length scale and the bending of light in a gravitational field
Phil Tee, Nosratollah Jafari
ABSTRACT
The canonical approach to quantizing gravity is understood to suffer from pathological non-renormalizability. Nevertheless, in the context of effective field theory, a viable perturbative approach to calculating elementary processes is possible. Some non-perturbative approaches, most notably loop quantum gravity and combinatorial quantum gravity, imply the existence of a minimal length. To circumvent the seeming contradiction between the existence of a minimum length and the principle of special relativity, Doubly Special Relativity introduces modified dispersion relationships that reconcile the conflict. In this work, we combine these dispersion relationships with an effective field theory approach to compute the first post-Newtonian correction to the bending of light by a massive object. The calculation offers the prospect of a directly measurable effect that rests upon both the existence of a quantized gravitational field and a minimal length. Experimental verification would provide evidence of the existence of a quantum theory of gravity, and the fundamental quantization of spacetime with a bound on the minimal distance.
Structural Network Metrics and Incident Generation
Robert Harper, Phil Tee
ABSTRACT