## SEARCH

#### Country

##### (see all 443)

- United States 18022
- China 12737
- United Kingdom 6787
- Germany 6670

#### Institution

##### (see all 39649)

- Chinese Academy of Sciences 1251
- University of California 1028
- Tsinghua University 955
- Russian Academy of Sciences 822
- Zhejiang University 813

#### Author

##### (see all 189358)

- Gesellschaft für Informatik 188
- Reimer, Helmut 144
- Rihaczek, Karl 127
- Glänzel, Wolfgang 102
- Wang, Wei 91

#### Publication

##### (see all 181)

- Multimedia Tools and Applications 5212
- Scientometrics 5086
- Neural Computing and Applications 2960
- Datenschutz und Datensicherheit - DuD 2789
- The Journal of Supercomputing 2714

#### Subject

##### (see all 171)

- Computer Science 96753
- Artificial Intelligence (incl. Robotics) 29663
- Computer Science, general 22004
- Data Structures, Cryptology and Information Theory 19323
- Theory of Computation 16049

## CURRENTLY DISPLAYING:

Showing 1 to 10 of 96753 matching Articles

## 3D-line clipping algorithms — a comparative study

### The Visual Computer (1994-02-01) 11: 96-104 , February 01, 1994

Several well-known line-polyhedron intersection methods are summarized and new accelerating modifications are presented, together with a comparison of the known and newly developed methods. The new methods use the fact that each line can be described as the intersection of two planes.
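The two-planes observation above can be sketched with a standard parametric clipping routine (a generic Cyrus-Beck-style sketch, not the paper's accelerated methods; the cube and line below are made-up test data):

```python
# Clip a parametric 3D line P(t) = origin + t*direction against a convex
# polyhedron given as face half-spaces {x : dot(n, x) <= d}.

def clip_line(origin, direction, faces):
    """faces: list of (normal, d). Returns the (t_min, t_max) interval of
    the line lying inside the polyhedron, or None if the line misses it."""
    t_min, t_max = float("-inf"), float("inf")
    for n, d in faces:
        denom = sum(ni * di for ni, di in zip(n, direction))
        dist = sum(ni * oi for ni, oi in zip(n, origin)) - d
        if abs(denom) < 1e-12:          # line parallel to this face plane
            if dist > 0:                # entirely outside this half-space
                return None
            continue
        t = -dist / denom
        if denom < 0:                   # crossing into the half-space
            t_min = max(t_min, t)
        else:                           # crossing out of the half-space
            t_max = min(t_max, t)
        if t_min > t_max:
            return None
    return (t_min, t_max)

# Unit cube [0,1]^3 as six half-spaces; a line along the x-axis through it.
cube = [((-1, 0, 0), 0), ((1, 0, 0), 1),
        ((0, -1, 0), 0), ((0, 1, 0), 1),
        ((0, 0, -1), 0), ((0, 0, 1), 1)]
print(clip_line((-1.0, 0.5, 0.5), (1.0, 0.0, 0.0), cube))  # (1.0, 2.0)
```

Describing a line as the intersection of two planes reduces each face test to plane-plane-plane algebra; the parametric form above is the simpler textbook equivalent.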

## A note on the variance of rank-based selection strategies for genetic algorithms and genetic programming

### Genetic Programming and Evolvable Machines (2007-09-01) 8: 221-237 , September 01, 2007

This paper evaluates different forms of rank-based selection that are used with genetic algorithms and genetic programming. Many types of rank-based selection have exactly the same expected value in terms of the sampling rate allocated to each member of the population. However, the variance associated with that sampling rate can vary depending on how selection is implemented. We examine two forms of tournament selection and compare these to linear rank-based selection using an explicit formula. Because selective pressure has a direct impact on population diversity, we also examine the interaction between selective pressure and different mutation strategies.
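The expectation-versus-variance distinction can be illustrated analytically (a generic sketch, not the paper's formula: binary tournament selection over ranks 1..n, with two hypothetical ways of drawing the n winners):

```python
from math import floor

def tournament_prob(r, n):
    """P(rank r wins a size-2 tournament with replacement) = (2r-1)/n^2,
    with rank n the best individual."""
    return (2 * r - 1) / n**2

def copy_count_moments_independent(r, n):
    """n independent tournaments: copies of rank r ~ Binomial(n, p)."""
    p = tournament_prob(r, n)
    return n * p, n * p * (1 - p)          # mean, variance

def copy_count_moments_sus(r, n):
    """Stochastic universal sampling on the same per-rank probabilities:
    copies are floor(n*p) or ceil(n*p), so variance = f*(1-f), f = frac(n*p)."""
    p = tournament_prob(r, n)
    f = n * p - floor(n * p)
    return n * p, f * (1 - f)

n = 10
for r in (1, 5, 10):
    m1, v1 = copy_count_moments_independent(r, n)
    m2, v2 = copy_count_moments_sus(r, n)
    print(f"rank {r}: mean {m1:.2f} (same), var {v1:.3f} independent vs {v2:.3f} SUS")
```

Both implementations give every rank the same expected sampling rate, but the variance of the number of copies differs sharply, which is the kind of implementation effect the paper studies.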

## Development of a unified FDTD-FEM library for electromagnetic analysis with CPU and GPU computing

### The Journal of Supercomputing (2013-04-01) 64: 28-37 , April 01, 2013

The present paper describes an optimized C++ library for the study of electromagnetics. The implementation is based on the Finite-Difference Time-Domain (FDTD) method for transient analysis and the Finite Element Method (FEM) for electrostatics. Both methods share the same core and are optimized for CPU and GPU computing. To illustrate its use, the FEM is applied to solve Laplace's equation, analyzing the relation between surface curvature and electrostatic potential of a long cylindrical conductor, whereas the FDTD method is applied to analyze thin-film filters at optical wavelengths. Furthermore, the performance of the CPU and GPU versions is compared as a function of the simulation grid size. This approach allows the study of a wide range of electromagnetic problems, taking advantage of the benefits of each numerical method and the computing power of modern CPUs and GPUs.

## Proportional fair throughput allocation in multirate IEEE 802.11e wireless LANs

### Wireless Networks (2007-10-01) 13: 649-662 , October 01, 2007

Under heterogeneous radio conditions, Wireless LAN stations may use different modulation schemes, leading to a heterogeneity of bit rates. In such a situation, 802.11 DCF allocates the same throughput to all stations regardless of their transmitting bit rate; as a result, the channel is occupied by low bit rate stations most of the time, and efficiency is low. In this paper, we propose a more efficient throughput allocation criterion based on proportional fairness. We find that, in a proportional fair allocation, the same share of channel time is given to high and low bit rate stations, so high bit rate stations obtain more throughput. We propose two schemes based on the upcoming 802.11e standard to achieve this allocation, and compare their delay and throughput performance.
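The allocation argument can be made concrete with a small worked example (the bit rates below are hypothetical, not the paper's scenario):

```python
# Equal-throughput sharing (DCF-like) forces one common throughput x with
# sum_i x / rate_i = 1 (the channel time needed to serve everyone fills up),
# while proportional fairness (maximizing sum_i log t_i subject to the same
# time budget) gives every station an equal TIME share, so t_i = rate_i / n.

rates = [54.0, 11.0, 1.0]   # Mbit/s for three stations (made-up values)
n = len(rates)

equal_throughput = 1.0 / sum(1.0 / r for r in rates)   # ~0.90 Mbit/s each
pf_throughput = [r / n for r in rates]                  # [18.0, 3.67, 0.33]

print(f"equal-throughput allocation: {equal_throughput:.3f} Mbit/s per station")
print("proportional fair:", [f"{t:.3f}" for t in pf_throughput])
```

The single slow station drags every station down to under 1 Mbit/s in the equal-throughput case, while the proportional fair allocation lets fast stations exploit their rate, which is exactly the effect described above.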

## Semantic image classification using statistical local spatial relations model

### Multimedia Tools and Applications (2008-09-01) 39: 169-188 , September 01, 2008

In this paper, a statistical model called statistical local spatial relations (SLSR) is presented as a novel learning model that combines spatial and statistical information for semantic image classification. The model is inspired by probabilistic Latent Semantic Analysis (PLSA) for text mining. In text analysis, PLSA is used to discover topics in a corpus using the bag-of-words document representation. In SLSR, we treat image categories as topics, so an image containing instances of multiple categories can be modeled as a mixture of topics. More significantly, SLSR introduces spatial relation information as a factor, which is not present in PLSA. SLSR has rotation, scale, translation and affine invariance properties and can handle partial occlusion. Using a Dirichlet process and a variational Expectation-Maximization learning algorithm, SLSR is developed into an image classification algorithm. SLSR uses an unsupervised process that captures spatial relations and statistical information simultaneously. Experiments on standard data sets show that SLSR is a promising model for semantic image classification.
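For background, the PLSA decomposition that SLSR builds on can be sketched with toy numbers (the topics and probabilities below are invented for illustration, not taken from the paper):

```python
# PLSA models a document's word distribution as a mixture of per-topic
# word distributions: p(w|d) = sum_z p(z|d) * p(w|z). SLSR reuses this
# idea with image categories as topics.

# Two topics over a three-word vocabulary ["sky", "grass", "tree"].
p_w_given_z = {
    "landscape": [0.2, 0.4, 0.4],
    "weather":   [0.7, 0.1, 0.2],
}

def word_distribution(p_z_given_d):
    """Mix the topic word distributions with the document's topic weights."""
    vocab_size = 3
    return [sum(p_z_given_d[z] * p_w_given_z[z][w] for z in p_w_given_z)
            for w in range(vocab_size)]

# A document (or image) that is 75% "landscape", 25% "weather".
mix = word_distribution({"landscape": 0.75, "weather": 0.25})
print([round(p, 3) for p in mix])   # a valid distribution summing to 1
```

An image containing objects from several categories is then just a document whose topic weights spread over several topics; SLSR's contribution is adding spatial relations on top of this purely statistical mixture.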

## A formal proof of the ε-optimality of absorbing continuous pursuit algorithms using the theory of regular functions

### Applied Intelligence (2014-10-01) 41: 974-985 , October 01, 2014

The most difficult part in the design and analysis of Learning Automata (LA) consists of the formal proofs of their convergence accuracies. The mathematical techniques used for the different families (Fixed Structure, Variable Structure, Discretized, etc.) are quite distinct. Among the families of LA, Estimator Algorithms (EAs) are certainly the fastest, and within this family, the set of Pursuit algorithms has been considered the pioneering scheme. Informally, if the environment is stationary, their *ε*-optimality is defined as their ability to converge to the optimal action with an arbitrarily large probability, if the learning parameter is sufficiently small/large. The existing proofs of all the reported EAs follow the same fundamental principles, and to clarify this, in the interest of simplicity, we shall concentrate on the family of Pursuit algorithms. Recently, it has been reported by Ryan and Omkar (J Appl Probab 49(3):795–805, ) that the previous proofs of *ε*-optimality for all the reported EAs share a common flaw. The flaw lies in the condition which apparently supports the so-called "monotonicity" property of the probability of selecting the optimal action, which states that after some time instant *t*_{0}, the reward probability estimates will be ordered correctly *forever*. The authors of the various proofs have rather offered a proof that the reward probability estimates are ordered correctly *at a single point of time* after *t*_{0}, which, in turn, does not guarantee the ordering *forever*, rendering the previous proofs incorrect.
While a rectified proof of the *ε*-optimality of the Continuous Pursuit Algorithm (CPA), the pioneering EA, was presented in Ryan and Omkar (J Appl Probab 49(3):795–805, ), in this paper a new proof is provided for the Absorbing CPA (ACPA), i.e., an algorithm which follows the CPA paradigm but which artificially has absorbing states whenever any action probability is arbitrarily close to unity. Unlike the previous flawed proofs, instead of examining the monotonicity property of the action probabilities, it examines their submartingale property, and then, unlike the traditional approach, invokes the theory of Regular functions to prove that the probability of converging to the optimal action can be made arbitrarily close to unity. We believe that the proof is both unique and pioneering, and that it adds insight into the convergence of different EAs. It can also form the basis for formally demonstrating the *ε*-optimality of other Estimator Algorithms which are artificially rendered absorbing.
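For orientation, a minimal generic CPA loop looks roughly as follows (a textbook-style sketch with hypothetical reward probabilities; this is not the paper's ACPA variant or its proof machinery):

```python
# Continuous Pursuit Algorithm sketch: the action probability vector is
# pulled toward the action whose estimated reward probability is currently
# largest, with a small learning parameter (small values are what the
# epsilon-optimality statements refer to).

import random
random.seed(1)

reward_probs = [0.8, 0.4]   # hypothetical stationary two-action environment
lam = 0.01                  # learning parameter
p = [0.5, 0.5]              # action probability vector
counts = [0, 0]
sums = [0, 0]

# Initialize reward estimates by sampling each action a few times.
for a in (0, 1):
    for _ in range(10):
        counts[a] += 1
        sums[a] += random.random() < reward_probs[a]

for _ in range(3000):
    a = 0 if random.random() < p[0] else 1         # choose an action per p
    counts[a] += 1
    sums[a] += random.random() < reward_probs[a]   # environment response
    est = [sums[i] / counts[i] for i in (0, 1)]
    best = est.index(max(est))                     # pursue the best estimate
    for i in (0, 1):
        target = 1.0 if i == best else 0.0
        p[i] = (1 - lam) * p[i] + lam * target

print("final action probabilities:", [round(x, 3) for x in p])
```

The flawed proofs concerned whether the estimate ordering `est[0] > est[1]` holds forever after some time; the paper's fix instead analyzes `p` as a submartingale. An absorbing variant (ACPA) would additionally freeze `p` once some component is arbitrarily close to one.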

## Synthesizing trees by plantons

### The Visual Computer (2006-03-31) 22: 238-248 , March 31, 2006

In this paper, we present a two-level statistical model for characterizing the stochastic and specific nature of trees. At the low level, we define *plantons*, groups of similar organs, to depict tree organ details statistically. At the high level, a set of transitions between plantons is provided to describe the stochastic distribution of organs.

Based on such a tree model, we propose a novel tree modeling approach, synthesizing trees by plantons, which are extracted from tree samples. All tree samples are captured from the real world. We have designed a maximum likelihood estimation algorithm to acquire the two-level statistical tree model from single samples or multiple samples. Experimental results show that our new model is capable of synthesizing new trees with similar, yet visually different shapes.

## Towards evolvable Internet architecture-design constraints and models analysis

### Science China Information Sciences (2014-11-01) 57: 1-24 , November 01, 2014

There is a general consensus in academia and industry about the success of the Internet architecture. However, with the development of diversified applications, the existing Internet architecture faces more and more challenges in scalability, security, mobility and performance. A novel evolvable Internet architecture framework is proposed in this paper to meet continuously changing application requirements. The basic idea of evolvability is to relax the constraints that limit the development of the architecture while adhering to the core design principles of the Internet. Three important design constraints that ensure the construction of an evolvable architecture are comprehensively described: the evolvability constraint, the economic adaptability constraint and the manageability constraint. We consider that the evolvable architecture can be developed from the network layer under these design constraints. Moreover, we believe that the address system is the foundation of the Internet. Therefore, we propose a general address platform which provides a more open and efficient network environment for the research and development of the evolvable architecture.

## Approximation with polynomial kernels and SVM classifiers

### Advances in Computational Mathematics (2006-07-01) 25: 323-344 , July 01, 2006

This paper presents an error analysis for classification algorithms generated by regularization schemes with polynomial kernels. Explicit convergence rates are provided for support vector machine (SVM) soft margin classifiers. The misclassification error can be estimated by the sum of the sample error and the regularization error. The main difficulty in studying algorithms with polynomial kernels is the regularization error, which depends deeply on the degrees of the kernel polynomials. Here we overcome this difficulty by bounding the reproducing kernel Hilbert space norm of Durrmeyer operators and estimating the rate of approximation by Durrmeyer operators in a weighted *L*^{1} space (the weight is a probability distribution). Our study shows that the regularization parameter should decrease exponentially fast with the sample size, which is a special feature of polynomial kernels.
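For concreteness, a standard polynomial kernel of the kind analyzed here can be evaluated as follows (the data points are made up; this sketches the kernel itself, not the paper's error bounds):

```python
# The degree-d polynomial kernel K(x, y) = (1 + <x, y>)^d induces the
# reproducing kernel Hilbert space of polynomials of degree <= d in which
# the SVM soft margin classifier is trained; the paper's analysis splits
# its misclassification error into sample error + regularization error.

def poly_kernel(x, y, degree):
    """Standard polynomial kernel (1 + <x, y>)^degree."""
    return (1 + sum(a * b for a, b in zip(x, y))) ** degree

# Gram matrix for a toy sample; an SVM with this matrix is an instance of
# the algorithms whose convergence rates are analyzed.
X = [(0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
gram = [[poly_kernel(xi, xj, degree=2) for xj in X] for xi in X]
print(gram)   # [[4.0, 1.0, 4.0], [1.0, 4.0, 4.0], [4.0, 4.0, 9.0]]
```

The degree parameter is the quantity the regularization-error analysis has to track, which is why it enters the choice of regularization parameter.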

## Conferencing on the Internet

### BT Technology Journal (1997-10-01) 15: 51-63 , October 01, 1997

This paper discusses three key requirements for conferencing applications: user and conference location, latency, and scalability. It then looks at the IETF and ITU approaches to meeting these requirements. Finally, a protocol called SUCCESS, which attempts to combine the benefits of both approaches, is outlined.