[[File:markov random field example.png|thumb|right|alt=An example of a Markov random field.| An example of a Markov random field. Each edge represents dependency. In this example: A depends on B and D. B depends on A and D. D depends on A, B, and E. E depends on D and C. C depends on E.]]
 
In the domain of [[physics]] and [[probability]], a '''Markov random field''' (often abbreviated as MRF), '''Markov network''' or '''undirected [[graphical model]]''' is a set of [[random variable]]s having a [[Markov property]] described by an [[undirected graph]].  A Markov random field is similar to a [[Bayesian network]] in its representation of dependencies; the difference is that Bayesian networks are directed and acyclic, whereas Markov networks are undirected and may be cyclic. Thus, a Markov network can represent certain dependencies that a Bayesian network cannot (such as cyclic dependencies); on the other hand, it cannot represent certain dependencies that a Bayesian network can (such as induced dependencies).
 
When the probability distribution is strictly positive, it is also referred to as a '''Gibbs random field''', because, according to the [[Hammersley–Clifford theorem]], it can then be represented by a [[Gibbs measure]]. The prototypical Markov random field is the [[Ising model]]; indeed, the Markov random field was introduced as the general setting for the [[Ising model]].<ref>{{cite book
|first1=Ross |last1=Kindermann
|first2=J. Laurie |last2=Snell
|url=http://www.cmap.polytechnique.fr/~rama/ehess/mrfbook.pdf
|title=Markov Random Fields and Their Applications
|year=1980
|publisher=American Mathematical Society
|isbn=0-8218-5001-6
|mr=0620955
}}</ref>
In the domain of [[artificial intelligence]], a Markov random field is used to model various low- to mid-level tasks in [[image processing]] and [[computer vision]].<ref>{{cite book
|first1=S. Z. |last1=Li
|title=Markov Random Field Modeling in Image Analysis
|year=2009
  |publisher=Springer
}}</ref> For example, MRFs are used for [[image restoration]], image completion, [[Segmentation (image processing)|segmentation]], [[image registration]], [[texture synthesis]], [[super-resolution]], [[Computer stereo vision|stereo matching]] and [[information retrieval]].
 
== Definition ==
Given an [[undirected graph]] ''G''&nbsp;=&nbsp;(''V'',&nbsp;''E''), a set of [[random variable]]s ''X'' = (''X''<sub>''v''</sub>)<sub>''v''&nbsp;∈&nbsp;''V''</sub> indexed by ''V''&nbsp; form a Markov random field with respect to ''G''&nbsp; if they satisfy the local [[Markov property|Markov properties]]:
 
:'''Pairwise Markov property''': Any two non-adjacent variables are [[conditional independence|conditionally independent]] given all other variables:
 
::<math>X_u \perp\!\!\!\perp X_v \mid X_{V \setminus \{u,v\}} \quad \text{if } \{u,v\} \notin E</math>
 
:'''Local Markov property''': A variable is conditionally independent of all other variables given its neighbors:
 
::<math>X_v \perp\!\!\!\perp X_{V\setminus \operatorname{cl}(v)} \mid X_{\operatorname{ne}(v)}</math>
:where ne(''v'') is the set of neighbors of ''v'', and cl(''v'') = {''v''} ∪ ne(''v'') is the [[Neighborhood (graph theory)|closed neighbourhood]] of ''v''.
 
:'''Global Markov property''': Any two subsets of variables are conditionally independent given a separating subset:
 
::<math>X_A \perp\!\!\!\perp X_B \mid X_S</math>
:where every path from a node in ''A'' to a node in ''B'' passes through ''S''.
 
The above three [[Markov property|Markov properties]] are not equivalent: the local Markov property is stronger than the pairwise one, and weaker than the global one.
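The global Markov property is a purely graph-theoretic separation criterion, so it can be checked with an ordinary graph search. As an illustrative sketch in plain Python (using the graph from the figure above), a breadth-first search from ''A'' that is forbidden from entering ''S'' decides whether every ''A''–''B'' path passes through ''S'':

```python
from collections import deque

def separates(adj, A, B, S):
    """Return True if every path from A to B in the undirected graph
    `adj` (dict: node -> set of neighbours) passes through S."""
    blocked = set(S)
    frontier = deque(set(A) - blocked)
    seen = set(frontier)
    while frontier:
        u = frontier.popleft()
        if u in B:
            return False          # found an A-B path avoiding S
        for w in adj[u]:
            if w not in seen and w not in blocked:
                seen.add(w)
                frontier.append(w)
    return True

# The graph from the figure: edges A-B, A-D, B-D, D-E, E-C
adj = {'A': {'B', 'D'}, 'B': {'A', 'D'}, 'C': {'E'},
       'D': {'A', 'B', 'E'}, 'E': {'D', 'C'}}
```

For this graph, {''D''} separates ''A'' from ''C'' (every path must pass through ''D''), while {''B''} does not, because the path ''A''–''D''–''E''–''C'' avoids ''B''.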
 
== Clique factorization ==
As the Markov properties of an arbitrary probability distribution can be difficult to establish, a commonly used class of Markov random fields are those that can be factorized according to the cliques of the graph.
 
Given a set of random variables ''X''&nbsp;=&nbsp;(''X''<sub>''v''</sub>)<sub>''v''&nbsp;∈&nbsp;''V''</sub>, let ''P''(''X''&nbsp;=&nbsp;''x'') be the [[Probability density function|probability]] of a particular field configuration ''x'' in&nbsp;''X''. That is, ''P''(''X''&nbsp;=&nbsp;''x'') is the probability of finding that the random variables ''X'' take on the particular value ''x''.  Because ''X'' is a set, the probability of ''x'' should be understood to be taken with respect to a [[product measure]], and can thus be called a ''joint density''.
 
If this joint density can be factorized over the [[Clique (graph theory)|clique]]s of ''G'':
 
:<math>P(X=x) = \prod_{C \in \operatorname{cl}(G)} \phi_C (x_C) </math>
 
then ''X'' forms a Markov random field with respect to ''G''.  Here, cl(''G'') is the set of cliques of ''G''.  The definition is equivalent if only maximal cliques are used. The functions ''&phi;''<sub>''C''</sub> are sometimes referred to as ''factor potentials'' or ''clique potentials''. Note, however, that conflicting terminology is in use: the word ''potential'' is often applied to the logarithm of ''&phi;''<sub>''C''</sub>.  This is because, in [[statistical mechanics]], log(''&phi;''<sub>''C''</sub>) has a direct interpretation as the [[potential energy]] of a [[configuration space|configuration]]&nbsp;''x''<sub>''C''</sub>.
 
Although some MRFs do not factorize (a simple example can be constructed on a cycle of 4 nodes<ref>{{cite journal
|first=John |last=Moussouris
|title=Gibbs and Markov random systems with constraints
|journal=Journal of Statistical Physics
|volume=10 |issue=1 |pages=11&ndash;33 |year=1974
|doi=10.1007/BF01011714 |mr=0432132
}}</ref>), in certain cases they can be shown to be equivalent conditions:
* if the density is positive (by the [[Hammersley–Clifford theorem]]),
* if the graph is [[Chordal graph|chordal]] (by equivalence to a [[Bayesian network]]).
 
When such a factorization does exist, it is possible to construct a [[factor graph]] for the network.
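As a concrete sketch of clique factorization, consider a hypothetical three-variable binary chain ''X''<sub>1</sub>&nbsp;&ndash;&nbsp;''X''<sub>2</sub>&nbsp;&ndash;&nbsp;''X''<sub>3</sub>, whose maximal cliques are {''X''<sub>1</sub>,&nbsp;''X''<sub>2</sub>} and {''X''<sub>2</sub>,&nbsp;''X''<sub>3</sub>}. The potential values below are illustrative; brute-force normalisation over all eight configurations yields a joint density that satisfies the pairwise Markov property (''X''<sub>1</sub> and ''X''<sub>3</sub> are conditionally independent given ''X''<sub>2</sub>):

```python
import itertools

def phi(a, b):
    # illustrative pairwise clique potential favouring agreement
    return 2.0 if a == b else 1.0

def unnorm(x):
    # product of potentials over the maximal cliques {1,2} and {2,3}
    x1, x2, x3 = x
    return phi(x1, x2) * phi(x2, x3)

Z = sum(unnorm(x) for x in itertools.product([0, 1], repeat=3))
P = {x: unnorm(x) / Z for x in itertools.product([0, 1], repeat=3)}

# check: X1 and X3 are conditionally independent given X2 = 0
p_x2 = sum(p for x, p in P.items() if x[1] == 0)
p_x1 = sum(p for x, p in P.items() if x[0] == 1 and x[1] == 0) / p_x2
p_x3 = sum(p for x, p in P.items() if x[2] == 1 and x[1] == 0) / p_x2
p_13 = sum(p for x, p in P.items()
           if x[0] == 1 and x[2] == 1 and x[1] == 0) / p_x2
assert abs(p_13 - p_x1 * p_x3) < 1e-12
```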
 
== Logistic model ==
Any Markov random field (with a strictly positive density) can be written as a [[log-linear model]] with feature functions <math>f_k</math> such that the full joint distribution can be written as
 
:<math> P(X=x) = \frac{1}{Z} \exp \left( \sum_{k} w_k^{\top} f_k (x_{ \{ k \}}) \right)</math>
where the notation
:<math> w_k^{\top} f_k (x_{ \{ k \}}) = \sum_{i=1}^{N_k} w_{k,i} \cdot f_{k,i}(x_{\{k\}})</math>
is simply a [[dot product]] over field configurations, and ''Z'' is the [[partition function (mathematics)|partition function]]:
 
:<math> Z = \sum_{x \in \mathcal{X}} \exp \left(\sum_{k} w_k^{\top} f_k(x_{ \{ k \} })\right).</math>
 
Here, <math>\mathcal{X}</math> denotes the set of all possible assignments of values to all the network's random variables. Usually, the feature functions <math>f_{k,i}</math> are defined such that they are [[indicator function|indicators]] of the clique's configuration, ''i.e.'' <math>f_{k,i}(x_{\{k\}}) = 1</math> if <math>x_{\{k\}}</math> corresponds to the ''i''-th possible configuration of the ''k''-th clique and 0 otherwise. This model is equivalent to the clique factorization model given above, if <math>N_k=|\operatorname{dom}(C_k)|</math> is the cardinality of the clique, and the weight of a feature <math>f_{k,i}</math> corresponds to the logarithm of the corresponding clique factor, ''i.e.'' <math>w_{k,i} = \log \phi(c_{k,i})</math>, where <math>c_{k,i}</math> is the ''i''-th possible configuration of the ''k''-th clique, ''i.e.'' the ''i''-th value in the domain of the clique <math>C_k</math>.
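The equivalence between clique factors and indicator-feature weights can be checked by brute force on a single clique. In the sketch below (potential values are illustrative), the weight of each indicator feature is the logarithm of the corresponding clique factor, and the resulting log-linear distribution coincides with the normalised clique factorization:

```python
import itertools, math

# hypothetical single-clique binary MRF on (X1, X2)
configs = list(itertools.product([0, 1], repeat=2))
phi = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}  # clique factor
w = {c: math.log(phi[c]) for c in configs}                   # w_i = log phi(c_i)

def score(x):
    # sum_i w_i * f_i(x), where f_i is the indicator of the i-th configuration
    return sum(w_i * (x == c) for c, w_i in w.items())

Z = sum(math.exp(score(x)) for x in configs)
P_loglin = {x: math.exp(score(x)) / Z for x in configs}
P_clique = {x: phi[x] / sum(phi.values()) for x in configs}
assert all(abs(P_loglin[x] - P_clique[x]) < 1e-12 for x in configs)
```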
 
The probability ''P'' is often called the [[Gibbs measure]].  This expression of a Markov field as a logistic model is only possible if all clique factors are non-zero, ''i.e.'' if none of the elements of <math>\mathcal{X}</math> are assigned a probability of 0.  This allows techniques from matrix algebra to be applied, ''e.g.'' the identity that the logarithm of the [[determinant]] of a matrix equals the [[trace (linear algebra)|trace]] of its matrix logarithm, with the matrix representation of a graph arising from the graph's [[incidence matrix]].
 
The importance of the partition function ''Z'' is that many concepts from [[statistical mechanics]], such as [[entropy]], directly generalize to the case of Markov networks, and an ''intuitive'' understanding can thereby be gained.  In addition, the partition function allows [[variational method]]s to be applied to the solution of the problem: one can attach a driving force to one or more of the random variables, and explore the reaction of the network in response to this [[perturbation theory|perturbation]].  Thus, for example, one may add a driving term ''J''<sub>''v''</sub>, for each vertex ''v'' of the graph, to the partition function to get:
 
:<math> Z[J] = \sum_{x \in \mathcal{X}} \exp \left(\sum_{k} w_k^{\top} f_k(x_{ \{ k \} }) + \sum_v J_v x_v\right)</math>
 
Formally differentiating with respect to ''J''<sub>''v''</sub> gives the [[expectation value]] of the random variable ''X''<sub>''v''</sub> associated with the vertex ''v'':
 
:<math>E[X_v] = \frac{1}{Z} \left.\frac{\partial Z[J]}{\partial J_v}\right|_{J_v=0}.</math>
 
[[Correlation function]]s are computed likewise; the two-point correlation is:
 
:<math>C[X_u, X_v] = \frac{1}{Z} \left.\frac{\partial^2 Z[J]}{\partial J_u \partial J_v}\right|_{J_u=0, J_v=0}.</math>
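The identity <math>E[X_v] = \tfrac{1}{Z}\,\partial Z[J]/\partial J_v |_{J=0}</math> can be verified numerically on a toy two-variable field by comparing a directly computed expectation with a central finite difference of the driven partition function. The coupling used below is illustrative:

```python
import itertools, math

def weight(x, J=(0.0, 0.0)):
    x1, x2 = x
    coupling = 1.0 if x1 == x2 else 0.0   # illustrative agreement potential
    return math.exp(coupling + J[0] * x1 + J[1] * x2)

configs = list(itertools.product([0, 1], repeat=2))

def Z(J):
    # driven partition function Z[J]
    return sum(weight(x, J) for x in configs)

Z0 = Z((0.0, 0.0))
E_x1 = sum(x[0] * weight(x) for x in configs) / Z0   # direct E[X_1]
h = 1e-6
dZ = (Z((h, 0.0)) - Z((-h, 0.0))) / (2 * h)          # numerical dZ/dJ_1 at J = 0
assert abs(E_x1 - dZ / Z0) < 1e-6
```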
 
Log-linear models are especially convenient for their interpretation. A log-linear model can provide a much more compact representation for many distributions, especially when variables have large domains. They are convenient too because their negative [[Likelihood function|log likelihoods]] are [[Convex function|convex]]. Unfortunately, though the likelihood of a logistic Markov network is convex, evaluating the likelihood or gradient of the likelihood of a model requires inference in the model, which is in general computationally infeasible.
 
== Examples ==
 
=== Gaussian Markov random field ===
A [[multivariate normal distribution]] forms a Markov random field with respect to a graph ''G''&nbsp;=&nbsp;(''V'',&nbsp;''E'') if the missing edges correspond to zeros on the [[precision matrix]] (the inverse [[covariance matrix]]):
 
:<math>X=(X_v)_{v\in V} \sim \mathcal N (\boldsymbol \mu, \Sigma)
</math>
such that
:<math>(\Sigma^{-1})_{uv} =0 \quad \text{if} \quad \{u,v\} \notin E .</math><ref>{{cite book
|first1=Håvard |last1=Rue |first2=Leonhard |last2=Held
|title=Gaussian Markov random fields: theory and applications
|publisher=CRC Press |year=2005
|isbn=1-58488-432-0
}}</ref>
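A minimal numerical sketch (using NumPy; the tridiagonal precision matrix is illustrative): the zero entry of the precision matrix encodes the missing edge {1,&nbsp;3} of a three-node chain, even though the covariance matrix itself is dense, and conditioning on the middle variable makes the partial covariance of the end variables vanish:

```python
import numpy as np

# hypothetical chain X1 - X2 - X3: tridiagonal precision matrix
Q = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])    # precision (inverse covariance)
Sigma = np.linalg.inv(Q)

# the covariance is dense: X1 and X3 are marginally correlated,
# even though they are conditionally independent given X2
assert Q[0, 2] == 0.0
assert abs(Sigma[0, 2]) > 1e-9

# partial covariance of (X1, X3) given X2 is diagonal
idx, cond = [0, 2], [1]
S_aa = Sigma[np.ix_(idx, idx)]
S_ab = Sigma[np.ix_(idx, cond)]
S_bb = Sigma[np.ix_(cond, cond)]
partial = S_aa - S_ab @ np.linalg.inv(S_bb) @ S_ab.T
assert abs(partial[0, 1]) < 1e-12
```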
 
== Inference ==
As in a Bayesian network, one may calculate the [[conditional distribution]] of a set of nodes <math> V' = \{ v_1 ,\ldots, v_i \} </math> given the values of another set of nodes <math> W' = \{ w_1 ,\ldots, w_j \} </math> in the Markov random field by summing over all possible assignments to <math>u \notin V',W'</math>; this is called [[exact inference]]. However, exact inference is a [[Sharp-P-complete|#P-complete]] problem, and thus computationally intractable in the general case. Approximation techniques such as [[Markov chain Monte Carlo]] and loopy [[belief propagation]] are often more feasible in practice.  Some particular subclasses of MRFs, such as trees (see [[Chow–Liu tree]]), have polynomial-time inference algorithms; discovering such subclasses is an active research topic. There are also subclasses of MRFs that permit efficient [[Maximum a posteriori|MAP]], or most likely assignment, inference; examples of these include associative networks. Another interesting subclass is that of decomposable models (when the graph is [[Chordal graph|chordal]]): since these admit a closed-form [[Maximum likelihood estimate|MLE]], it is possible to discover a consistent structure for hundreds of variables.<ref name="Petitjean">{{cite conference |url=http://www.tiny-clues.eu/Research/Petitjean2013-ICDM.pdf |title= Scaling log-linear analysis to high-dimensional data |last1=Petitjean |first1=F. |last2=Webb |first2=G.I. |last3=Nicholson |first3=A.E. |year=2013 |publisher=IEEE |conference=International Conference on Data Mining |location=Dallas, TX, USA }}</ref>
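Exact inference by explicit summation is feasible only for very small fields; the following sketch (with hypothetical pairwise potentials on a three-node binary chain) computes a conditional probability by enumerating every assignment, summing out the unobserved variables:

```python
import itertools

def phi(a, b):
    # illustrative pairwise potential favouring agreement
    return 2.0 if a == b else 1.0

def unnorm(x):
    # chain X0 - X1 - X2
    return phi(x[0], x[1]) * phi(x[1], x[2])

def conditional(query_idx, query_val, evidence):
    """P(X_q = v | evidence), summing out all unobserved variables."""
    num = den = 0.0
    for x in itertools.product([0, 1], repeat=3):
        if any(x[i] != v for i, v in evidence.items()):
            continue
        den += unnorm(x)
        if x[query_idx] == query_val:
            num += unnorm(x)
    return num / den

p = conditional(0, 1, {2: 1})   # P(X0 = 1 | X2 = 1)
```

The cost of this summation grows exponentially in the number of unobserved variables, which is why the approximation techniques above are needed for larger fields.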
 
== Conditional random fields ==
One notable variant of a Markov random field is a '''[[conditional random field]]''', in which each random variable may also be conditioned upon a set of global observations <math>o</math>. In this model, each function <math>\phi_k</math> is a mapping from all assignments to both the [[Clique (graph theory)|clique]] ''k'' and the observations <math>o</math> to the nonnegative real numbers. This form of the Markov network may be more appropriate for producing [[discriminative model|discriminative classifiers]], which do not model the distribution over the observations.
 
== See also ==
* [[Maximum entropy method]]
* [[Hopfield network]]
* [[Graphical model]]
* [[Markov chain]]
* [[Markov logic network]]
* [[Hammersley–Clifford theorem]]
* [[Interacting particle system]]
* [[Stochastic cellular automata|Probabilistic cellular automata]]
* [[Log-linear analysis]]
 
==References==
{{reflist}}
 
==External links==
* [https://bitbucket.org/rukletsov/b  MRF implementation in C++ for regular 2D lattices]
 
{{Stochastic processes}}
 
[[Category:Graphical models]]
[[Category:Markov networks| ]]
[[Category:Probability theory]]

Revision as of 19:24, 21 May 2013
